Uncertainties Related to Extreme Event Statistics of Sewer System Surcharge and Overflow
DEFF Research Database (Denmark)
Schaarup-Jensen, Kjeld; Johansen, C.; Thorndahl, Søren Liedtke
2005-01-01
Today it is common practice - in the major part of Europe - to base the design of sewer systems in urban areas on recommended minimum values of flooding frequencies related to either the pipe top level, the basement level in buildings, or the level of road surfaces. Thus storm water runoff in sewer systems is only...... proceeding in an acceptable manner if flooding of these levels has an average return period greater than a predefined value. This practice is also often used in functional analysis of existing sewer systems. Whether a sewer system can fulfil recommended flooding frequencies can only be verified...... by performing long-term simulations - using a sewer flow simulation model - and drawing up extreme event statistics from the model simulations. In this context it is important to realize that uncertainties related to the input parameters of rainfall runoff models will give rise to uncertainties related...
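The verification step described above, drawing extreme event statistics from a long-term simulation, can be sketched as follows. This is a hedged illustration, not any particular sewer model's method: it assumes an annual-maximum series from a simulation (the levels below are invented) and uses the common Weibull plotting position T = (N + 1)/m to assign empirical return periods.

```python
# Hypothetical illustration: empirical return periods from a long-term
# simulation, using the Weibull plotting position T = (N + 1) / m.
def return_periods(annual_maxima):
    """Return (level, return period in years) pairs, largest event first."""
    ordered = sorted(annual_maxima, reverse=True)
    n = len(ordered)
    return [(level, (n + 1) / rank)
            for rank, level in enumerate(ordered, start=1)]

# 10 years of simulated annual maximum water levels (metres, invented data)
levels = [1.2, 0.8, 1.5, 0.9, 1.1, 0.7, 1.3, 1.0, 0.6, 1.4]
stats = return_periods(levels)
# The largest of 10 annual maxima gets an empirical return period of 11 years
```

Comparing the empirical return period of a given flooding level against the recommended minimum is then a direct lookup in this table.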
Properties of incident reporting systems in relation to statistical trend and pattern analysis
International Nuclear Information System (INIS)
Kalfsbeek, H.W.; Arsenis, S.P.
1990-01-01
This paper describes the properties deemed desirable for an incident reporting system in order to render it useful for extracting valid statistical trend and pattern information. The perspective under which a data collection system is seen in this paper is the following: data are essentially gathered on a set of variables describing an event or incident (the items featuring on a reporting format) in order to learn about (multiple) dependencies (called interactions) between these variables. Hence, the necessary features of the data source are highlighted and potential problem sources limiting the validity of the results to be obtained are identified. In this frame, important issues are the reporting completeness, related to the reporting criteria and reporting frequency, and of course the reporting contents and quality. The choice of the report items (the variables) and their categorization (code dictionary) may influence (bias) the insights gained from trend and pattern analyses, as may the presence or absence of a structure for correlating the reported issues within an incident. The issues addressed in this paper are brought in relation to some real world reporting systems on safety related events in Nuclear Power Plants, so that their possibilities and limitations with regard to statistical trend and pattern analysis become manifest
Sharma, Sanjib; Siddique, Ridwan; Reed, Seann; Ahnert, Peter; Mendoza, Pablo; Mejia, Alfonso
2018-03-01
The relative roles of statistical weather preprocessing and streamflow postprocessing in hydrological ensemble forecasting at short- to medium-range forecast lead times (days 1-7) are investigated. For this purpose, a regional hydrologic ensemble prediction system (RHEPS) is developed and implemented. The RHEPS comprises the following components: (i) hydrometeorological observations (multisensor precipitation estimates, gridded surface temperature, and gauged streamflow); (ii) weather ensemble forecasts (precipitation and near-surface temperature) from the National Centers for Environmental Prediction 11-member Global Ensemble Forecast System Reforecast version 2 (GEFSRv2); (iii) NOAA's Hydrology Laboratory-Research Distributed Hydrologic Model (HL-RDHM); (iv) heteroscedastic censored logistic regression (HCLR) as the statistical preprocessor; (v) two statistical postprocessors, an autoregressive model with a single exogenous variable (ARX(1,1)) and quantile regression (QR); and (vi) a comprehensive verification strategy. To implement the RHEPS, 1- to 7-day weather forecasts from the GEFSRv2 are used to force HL-RDHM and generate raw ensemble streamflow forecasts. Forecasting experiments are conducted in four nested basins in the US Middle Atlantic region, ranging in size from 381 to 12 362 km². Results show that the HCLR-preprocessed ensemble precipitation forecasts have greater skill than the raw forecasts. These improvements are more noticeable in the warm season at the longer lead times (> 3 days). Both postprocessors, ARX(1,1) and QR, show gains in skill relative to the raw ensemble streamflow forecasts, particularly in the cool season, but QR outperforms ARX(1,1). The scenarios that implement preprocessing and postprocessing separately tend to perform similarly, although the postprocessing-alone scenario is often more effective. The scenario involving both preprocessing and postprocessing consistently outperforms the other scenarios. In some cases
Statistical imitation system using relational interest points and Gaussian mixture models
CSIR Research Space (South Africa)
Claassens, J
2009-11-01
Full Text Available The author proposes an imitation system that uses relational interest points (RIPs) and Gaussian mixture models (GMMs) to characterize a behaviour. The system's structure is inspired by the Robot Programming by Demonstration (RPD) paradigm...
Functional statistics and related fields
Bongiorno, Enea; Cao, Ricardo; Vieu, Philippe
2017-01-01
This volume collects the latest methodological and applied contributions on functional, high-dimensional and other complex data, related statistical models and tools, as well as operator-based statistics. It contains selected and refereed contributions presented at the Fourth International Workshop on Functional and Operatorial Statistics (IWFOS 2017) held in A Coruña, Spain, from 15 to 17 June 2017. The series of IWFOS workshops was initiated by the Working Group on Functional and Operatorial Statistics at the University of Toulouse in 2008. Since then, many of the major advances in functional statistics and related fields have been periodically presented and discussed at the IWFOS workshops.
2013-01-01
This book offers a comprehensive picture of nonequilibrium phenomena in nanoscale systems. Written by internationally recognized experts in the field, this book strikes a balance between theory and experiment, and includes in-depth introductions to nonequilibrium fluctuation relations, nonlinear dynamics and transport, single molecule experiments, and molecular diffusion in nanopores. The authors explore the application of these concepts to nano- and biosystems by cross-linking key methods and ideas from nonequilibrium statistical physics, thermodynamics, stochastic theory, and dynamical s
2012-11-06
... into effect. Exclusions for exceeding a CNL will be based on full 2012 calendar-year import statistics...--Ferrosilicon containing between 55% and 80% of silicon (Russia) 2106.90.99--Miscellaneous food preparations not canned or frozen (Thailand) 9506.70.40--Ice skates w/footwear permanently attached (Thailand) The list...
Statistically significant relational data mining
Energy Technology Data Exchange (ETDEWEB)
Berry, Jonathan W.; Leung, Vitus Joseph; Phillips, Cynthia Ann; Pinar, Ali; Robinson, David Gerald; Berger-Wolf, Tanya; Bhowmick, Sanjukta; Casleton, Emily; Kaiser, Mark; Nordman, Daniel J.; Wilson, Alyson G.
2014-02-01
This report summarizes the work performed under the project "Statistically significant relational data mining." The goal of the project was to add more statistical rigor to the fairly ad hoc area of data mining on graphs. Our goal was to develop better algorithms and better ways to evaluate algorithm quality. We concentrated on algorithms for community detection, approximate pattern matching, and graph similarity measures. Approximate pattern matching involves finding an instance of a relatively small pattern, expressed with tolerance, in a large graph of data observed with uncertainty. This report gathers the abstracts and references for the eight refereed publications that have appeared as part of this work. We then archive three pieces of research that have not yet been published. The first is theoretical and experimental evidence that a popular statistical measure for comparison of community assignments favors over-resolved communities over approximations to a ground truth. The second is a set of statistically motivated methods for measuring the quality of an approximate match of a small pattern in a large graph. The third is a new probabilistic random graph model. Statisticians favor such models for graph analysis. The new local structure graph model overcomes some of the issues with popular models such as exponential random graph models and latent variable models.
Energy Technology Data Exchange (ETDEWEB)
Panka, Istvan; Hegyi, Gyoergy; Maraczy, Csaba; Temesvari, Emese [Hungarian Academy of Sciences, Budapest (Hungary). Reactor Analysis Dept.
2017-11-15
The best-estimate KARATE code system has been widely used for core design calculations and simulations of slow transients of VVER reactors. Recently there has been an increasing need for assessing the uncertainties of such calculations by propagating the basic input uncertainties of the models through the full calculation chain. In order to determine the uncertainties of quantities of interest during the burnup, the statistical version of the KARATE code system has been elaborated. In the first part of the paper, the main features of the new code system are discussed. The applied statistical method is based on Monte-Carlo sampling of the considered input data, taking into account mainly the covariance matrices of the cross sections and/or the technological uncertainties. In the second part of the paper, only the uncertainties of cross sections are considered and an equilibrium cycle related to a VVER-440 type reactor is investigated. The burnup dependence of the uncertainties of some safety-related parameters (e.g. critical boron concentration, rod worth, feedback coefficients, assembly-wise radial power and burnup distribution) is discussed and compared to the recently used limits.
Bowden, Peter; Beavis, Ron; Marshall, John
2009-11-02
A goodness of fit test may be used to assign tandem mass spectra of peptides to amino acid sequences and to directly calculate the expected probability of mis-identification. The product of the peptide expectation values directly yields the probability that the parent protein has been mis-identified. A relational database can capture the mass spectral data and the best-fit results, and permit subsequent calculations by a general statistical analysis system. The many files of the HUPO blood protein data correlated by X!TANDEM against the proteins of ENSEMBL were collected into a relational database. A redundant set of 247,077 proteins and peptides was correlated by X!TANDEM, and that set was collapsed to 34,956 peptides from 13,379 distinct proteins. About 6875 distinct proteins were represented by only a single distinct peptide, 2866 proteins showed 2 distinct peptides, and 3454 proteins showed at least three distinct peptides by X!TANDEM. More than 99% of the peptides were associated with proteins that had cumulative expectation values, i.e. probabilities of false positive identification, of one in one hundred or less. The distribution of peptides per protein from X!TANDEM was significantly different from that expected from random assignment of peptides.
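The combination rule stated in the abstract, that the product of the peptide expectation values yields the protein-level mis-identification probability, can be sketched directly. This is not X!TANDEM code, and the e-values below are invented for illustration.

```python
import math

# Sketch of the stated combination rule: the product of per-peptide
# expectation values gives the probability that the parent protein
# was mis-identified.
def protein_expectation(peptide_evalues):
    """Combine per-peptide expectation values into a protein-level value."""
    return math.prod(peptide_evalues)

# Three hypothetical peptide e-values observed for one protein
evalues = [1e-2, 5e-3, 2e-1]
p_false = protein_expectation(evalues)  # 1e-2 * 5e-3 * 2e-1 = 1e-5
```

Proteins supported by several confident peptides thus accumulate very small cumulative expectation values, which matches the abstract's "one in one hundred or less" threshold discussion.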
International Nuclear Information System (INIS)
Dai, Wu-Sheng; Xie, Mi
2013-01-01
In this paper, we give a general discussion on the calculation of the statistical distribution from a given operator relation of creation, annihilation, and number operators. Our result shows that as long as the relation between the number operator and the creation and annihilation operators can be expressed as a†b = Λ(N) or N = Λ⁻¹(a†b), where N, a†, and b denote the number, creation, and annihilation operators, i.e., N is a function of the quadratic product of the creation and annihilation operators, the corresponding statistical distribution is the Gentile distribution, a statistical distribution in which the maximum occupation number is an arbitrary integer. As examples, we discuss the statistical distributions corresponding to various operator relations. In particular, besides the Bose–Einstein and Fermi–Dirac cases, we discuss the statistical distributions for various schemes of intermediate statistics, especially various q-deformation schemes. Our result shows that the statistical distributions corresponding to various q-deformation schemes are various Gentile distributions with different maximum occupation numbers, which are determined by the deformation parameter q. This result shows that the results given in much of the literature on the q-deformation distribution are inaccurate or incomplete. Highlights: ► A general discussion on calculating the statistical distribution from relations of creation, annihilation, and number operators. ► A systematic study of the statistical distributions corresponding to various q-deformation schemes. ► An argument that many results on q-deformation distributions in the literature are inaccurate or incomplete.
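As a numerical sketch of the distribution named above: one common closed form for the mean occupation number of the Gentile distribution with maximum occupation n, written in terms of x = β(ε − μ), reduces to Fermi–Dirac for n = 1 and approaches Bose–Einstein as n grows. The function names below are illustrative.

```python
import math

# Mean occupation of the Gentile distribution with maximum occupation n,
# as a function of x = beta * (eps - mu):
#   <nu> = 1/(e^x - 1) - (n + 1)/(e^((n+1)x) - 1)
def gentile(x, n):
    return 1.0 / (math.exp(x) - 1.0) - (n + 1) / (math.exp((n + 1) * x) - 1.0)

def fermi_dirac(x):
    return 1.0 / (math.exp(x) + 1.0)

def bose_einstein(x):
    return 1.0 / (math.exp(x) - 1.0)

# n = 1 recovers Fermi-Dirac; a large maximum occupation approaches
# Bose-Einstein, illustrating the interpolation the abstract describes.
```

For n = 1 the algebra collapses exactly: 1/(eˣ−1) − 2/(e²ˣ−1) = 1/(eˣ+1).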
Statistical Thermodynamics of Disperse Systems
DEFF Research Database (Denmark)
Shapiro, Alexander
1996-01-01
Principles of statistical physics are applied for the description of thermodynamic equilibrium in disperse systems. The cells of disperse systems are shown to possess a number of non-standard thermodynamic parameters. A random distribution of these parameters in the system is determined....... On the basis of this distribution, it is established that the disperse system has an additional degree of freedom called the macro-entropy. A large set of bounded ideal disperse systems allows exact evaluation of thermodynamic characteristics. The theory developed is applied to the description of equilibrium...
Nuclear material statistical accountancy system
International Nuclear Information System (INIS)
Argentest, F.; Casilli, T.; Franklin, M.
1979-01-01
The statistical accountancy system developed at JRC Ispra is referred to as 'NUMSAS', i.e. Nuclear Material Statistical Accountancy System. The principal feature of NUMSAS is that, in addition to an ordinary material balance calculation, it can calculate an estimate of the standard deviation of the measurement error accumulated in the material balance calculation. The purpose of the report is to describe in detail the statistical model on which the standard deviation calculation is based, the computational formula used by NUMSAS in calculating the standard deviation, and the information about nuclear material measurements and the plant measurement system that is required as input data for NUMSAS. The material balance records require processing and interpretation before the material balance calculation is begun. The material balance calculation is the last of four phases of data processing undertaken by NUMSAS. Each of these phases is implemented by a different computer program. The activities carried out in each phase can be summarised as follows: the pre-processing phase; the selection and update phase; the transformation phase; and the computation phase.
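A minimal sketch of the kind of calculation described, assuming independent measurement errors per balance term. This is not the actual NUMSAS model (which handles correlated errors across a plant measurement system), and the numbers are invented.

```python
import math

# Illustrative material balance with propagated measurement uncertainty:
#   MUF = beginning inventory + receipts - shipments - ending inventory
# and, assuming independent errors, the accumulated standard deviation is
# the square root of the sum of the per-term variances.
def material_balance(begin, receipts, shipments, end):
    """Each argument is a (value, standard deviation) pair."""
    muf = begin[0] + receipts[0] - shipments[0] - end[0]
    sigma = math.sqrt(begin[1] ** 2 + receipts[1] ** 2
                      + shipments[1] ** 2 + end[1] ** 2)
    return muf, sigma

# Invented figures (kg): balance closes to 1.0 kg unaccounted-for material
muf, sigma = material_balance((100.0, 0.5), (40.0, 0.3), (35.0, 0.4), (104.0, 0.5))
# sigma = sqrt(0.25 + 0.09 + 0.16 + 0.25) = sqrt(0.75) ~ 0.87
```

Comparing the balance result against its propagated standard deviation is what lets an accountancy system judge whether an apparent loss is statistically significant.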
International Nuclear Information System (INIS)
Nemnes, G A; Anghel, D V
2010-01-01
We present a stochastic method for the simulation of the time evolution in systems which obey generalized statistics, namely fractional exclusion statistics and Gentile's statistics. The transition rates are derived in the framework of canonical ensembles. This approach introduces a tool for describing interacting fermionic and bosonic systems in non-equilibrium as ideal FES systems, in a computationally efficient manner. The two types of statistics are analyzed comparatively, indicating their intrinsic thermodynamic differences and revealing key aspects related to the species size
Statistical equilibrium and symplectic geometry in general relativity
International Nuclear Information System (INIS)
Iglesias, P.
1981-09-01
A geometrical construction is given of the statistical equilibrium states of a system of particles in a gravitational field in general relativity. By a method of localization of variables, expressions for the thermodynamic quantities are given, and this description is shown to be compatible with a macroscopic model of a relativistic continuous medium for a given value of the free-energy function. (Original in French)
Statistical mechanics of program systems
International Nuclear Information System (INIS)
Neirotti, Juan P; Caticha, Nestor
2006-01-01
We discuss the collective behaviour of a set of operators and variables that constitute a program, and the emergence of meaningful computational properties, in the language of statistical mechanics. This is done by appropriately modifying available Monte Carlo methods to deal with hierarchical structures. The study suggests, in analogy with simulated annealing, a method to automatically design programs. Reasonable solutions can be found, at low temperatures, when the method is applied to simple toy problems such as finding an algorithm that determines the roots of a function or one that performs a nonlinear regression. Peaks in the specific heat are interpreted as signalling phase transitions which separate regions where different algorithmic strategies are used to solve the problem.
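The annealing analogy can be illustrated on the toy root-finding problem the abstract mentions. This is a generic simulated-annealing sketch over real numbers, not the authors' search over program space; all parameter values (step size, cooling rate, temperatures) are invented.

```python
import math
import random

# Generic simulated annealing applied to root finding: minimize |f(x)|,
# accepting uphill moves with Boltzmann probability exp(-dcost / T)
# while the temperature is slowly lowered.
def anneal_root(f, x0=0.0, t0=1.0, cooling=0.999, steps=20000, seed=1):
    rng = random.Random(seed)
    x, cost, t = x0, abs(f(x0)), t0
    for _ in range(steps):
        cand = x + rng.gauss(0.0, 0.5)        # random proposal
        c = abs(f(cand))
        if c < cost or rng.random() < math.exp(-(c - cost) / t):
            x, cost = cand, c                 # accept move
        t *= cooling                          # geometric cooling schedule
    return x

# Toy problem: a root of x^2 - 2, i.e. +sqrt(2) or -sqrt(2)
root = anneal_root(lambda x: x ** 2 - 2.0)
```

At high temperature the walker explores broadly; as the temperature drops it settles into one of the two cost minima, which is the behaviour the specific-heat peaks in the abstract diagnose at the level of whole program populations.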
Statistics and Corporate Environmental Management: Relations and Problems
DEFF Research Database (Denmark)
Madsen, Henning; Ulhøi, John Parm
1997-01-01
Statistical methods have long been used to analyse the macroeconomic consequences of environmentally damaging activities, political actions to control, prevent, or reduce these damages, and environmental problems in the natural environment. Up to now, however, they have had a limited and not very...... specific use in corporate environmental management systems. This paper will address some of the special problems related to the use of statistical techniques in corporate environmental management systems. One important aspect of this is the interaction of internal decisions and activities with conditions...
Automated statistical modeling of analytical measurement systems
International Nuclear Information System (INIS)
Jacobson, J.J.
1992-01-01
The statistical modeling of analytical measurement systems at the Idaho Chemical Processing Plant (ICPP) has been completely automated through computer software. The statistical modeling of analytical measurement systems is one part of a complete quality control program used by the Remote Analytical Laboratory (RAL) at the ICPP. The quality control program is an integration of automated data input, measurement system calibration, database management, and statistical process control. The quality control program and statistical modeling program meet the guidelines set forth by the American Society for Testing and Materials and the American National Standards Institute. A statistical model is a set of mathematical equations describing any systematic bias inherent in a measurement system and the precision of a measurement system. A statistical model is developed from data generated from the analysis of control standards. Control standards are samples which are made up at precisely known levels by an independent laboratory and submitted to the RAL. The RAL analysts who process control standards do not know the values of those control standards. The object of statistical modeling is to describe real process samples in terms of their bias and precision and to verify that a measurement system is operating satisfactorily. The processing of control standards gives us this ability.
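The bias-and-precision estimation from control standards can be sketched as follows, assuming a single control standard with a known reference value. The numbers are invented, and this is not the ICPP software.

```python
import statistics

# Simplified sketch of the idea above: estimate a measurement system's
# systematic bias and its precision from repeated blind analyses of a
# control standard whose true value is known to an independent laboratory.
def bias_and_precision(measured, known_value):
    bias = statistics.mean(measured) - known_value   # systematic offset
    precision = statistics.stdev(measured)           # sample std deviation
    return bias, precision

# Five hypothetical blind runs on a standard whose true value is 10.00
runs = [10.12, 10.08, 10.15, 10.05, 10.10]
bias, precision = bias_and_precision(runs, known_value=10.00)
```

A real quality control program would track these two quantities over time on control charts to verify that the measurement system stays in statistical control.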
Former Prisoner of War Statistical Tracking System
Department of Veterans Affairs — The Former Prisoner of War (POW) Statistical Tracking System database is a registry designed to comply with Public Law 97-37, the Former Prisoner of War Benefits Act...
A statistical model for instable thermodynamical systems
International Nuclear Information System (INIS)
Sommer, Jens-Uwe
2003-01-01
A generic model is presented for statistical systems which display thermodynamic features in contrast to our everyday experience, such as infinite and negative heat capacities. Such systems are unstable in terms of classical equilibrium thermodynamics. Using our statistical model, we are able to investigate states of unstable systems which are undefined in the framework of equilibrium thermodynamics. We show that a region of negative heat capacity in the adiabatic environment leads to a first-order-like phase transition when the system is coupled to a heat reservoir. This phase transition takes place without phase coexistence. Nevertheless, all intermediate states are stable due to fluctuations. When two unstable systems are brought into thermal contact, the temperature of the composed system is lower than the minimum temperature of the individual systems. Generally, the equilibrium states of unstable systems cannot simply be decomposed into equilibrium states of the individual systems. The properties of unstable systems depend on the environment; ensemble equivalence is broken.
Thermal equilibrium and statistical thermometers in special relativity.
Cubero, David; Casado-Pascual, Jesús; Dunkel, Jörn; Talkner, Peter; Hänggi, Peter
2007-10-26
There is an intense debate in the recent literature about the correct generalization of Maxwell's velocity distribution in special relativity. The most frequently discussed candidate distributions include the Jüttner function as well as modifications thereof. Here we report results from fully relativistic one-dimensional molecular dynamics simulations that resolve the ambiguity. The numerical evidence unequivocally favors the Jüttner distribution. Moreover, our simulations illustrate that the concept of "thermal equilibrium" extends naturally to special relativity only if a many-particle system is spatially confined. They make evident that "temperature" can be statistically defined and measured in an observer frame independent way.
Statistical evaluation of design-error related nuclear reactor accidents
International Nuclear Information System (INIS)
Ott, K.O.; Marchaterre, J.F.
1981-01-01
In this paper, a general methodology for the statistical evaluation of design-error related accidents is proposed that can be applied to a variety of systems that evolve during the development of large-scale technologies. The evaluation aims at an estimate of the combined "residual" frequency of yet unknown types of accidents "lurking" in a certain technological system. A special categorization into incidents and accidents is introduced to define the events that should be jointly analyzed. The resulting formalism is applied to the development of U.S. nuclear power reactor technology, considering serious accidents (category 2 events) that involved, in the accident progression, a particular design inadequacy. 9 refs
Topics in computer simulations of statistical systems
International Nuclear Information System (INIS)
Salvador, R.S.
1987-01-01
Several computer simulations studying a variety of topics in statistical mechanics and lattice gauge theories are performed. The first study describes a Monte Carlo simulation performed on Ising systems defined on Sierpinski carpets of dimensions between one and four. The critical coupling and the exponent γ are measured as a function of dimension. The Ising gauge theory in d = 4 − ε, for ε → 0⁺, is then studied by performing a Monte Carlo simulation for the theory defined on fractals. A high-statistics Monte Carlo simulation for the three-dimensional Ising model is presented for lattices of sizes 8³ to 44³. All the data obtained agree completely, within statistical errors, with the forms predicted by finite-size scaling. Finally, a method to estimate numerically the partition function of statistical systems is developed.
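A minimal sketch of the kind of Metropolis Monte Carlo simulation described above, on a far smaller 2D Ising lattice than those studied; the lattice size, temperatures, and sweep counts are illustrative only.

```python
import math
import random

# Metropolis single-spin-flip dynamics for the 2D Ising model with
# periodic boundaries (coupling J = 1, energy E = -sum s_i s_j).
def metropolis_ising(L=8, T=1.0, sweeps=3000, seed=7):
    rng = random.Random(seed)
    s = [[1] * L for _ in range(L)]        # start from the ordered state
    for _ in range(sweeps):
        for _ in range(L * L):             # one sweep = L*L flip attempts
            i, j = rng.randrange(L), rng.randrange(L)
            nb = (s[(i + 1) % L][j] + s[(i - 1) % L][j]
                  + s[i][(j + 1) % L] + s[i][(j - 1) % L])
            dE = 2 * s[i][j] * nb          # energy change of flipping s[i][j]
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                s[i][j] = -s[i][j]
    return sum(sum(row) for row in s) / (L * L)   # magnetization per spin

m_cold = metropolis_ising(T=1.0)   # below Tc ~ 2.27: stays magnetized
m_hot = metropolis_ising(T=5.0)    # above Tc: the ordered start disorders
```

Production runs like those in the abstract would add equilibration, measure observables over many configurations, and repeat across lattice sizes for the finite-size scaling analysis.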
Statistical methods for spatio-temporal systems
Finkenstadt, Barbel
2006-01-01
Statistical Methods for Spatio-Temporal Systems presents current statistical research issues on spatio-temporal data modeling and will promote advances in research and a greater understanding between the mechanistic and the statistical modeling communities.Contributed by leading researchers in the field, each self-contained chapter starts with an introduction of the topic and progresses to recent research results. Presenting specific examples of epidemic data of bovine tuberculosis, gastroenteric disease, and the U.K. foot-and-mouth outbreak, the first chapter uses stochastic models, such as point process models, to provide the probabilistic backbone that facilitates statistical inference from data. The next chapter discusses the critical issue of modeling random growth objects in diverse biological systems, such as bacteria colonies, tumors, and plant populations. The subsequent chapter examines data transformation tools using examples from ecology and air quality data, followed by a chapter on space-time co...
Statistical validation of earthquake related observations
Kossobokov, V. G.
2011-12-01
The confirmed fractal nature of earthquakes and their distribution in space and time implies that many traditional estimations of seismic hazard (from term-less to short-term ones) are usually based on erroneous assumptions of easily tractable or, conversely, delicately designed models. The widespread practice of deceptive modeling, considered a "reasonable proxy" of the natural seismic process, leads to seismic hazard assessment of unknown quality, whose errors propagate non-linearly into the resulting estimates of risk and, eventually, into unexpected societal losses of unacceptable level. Studies aimed at forecast/prediction of earthquakes must include validation in retrospective (at least) and, eventually, prospective tests. In the absence of such control a suggested "precursor/signal" remains a "candidate", whose link to the target seismic event is a model assumption. Predicting in advance is the only decisive test of forecasts/predictions, and, therefore, the score-card of any "established precursor/signal", represented by the empirical probabilities of alarms and failures-to-predict achieved in prospective testing, must prove statistical significance by rejecting the null hypothesis of random coincidental occurrence with target earthquakes. We reiterate the so-called "Seismic Roulette" null hypothesis as the most adequate undisturbed random alternative accounting for the empirical spatial distribution of earthquakes: (i) consider a roulette wheel with as many sectors as the number of earthquake locations from a sample catalog representing the seismic locus, a sector per location; (ii) make your bet according to the prediction (i.e., determine which locations are inside the area of alarm, and put one chip in each of the corresponding sectors); (iii) Nature turns the wheel; (iv) accumulate statistics of wins and losses along with the number of chips spent. If a precursor in charge of prediction exposes an imperfection of Seismic Roulette then, having in mind
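The chance expectation behind the Seismic Roulette construction can be sketched with an exact binomial tail. This simplifies the authors' procedure by assuming each target event independently hits the alarm with probability k/N; the counts below are invented.

```python
from math import comb

# Under the roulette null hypothesis, an alarm covering k of the N catalog
# locations is hit by each target event with probability p = k / N.
# The one-sided p-value for observing h or more hits in n events is the
# binomial tail probability.
def roulette_p_value(n_events, hits, k_alarm, n_locations):
    p = k_alarm / n_locations
    return sum(comb(n_events, h) * p ** h * (1 - p) ** (n_events - h)
               for h in range(hits, n_events + 1))

# Hypothetical score card: 9 of 10 target events fell inside an alarm
# covering 20% of the locations; chance alone makes this very unlikely.
pv = roulette_p_value(10, 9, 200, 1000)
```

A small p-value rejects the null hypothesis of coincidental success, which is exactly the significance test the abstract demands of any "established precursor/signal".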
Features of statistical dynamics in a finite system
International Nuclear Information System (INIS)
Yan, Shiwei; Sakata, Fumihiko; Zhuo Yizhong
2002-01-01
We study features of statistical dynamics in a finite Hamiltonian system composed of a relevant degree of freedom coupled to an irrelevant multi-degree-of-freedom system through a weak interaction. Special attention is paid to how the statistical dynamics changes depending on the number of degrees of freedom in the irrelevant system. It is found that the macrolevel statistical aspects are strongly related to the appearance of microlevel chaotic motion, and that dissipation of the relevant motion is realized by passing through three distinct stages: the dephasing, statistical relaxation, and equilibrium regimes. It is clarified that the dynamical description and the conventional transport approach provide almost the same macrolevel and microlevel mechanisms only for a system with a very large number of irrelevant degrees of freedom. It is also shown that the statistical relaxation in the finite system is an anomalous diffusion and that the fluctuation effects have a finite correlation time.
A statistical approach to root system classification.
Directory of Open Access Journals (Sweden)
Gernot Bodner
2013-08-01
Full Text Available Plant root systems have a key role in ecology and agronomy. In spite of a fast increase in root studies, there is still no classification that allows distinguishing among distinctive characteristics within the diversity of rooting strategies. Our hypothesis is that a multivariate approach for plant functional type identification in ecology can be applied to the classification of root systems. We demonstrate that combining principal component and cluster analysis yields a meaningful classification of rooting types based on morphological traits. The classification method presented is based on a data-defined statistical procedure without a priori decisions on the classifiers. Biplot inspection is used to determine key traits and to ensure stability in cluster-based grouping. The classification method is exemplified with simulated root architectures and morphological field data. Simulated root architectures showed that morphological attributes with spatial distribution parameters capture the most distinctive features within root system diversity. While developmental type (tap vs. shoot-borne systems) is a strong but coarse classifier, topological traits provide the most detailed differentiation among distinctive groups. The adequacy of commonly available morphological traits for classification is supported by field data. Three rooting types emerged from the measured data, distinguished by diameter/weight, density, and spatial distribution, respectively. Similarity of root systems within distinctive groups was the joint result of phylogenetic relation and environmental as well as human selection pressure. We conclude that the data-defined classification is appropriate for integration of knowledge obtained with different root measurement methods and at various scales. Currently, root morphology is the most promising basis for classification due to widely used common measurement protocols. To capture details of root diversity efforts in architectural measurement
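The principal component plus cluster analysis pipeline can be sketched on synthetic trait data. Everything below (trait values, group means, the two-cluster choice) is invented for illustration and is not the authors' data or code.

```python
import numpy as np

# Two synthetic "rooting types", 20 samples each, 4 morphological traits
# per sample (all values invented for illustration).
rng = np.random.default_rng(0)
deep = rng.normal(loc=[2.0, 0.5, 1.5, 0.2], scale=0.1, size=(20, 4))
shallow = rng.normal(loc=[0.5, 2.0, 0.3, 1.8], scale=0.1, size=(20, 4))
X = np.vstack([deep, shallow])

# Principal components from the centred data matrix via SVD
Xc = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ vt[:2].T                 # project onto the first two PCs

# Minimal 2-means clustering on the PCA scores
centres = scores[[0, -1]].copy()       # crude initialization: two extremes
for _ in range(20):
    dists = ((scores[:, None, :] - centres[None]) ** 2).sum(axis=-1)
    labels = np.argmin(dists, axis=1)
    centres = np.array([scores[labels == g].mean(axis=0) for g in (0, 1)])
```

With well-separated trait profiles, the clusters recover the two rooting types exactly; the paper's biplot inspection step would then relate each cluster back to the traits that drive the separation.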
Statistics and Corporate Environmental Management: Relations and Problems
DEFF Research Database (Denmark)
Madsen, Henning; Ulhøi, John Parm
1997-01-01
Statistical methods have long been used to analyse the macroeconomic consequences of environmentally damaging activities, political actions to control, prevent, or reduce these damages, and environmental problems in the natural environment. Up to now, however, they have had a limited and not very...... specific use in corporate environmental management systems. This paper will address some of the special problems related to the use of statistical techniques in corporate environmental management systems. One important aspect of this is the interaction of internal decisions and activities with conditions...... in the external environment. The nature and extent of the practical use of quantitative techniques in corporate environmental management systems is discussed on the basis of a number of company surveys in four European countries.
Statistical mechanics in the context of special relativity.
Kaniadakis, G
2002-11-01
In Ref. [Physica A 296, 405 (2001)], starting from the one-parameter deformation of the exponential function exp_κ(x) = (√(1 + κ²x²) + κx)^(1/κ), a statistical mechanics has been constructed which reduces to the ordinary Boltzmann-Gibbs statistical mechanics as the deformation parameter κ approaches zero. The distribution f = exp_κ(−βE + βμ) obtained within this statistical mechanics shows a power-law tail and depends on the nonspecified parameter β, containing all the information about the temperature of the system. On the other hand, the entropic form S_κ = ∫ d³p (c_κ f^(1+κ) + c_(−κ) f^(1−κ)), which after maximization produces the distribution f and reduces to the standard Boltzmann-Shannon entropy S₀ as κ → 0, contains the coefficient c_κ whose expression involves, besides the Boltzmann constant, another nonspecified parameter α. In the present effort we show that S_κ is the unique existing entropy obtained by a continuous deformation of S₀ that preserves unaltered its fundamental properties of concavity, additivity, and extensivity. These properties of S_κ permit us to determine unequivocally the values of the above-mentioned parameters β and α. Subsequently, we explain the origin of the deformation mechanism introduced by κ and show that this deformation emerges naturally within Einstein's special relativity. Furthermore, we extend the theory in order to treat statistical systems in a time-dependent and relativistic context. Then, we show that it is possible to determine in a self-consistent scheme within special relativity the values of the free parameter κ, which turns out to depend on the light speed c and reduces to zero as c → ∞, recovering in this way the ordinary statistical mechanics and thermodynamics. The statistical mechanics presented here does not contain free parameters, and preserves unaltered the mathematical and epistemological structure of
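The deformed exponential quoted above can be checked numerically: as κ → 0 it recovers the ordinary exponential, and it satisfies the identity exp_κ(x)·exp_κ(−x) = 1, since (√(1+κ²x²)+κx)(√(1+κ²x²)−κx) = 1. The function name is illustrative.

```python
import math

# The kappa-deformed exponential from the abstract:
#   exp_kappa(x) = (sqrt(1 + kappa^2 x^2) + kappa x)^(1/kappa)
def exp_kappa(x, kappa):
    if kappa == 0.0:
        return math.exp(x)          # the kappa -> 0 limit
    return (math.sqrt(1.0 + kappa ** 2 * x ** 2) + kappa * x) ** (1.0 / kappa)

# For small kappa the deformation is negligible: exp_kappa(1, 1e-6) ~ e.
# For large |x| the deformed exponential grows only as a power law,
# which is the origin of the power-law tail of the distribution f.
```

The power-law tail is visible directly: for κ > 0 and x → ∞, exp_κ(x) ≈ (2κx)^(1/κ), in contrast with the ordinary exponential growth at κ = 0.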
Statistical Model Checking for Biological Systems
DEFF Research Database (Denmark)
David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel
2014-01-01
Statistical Model Checking (SMC) is a highly scalable simulation-based verification approach for testing and estimating the probability that a stochastic system satisfies a given linear temporal property. The technique has been applied to (discrete and continuous time) Markov chains, stochastic...
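At its core, SMC estimates a satisfaction probability by sampling simulated traces. A deliberately simplified sketch (real SMC engines add, e.g., sequential hypothesis testing and confidence bounds):

```python
import random

def smc_estimate(simulate_trace, satisfies, n_runs=10000, seed=1):
    # Statistical model checking by simulation: sample n_runs random traces
    # of the stochastic system and return the fraction satisfying the property.
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_runs) if satisfies(simulate_trace(rng)))
    return hits / n_runs

# Toy stochastic system: three fair coin flips; property: "True occurs".
p = smc_estimate(lambda rng: [rng.random() < 0.5 for _ in range(3)], any)
# p should be close to the exact probability 1 - 0.5**3 = 0.875
```

The estimate converges at the usual Monte Carlo rate, which is what makes the approach "highly scalable": cost grows with the number of runs, not with the state space.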
Statistics of multi-tube detecting systems
International Nuclear Information System (INIS)
Grau Carles, P.; Grau Malonda, A.
1994-01-01
In this paper three new statistical theorems are demonstrated and applied. These theorems greatly simplify the derivation of the formulae used to compute the counting efficiency when the detection system is formed by several photomultipliers associated in coincidence and sum. The theorems are applied to several photomultiplier arrangements in order to show their potential and the way they are applied.
Statistical mechanics of systems of unbounded spins
Energy Technology Data Exchange (ETDEWEB)
Lebowitz, J L [Yeshiva Univ., New York (USA). Belfer Graduate School of Science; Presutti, E [L' Aquila Univ. (Italy). Istituto di Matematica
1976-11-01
We develop the statistical mechanics of unbounded n-component spin systems interacting via potentials which are superstable and strongly tempered. The uniqueness of the equilibrium state is then proven for one component ferromagnetic spins whose free energy is differentiable with respect to the magnetic field.
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
Statistical evaluation of design-error related accidents
International Nuclear Information System (INIS)
Ott, K.O.; Marchaterre, J.F.
1980-01-01
In a recently published paper (Campbell and Ott, 1979), a general methodology was proposed for the statistical evaluation of design-error related accidents. The evaluation aims at an estimate of the combined residual frequency of yet unknown types of accidents lurking in a certain technological system. Here, the original methodology is extended so as to apply to the variety of systems that evolves during the development of large-scale technologies. A special categorization of incidents and accidents is introduced to define the events that should be jointly analyzed. The resulting formalism is applied to the development of nuclear power reactor technology, considering serious accidents whose progression involves a particular design inadequacy.
Quantum statistics of many-particle systems
International Nuclear Information System (INIS)
Kraeft, W.D.; Ebeling, W.; Kremp, D.; Ropke, G.
1986-01-01
This paper presents the elements of quantum statistics and discusses the quantum mechanics of many-particle systems. The method of second quantization is discussed and the Bogolyubov hierarchy is examined. The general properties of the correlation function and one-particle Green's function are examined. The paper presents dynamical and thermodynamical information contained in the spectral function. An equation of motion is given for the one-particle Green's function. T-matrix and thermodynamic properties in binary collision approximation are discussed
Statistical mechanics of driven diffusive systems
Schmittmann, B
1995-01-01
Far-from-equilibrium phenomena, while abundant in nature, are not nearly as well understood as their equilibrium counterparts. On the theoretical side, progress is slowed by the lack of a simple framework, such as the Boltzmann-Gibbs paradigm in the case of equilibrium thermodynamics. On the experimental side, the enormous structural complexity of real systems poses serious obstacles to comprehension. Similar difficulties have been overcome in equilibrium statistical mechanics by focusing on model systems. Even if they seem too simplistic for known physical systems, models give us considerable insight, provided they capture the essential physics. They serve as important theoretical testing grounds where the relationship between the generic physical behavior and the key ingredients of a successful theory can be identified and understood in detail. Within the vast realm of non-equilibrium physics, driven diffusive systems form a subset with particularly interesting properties. As a prototype model for these syst...
Statistical modeling to support power system planning
Staid, Andrea
This dissertation focuses on data-analytic approaches that improve our understanding of power system applications to promote better decision-making. It tackles issues of risk analysis, uncertainty management, resource estimation, and the impacts of climate change. Tools of data mining and statistical modeling are used to bring new insight to a variety of complex problems facing today's power system. The overarching goal of this research is to improve the understanding of the power system risk environment for improved operation, investment, and planning decisions. The first chapter introduces some challenges faced in planning for a sustainable power system. Chapter 2 analyzes the driving factors behind the disparity in wind energy investments among states with a goal of determining the impact that state-level policies have on incentivizing wind energy. Findings show that policy differences do not explain the disparities; physical and geographical factors are more important. Chapter 3 extends conventional wind forecasting to a risk-based focus of predicting maximum wind speeds, which are dangerous for offshore operations. Statistical models are presented that issue probabilistic predictions for the highest wind speed expected in a three-hour interval. These models achieve a high degree of accuracy and their use can improve safety and reliability in practice. Chapter 4 examines the challenges of wind power estimation for onshore wind farms. Several methods for wind power resource assessment are compared, and the weaknesses of the Jensen model are demonstrated. For two onshore farms, statistical models outperform other methods, even when very little information is known about the wind farm. Lastly, chapter 5 focuses on the power system more broadly in the context of the risks expected from tropical cyclones in a changing climate. Risks to U.S. power system infrastructure are simulated under different scenarios of tropical cyclone behavior that may result from climate
Statistical Physics of Complex Substitutive Systems
Jin, Qing
Diffusion processes are central to human interactions. Despite extensive studies that span multiple disciplines, our knowledge is limited to spreading processes in non-substitutive systems. Yet, a considerable number of ideas, products, and behaviors spread by substitution; to adopt a new one, agents must give up an existing one. This captures the spread of scientific constructs--forcing scientists to choose, for example, a deterministic or probabilistic worldview--as well as the adoption of durable items, such as mobile phones, cars, or homes. In this dissertation, I develop a statistical physics framework to describe, quantify, and understand substitutive systems. By empirically exploring three collected high-resolution datasets pertaining to such systems, I build a mechanistic model describing substitutions, which not only analytically predicts the universal macroscopic phenomenon discovered in the collected datasets, but also accurately captures the trajectories of individual items in a complex substitutive system, demonstrating a high degree of regularity and universality in substitutive systems. I also discuss the origins and interpretation of the parameters in the substitution model and possible generalized forms of the mathematical framework. The systematic study of substitutive systems presented in this dissertation could potentially guide the understanding and prediction of all spreading phenomena driven by substitution, from electric cars to scientific paradigms, and from renewable energy to new healthy habits.
A system for learning statistical motion patterns.
Hu, Weiming; Xiao, Xuejuan; Fu, Zhouyu; Xie, Dan; Tan, Tieniu; Maybank, Steve
2006-09-01
Analysis of motion patterns is an effective approach for anomaly detection and behavior prediction. Current approaches for the analysis of motion patterns depend on known scenes, where objects move in predefined ways. It is highly desirable to automatically construct object motion patterns which reflect the knowledge of the scene. In this paper, we present a system for automatically learning motion patterns for anomaly detection and behavior prediction based on a proposed algorithm for robustly tracking multiple objects. In the tracking algorithm, foreground pixels are clustered using a fast accurate fuzzy K-means algorithm. Growing and prediction of the cluster centroids of foreground pixels ensure that each cluster centroid is associated with a moving object in the scene. In the algorithm for learning motion patterns, trajectories are clustered hierarchically using spatial and temporal information and then each motion pattern is represented with a chain of Gaussian distributions. Based on the learned statistical motion patterns, statistical methods are used to detect anomalies and predict behaviors. Our system is tested using image sequences acquired, respectively, from a crowded real traffic scene and a model traffic scene. Experimental results show the robustness of the tracking algorithm, the efficiency of the algorithm for learning motion patterns, and the encouraging performance of algorithms for anomaly detection and behavior prediction.
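The tracking stage above rests on fuzzy K-means clustering of foreground pixels. A minimal 1-D sketch of standard fuzzy K-means (fuzzy c-means) soft-membership updates — not the authors' accelerated variant — illustrates the idea:

```python
import random

def fuzzy_kmeans(points, k, m=2.0, iters=100, seed=0):
    # Standard fuzzy K-means: each point holds a soft membership u in every
    # cluster; centroids are means weighted by u**m (m is the "fuzzifier").
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    memberships = []
    for _ in range(iters):
        memberships = []
        for x in points:
            # u_j proportional to d_j**(-2/(m-1)), normalized over clusters
            d = [max(abs(x - c), 1e-12) for c in centroids]
            inv = [dj ** (-2.0 / (m - 1.0)) for dj in d]
            s = sum(inv)
            memberships.append([v / s for v in inv])
        for j in range(k):
            w = [u[j] ** m for u in memberships]
            centroids[j] = sum(wi * x for wi, x in zip(w, points)) / sum(w)
    return centroids, memberships
```

On two well-separated 1-D groups this converges to the group centres, with memberships near 0 or 1; in the paper the same soft clustering is applied to 2-D foreground-pixel coordinates.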
Statistical analysis of the uncertainty related to flood hazard appraisal
Notaro, Vincenza; Freni, Gabriele
2015-12-01
The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides the evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and of the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify some hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrixes). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis; therefore hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainty intrinsic to flood risk appraisal can be related to the hazard evaluation, owing to the uncertainty inherent in modelling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology was proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.
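The user-defined threshold step whose subjectivity the study targets can be made concrete with a toy classification. The indicator and thresholds below are invented for illustration, not taken from the study:

```python
def hazard_level(depth_m, velocity_ms):
    # Toy hazard indicator: HI = depth * velocity (one common choice),
    # mapped to a hazard class by illustrative user-defined thresholds.
    hi = depth_m * velocity_ms
    if hi < 0.2:
        return "low"
    if hi < 0.5:
        return "medium"
    return "high"
```

Different, equally plausible thresholds yield different hazard maps from the same simulated depths and velocities; that sensitivity is exactly the kind of uncertainty a statistical methodology of this type quantifies.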
Statistical dynamics of ultradiffusion in hierarchical systems
International Nuclear Information System (INIS)
Gardner, S.
1987-01-01
In many types of disordered systems which exhibit frustration and competition, an ultrametric topology is found to exist in the space of allowable states. This ultrametric topology of states is associated with a hierarchical relaxation process called ultradiffusion. Ultradiffusion occurs in hierarchical non-linear (HNL) dynamical systems when constraints cause large scale, slow modes of motion to be subordinated to small scale, fast modes. Examples of ultradiffusion are found throughout condensed matter physics and critical phenomena (e.g. the states of spin glasses), in biophysics (e.g. the states of Hopfield networks) and in many other fields including layered computing based upon nonlinear dynamics. The statistical dynamics of ultradiffusion can be treated as a random walk on an ultrametric space. For reversible bifurcating ultrametric spaces the evolution equation governing the probability of a particle being found at site i at time t has a highly degenerate transition matrix. This transition matrix has a fractal geometry similar to the replica form proposed for spin glasses. The authors invert this fractal matrix using a recursive quad-tree (QT) method. Possible applications of hierarchical systems to communications and symbolic computing are discussed briefly
Fundamental link between system theory and statistical mechanics
International Nuclear Information System (INIS)
Atmanspacher, H.; Scheingraber, H.
1987-01-01
A fundamental link between system theory and statistical mechanics is established by the Kolmogorov entropy K. By this quantity the temporal evolution of dynamical systems can be classified into regular, chaotic, and stochastic processes. Since K represents a measure for the internal information creation rate of dynamical systems, it provides an approach to irreversibility. The formal relationship to statistical mechanics is derived by means of an operator formalism originally introduced by Prigogine. For a Liouville operator L and an information operator M̃ acting on a distribution in phase space, it is shown that i[L, M̃] = K·I (I = identity operator). As a first consequence of this equivalence, a relation is obtained between the chaotic correlation time of a system and Prigogine's concept of a finite duration of presence. Finally, the existence of chaos in quantum systems is discussed with respect to the existence of a quantum mechanical time operator.
Managing risk in statistics - "Relative risk" | Durrheim | South African ...
African Journals Online (AJOL)
South African Family Practice, Vol 45, No 8 (2003): Managing risk in statistics - "Relative risk". DN Durrheim.
Sensitivity analysis and related analysis : A survey of statistical techniques
Kleijnen, J.P.C.
1995-01-01
This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical
An application of an optimal statistic for characterizing relative orientations
Jow, Dylan L.; Hill, Ryley; Scott, Douglas; Soler, J. D.; Martin, P. G.; Devlin, M. J.; Fissel, L. M.; Poidevin, F.
2018-02-01
We present the projected Rayleigh statistic (PRS), a modification of the classic Rayleigh statistic, as a test for non-uniform relative orientation between two pseudo-vector fields. In the application here, this gives an effective way of investigating whether polarization pseudo-vectors (spin-2 quantities) are preferentially parallel or perpendicular to filaments in the interstellar medium. There are other potential applications in astrophysics, e.g. when comparing small-scale orientations with larger-scale shear patterns. We compare the efficiency of the PRS against histogram-binning methods that have previously been used for characterizing the relative orientations of gas column density structures with the magnetic field projected on the plane of the sky. We examine data for the Vela C molecular cloud, where the column density is inferred from Herschel submillimetre observations, and the magnetic field from observations by the Balloon-borne Large-Aperture Submillimetre Telescope in the 250-, 350- and 500-μm wavelength bands. We find that the PRS has greater statistical power than approaches that bin the relative orientation angles, as it makes more efficient use of the information contained in the data. In particular, the use of the PRS to test for preferential alignment results in a higher statistical significance in each of the four Vela C regions, with the greatest increase being by a factor of 1.3 in the South-Nest region in the 250-μm band.
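Up to normalization conventions, the PRS for relative-orientation angles φ (defined modulo π for spin-2 quantities) can be sketched as follows; the doubling of the angles and the √(n/2) normalization are our reading of the construction, so treat the details as an assumption:

```python
import math

def projected_rayleigh(phis):
    # Projected Rayleigh statistic for relative-orientation angles phi
    # (radians, defined modulo pi).  Doubling the angles maps "parallel"
    # to 0 and "perpendicular" to pi; under uniform random orientations
    # Z is approximately N(0, 1), so Z >> 0 indicates preferentially
    # parallel alignment and Z << 0 preferentially perpendicular.
    n = len(phis)
    return sum(math.cos(2 * p) for p in phis) / math.sqrt(n / 2)

# Perfectly parallel sample: strongly positive
z_par = projected_rayleigh([0.0] * 50)          # = 50 / sqrt(25) = 10
# Perfectly perpendicular sample: strongly negative
z_perp = projected_rayleigh([math.pi / 2] * 50)
```

Because the statistic aggregates all angles into one number rather than binning them, no information is lost to bin boundaries, which is the source of the extra statistical power claimed in the abstract.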
Augmented Automated Material Accounting Statistics System (AMASS)
International Nuclear Information System (INIS)
Lumb, R.F.; Messinger, M.; Tingey, F.H.
1983-01-01
This paper describes an extension of the AMASS methodology which was previously presented at the 1981 INMM annual meeting. The main thrust of the current effort is to develop procedures and a computer program for estimating the variance of an Inventory Difference when many sources of variability, other than measurement error, are admitted in the model. Procedures also are included for the estimation of the variances associated with measurement error estimates and their effect on the estimated limit of error of the inventory difference (LEID). The algorithm for the LEID measurement component uncertainty involves the propagated component measurement variance estimates as well as their associated degrees of freedom. The methodology and supporting computer software is referred to as the augmented Automated Material Accounting Statistics System (AMASS). Specifically, AMASS accommodates five source effects: (1) measurement errors; (2) known but unmeasured effects; (3) measurement adjustment effects; (4) unmeasured process hold-up effects; (5) residual process variation. A major result of this effort is a procedure for determining the effect of bias correction on LEID, properly taking into account all the covariances that exist. This paper briefly describes the basic models that are assumed, some of the estimation procedures consistent with the model, and the data requirements, emphasizing availability and other practical considerations; it then discusses the implications for bias corrections and concludes by briefly describing the supporting computer program.
Information transport in classical statistical systems
Wetterich, C.
2018-02-01
For "static memory materials" the bulk properties depend on boundary conditions. Such materials can be realized by classical statistical systems which admit no unique equilibrium state. We describe the propagation of information from the boundary to the bulk by classical wave functions. The dependence of wave functions on the location of hypersurfaces in the bulk is governed by a linear evolution equation that can be viewed as a generalized Schrödinger equation. Classical wave functions obey the superposition principle, with local probabilities realized as bilinears of wave functions. For static memory materials the evolution within a subsector is unitary, as characteristic for the time evolution in quantum mechanics. The space-dependence in static memory materials can be used as an analogue representation of the time evolution in quantum mechanics - such materials are "quantum simulators". For example, an asymmetric Ising model on a Euclidean two-dimensional lattice represents the time evolution of free relativistic fermions in two-dimensional Minkowski space.
Statistical regularities in art: Relations with visual coding and perception.
Graham, Daniel J; Redies, Christoph
2010-07-21
Since at least 1935, vision researchers have used art stimuli to test human response to complex scenes. This is sensible given the "inherent interestingness" of art and its relation to the natural visual world. The use of art stimuli has remained popular, especially in eye tracking studies. Moreover, stimuli in common use by vision scientists are inspired by the work of famous artists (e.g., Mondrians). Artworks are also popular in vision science as illustrations of a host of visual phenomena, such as depth cues and surface properties. However, until recently, there has been scant consideration of the spatial, luminance, and color statistics of artwork, and even less study of ways that regularities in such statistics could affect visual processing. Furthermore, the relationship between regularities in art images and those in natural scenes has received little or no attention. In the past few years, there has been a concerted effort to study statistical regularities in art as they relate to neural coding and visual perception, and art stimuli have begun to be studied in rigorous ways, as natural scenes have been. In this minireview, we summarize quantitative studies of links between regular statistics in artwork and processing in the visual stream. The results of these studies suggest that art is especially germane to understanding human visual coding and perception, and it therefore warrants wider study. Copyright 2010 Elsevier Ltd. All rights reserved.
A statistical investigation of the mass discrepancy-acceleration relation
Desmond, Harry
2017-02-01
We use the mass discrepancy-acceleration relation (the correlation between the ratio of total-to-visible mass and acceleration in galaxies; MDAR) to test the galaxy-halo connection. We analyse the MDAR using a set of 16 statistics that quantify its four most important features: shape, scatter, the presence of a 'characteristic acceleration scale', and the correlation of its residuals with other galaxy properties. We construct an empirical framework for the galaxy-halo connection in LCDM to generate predictions for these statistics, starting with conventional correlations (halo abundance matching; AM) and introducing more where required. Comparing to the SPARC data, we find that: (1) the approximate shape of the MDAR is readily reproduced by AM, and there is no evidence that the acceleration at which dark matter becomes negligible has less spread in the data than in AM mocks; (2) even under conservative assumptions, AM significantly overpredicts the scatter in the relation and its normalization at low acceleration, and furthermore positions dark matter too close to galaxies' centres on average; (3) the MDAR affords 2σ evidence for an anticorrelation of galaxy size and Hubble type with halo mass or concentration at fixed stellar mass. Our analysis lays the groundwork for a bottom-up determination of the galaxy-halo connection from relations such as the MDAR, provides concrete statistical tests for specific galaxy formation models, and brings into sharper focus the relative evidence accorded by galaxy kinematics to LCDM and modified gravity alternatives.
Symmetries and statistical behavior in fermion systems
International Nuclear Information System (INIS)
French, J.B.; Draayer, J.P.
1978-01-01
The interplay between statistical behavior and symmetries in nuclei, as revealed, for example, by spectra and by distributions for various kinds of excitations is considered. Methods and general results, rather than specific applications, are given. 16 references
Symmetries and statistical behavior in fermion systems
Energy Technology Data Exchange (ETDEWEB)
French, J.B.; Draayer, J.P.
1978-01-01
The interplay between statistical behavior and symmetries in nuclei, as revealed, for example, by spectra and by distributions for various kinds of excitations is considered. Methods and general results, rather than specific applications, are given. 16 references. (JFP)
Gibbs' theorem for open systems with incomplete statistics
International Nuclear Information System (INIS)
Bagci, G.B.
2009-01-01
Gibbs' theorem, which was originally intended for canonical ensembles with complete statistics, has been generalized to open systems with incomplete statistics. As a result of this generalization, it is shown that the stationary equilibrium distribution of inverse power-law form associated with the incomplete statistics has maximum entropy even for open systems with energy or matter influx. The renormalized entropy definition given in this paper can also serve as a measure of self-organization in open systems described by incomplete statistics.
PRIS-STATISTICS: Power Reactor Information System Statistical Reports. User's Manual
International Nuclear Information System (INIS)
2013-01-01
The IAEA developed the Power Reactor Information System (PRIS)-Statistics application to assist PRIS end users with generating statistical reports from PRIS data. Statistical reports provide an overview of the status, specification and performance results of every nuclear power reactor in the world. This user's manual was prepared to facilitate the use of the PRIS-Statistics application and to provide guidelines and detailed information for each report in the application. Statistical reports support analyses of nuclear power development and strategies, and the evaluation of nuclear power plant performance. The PRIS database can be used for comprehensive trend analyses and benchmarking against best performers and industrial standards.
Automated material accounting statistics system (AMASS)
International Nuclear Information System (INIS)
Messinger, M.; Lumb, R.F.; Tingey, F.H.
1981-01-01
In this paper the modeling and statistical analysis of measurement and process data for nuclear material accountability is readdressed under a more general framework than that provided in the literature. The result of this effort is a computer program (AMASS) which uses the algorithms and equations of this paper to accomplish the analyses indicated. The actual application of the method to process data is emphasized
Statistical mechanics in the context of special relativity. II.
Kaniadakis, G
2005-09-01
The special relativity laws emerge as one-parameter (light speed) generalizations of the corresponding laws of classical physics. These generalizations, imposed by the Lorentz transformations, affect both the definition of the various physical observables (e.g., momentum, energy, etc.), as well as the mathematical apparatus of the theory. Here, following the general lines of [Phys. Rev. E 66, 056125 (2002)], we show that the Lorentz transformations impose also a proper one-parameter generalization of the classical Boltzmann-Gibbs-Shannon entropy. The obtained relativistic entropy permits us to construct a coherent and self-consistent relativistic statistical theory, preserving the main features of the ordinary statistical theory, which is recovered in the classical limit. The predicted distribution function is a one-parameter continuous deformation of the classical Maxwell-Boltzmann distribution and has a simple analytic form, showing power law tails in accordance with the experimental evidence. Furthermore, this statistical mechanics can be obtained as the stationary case of a generalized kinetic theory governed by an evolution equation obeying the H theorem and reproducing the Boltzmann equation of the ordinary kinetics in the classical limit.
Statistical Analysis of Hypercalcaemia Data related to Transferability
DEFF Research Database (Denmark)
Frølich, Anne; Nielsen, Bo Friis
2005-01-01
In this report we describe statistical analysis related to a study of hypercalcaemia carried out in the Copenhagen area in the ten year period from 1984 to 1994. Results from the study have previously been published in a number of papers [3, 4, 5, 6, 7, 8, 9] and in various abstracts and posters...... at conferences during the late eighties and early nineties. In this report we give a more detailed description of many of the analyses and provide some new results, primarily by simultaneous studies of several databases....
Statistical properties of dynamical systems – Simulation and abstract computation
International Nuclear Information System (INIS)
Galatolo, Stefano; Hoyrup, Mathieu; Rojas, Cristóbal
2012-01-01
Highlights: ► A survey on results about computation and computability on the statistical properties of dynamical systems. ► Computability and non-computability results for invariant measures. ► A short proof for the computability of the convergence speed of ergodic averages. ► A kind of “constructive” version of the pointwise ergodic theorem. - Abstract: We survey an area of recent development, relating dynamics to theoretical computer science. We discuss some aspects of the theoretical simulation and computation of the long term behavior of dynamical systems. We will focus on the statistical limiting behavior and invariant measures. We present a general method allowing the algorithmic approximation at any given accuracy of invariant measures. The method can be applied in many interesting cases, as we shall explain. On the other hand, we exhibit some examples where the algorithmic approximation of invariant measures is not possible. We also explain how it is possible to compute the speed of convergence of ergodic averages (when the system is known exactly) and how this entails the computation of arbitrarily good approximations of points of the space having typical statistical behaviour (a sort of constructive version of the pointwise ergodic theorem).
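The constructive flavour of the pointwise ergodic theorem discussed above is easy to see numerically. The rotation example below is ours, not from the paper: time averages of an observable along the orbit of an ergodic map converge to the space average.

```python
import math

def birkhoff_average(f, T, x0, n):
    # Time average (1/n) * sum_{k<n} f(T^k x0); for an ergodic system the
    # pointwise ergodic theorem says this tends to the space average of f.
    x, total = x0, 0.0
    for _ in range(n):
        total += f(x)
        x = T(x)
    return total / n

# Irrational rotation of the circle, x -> x + alpha (mod 1), is ergodic;
# the space average of cos(2*pi*x) over the circle is 0.
alpha = (math.sqrt(5) - 1) / 2
avg = birkhoff_average(lambda x: math.cos(2 * math.pi * x),
                       lambda x: (x + alpha) % 1.0, 0.1, 100000)
```

For this well-behaved rotation the convergence speed is fast and explicitly computable, which is the kind of quantitative statement the surveyed results make precise; the paper also exhibits systems where no such algorithmic approximation exists.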
Hayslett, H T
1991-01-01
Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the
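The normal approximation to the binomial covered in the book can be sketched in a few lines (the continuity correction is the standard textbook refinement):

```python
import math

def normal_approx_binomial_cdf(n, p, k):
    # P(X <= k) for X ~ Binomial(n, p), approximated by the normal
    # distribution with mean n*p and variance n*p*(1-p), using a
    # continuity correction of +0.5.
    mu, sigma = n * p, math.sqrt(n * p * (1 - p))
    z = (k + 0.5 - mu) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

For n = 100 and p = 0.5 the approximation agrees with the exact binomial CDF to about three decimal places, which is why the approximation is adequate for the hypothesis tests the book describes.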
Statistical fault detection in photovoltaic systems
Garoudja, Elyes
2017-05-08
Faults in photovoltaic (PV) systems, which can result in energy loss, system shutdown or even serious safety breaches, are often difficult to avoid. Fault detection in such systems is imperative to improve their reliability, productivity, safety and efficiency. Here, an innovative model-based fault-detection approach for early detection of shading of PV modules and faults on the direct current (DC) side of PV systems is proposed. This approach combines the flexibility and simplicity of a one-diode model with the extended capacity of an exponentially weighted moving average (EWMA) control chart to detect incipient changes in a PV system. The one-diode model, which is easily calibrated due to its limited calibration parameters, is used to predict the healthy PV array's maximum power coordinates of current, voltage and power using measured temperatures and irradiances. Residuals, which capture the difference between the measurements and the predictions of the one-diode model, are generated and used as fault indicators. Then, the EWMA monitoring chart is applied to the uncorrelated residuals obtained from the one-diode model to detect and identify the type of fault. Actual data from the grid-connected PV system installed at the Renewable Energy Development Center, Algeria, are used to assess the performance of the proposed approach. Results show that the proposed approach successfully monitors the DC side of PV systems and detects temporary shading.
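The EWMA monitoring step can be sketched as follows. The smoothing weight λ, limit width L, and a known in-control residual standard deviation are illustrative assumptions; the paper's chart design details may differ:

```python
import math

def ewma_alarms(residuals, sigma, lam=0.2, L=3.0):
    # EWMA control chart: z_t = lam*r_t + (1-lam)*z_{t-1}, alarming when
    # |z_t| exceeds L times the asymptotic EWMA standard deviation
    # sigma * sqrt(lam / (2 - lam)).  sigma is the in-control standard
    # deviation of the residuals (assumed known here).
    limit = L * sigma * math.sqrt(lam / (2.0 - lam))
    z, alarms = 0.0, []
    for t, r in enumerate(residuals):
        z = lam * r + (1.0 - lam) * z
        if abs(z) > limit:
            alarms.append(t)
    return alarms
```

Because the EWMA accumulates evidence over successive samples, a sustained small shift in the residuals (e.g. from partial shading) triggers an alarm a few samples after its onset, while isolated noise does not.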
Statistical mechanics of socio-economic systems with heterogeneous agents
International Nuclear Information System (INIS)
De Martino, Andrea; Marsili, Matteo
2006-01-01
We review the statistical mechanics approach to the study of the emerging collective behaviour of systems of heterogeneous interacting agents. The general framework is presented through examples in such contexts as ecosystem dynamics and traffic modelling. We then focus on the analysis of the optimal properties of large random resource-allocation problems and on Minority Games and related models of speculative trading in financial markets, discussing a number of extensions including multi-asset models, majority games and models with asymmetric information. Finally, we summarize the main conclusions and outline the major open problems and limitations of the approach. (topical review)
Actual problems of accession in relation with library statistics
Directory of Open Access Journals (Sweden)
Tereza Poličnik-Čermelj
2010-01-01
Accession is the process of recording bibliographic units in an accession register. Typically, library materials are acquired by purchase, exchange, gift or legal deposit. However, the COBISS (Cooperative Online Bibliographic System and Services) Holdings software module includes some additional methods of acquisition, which cause problems in gathering and presenting statistical data on local holdings. The article explains how to record holdings of different types of library materials and how to record retrospective collections. It describes the procedures necessary when the codes that define the publication pattern of the holdings are changed, with special attention to integrating resources. Procedures for the accession and circulation of bound materials, supplementary materials, teaching sets, multi-part items, multimedia and collection-level catalogue records are described. Attention is given to errors in recording lost-item replacements and to the problems of circulating certain types of library materials. The author also suggests how to record remote electronic resources. It is recommended to verify holdings data before the accession register is generated. Relevant and credible statistical data on collection development can only be created by librarians with sufficient acquisition and cataloguing skills.
Obtaining Internet Flow Statistics by Volunteer-Based System
DEFF Research Database (Denmark)
Pedersen, Jens Myrup; Bujlow, Tomasz
2012-01-01
In this paper we demonstrate how the Volunteer Based System for Research on the Internet, developed at Aalborg University, can be used for creating statistics of Internet usage. Since the data is collected on individual machines, the statistics can be made on the basis of both individual users......, and average flow durations. The paper is concluded with a discussion on what further statistics can be made, and the further development of the system....
Statistical fault detection in photovoltaic systems
Garoudja, Elyes; Harrou, Fouzi; Sun, Ying; Kara, Kamel; Chouder, Aissa; Silvestre, Santiago
2017-01-01
and efficiency. Here, an innovative model-based fault-detection approach for early detection of shading of PV modules and faults on the direct current (DC) side of PV systems is proposed. This approach combines the flexibility and simplicity of a one-diode model
Statistical analysis of complex systems with nonclassical invariant measures
Fratalocchi, Andrea
2011-01-01
I investigate the problem of finding a statistical description of a complex many-body system whose invariant measure cannot be constructed stemming from classical thermodynamics ensembles. By taking solitons as a reference system and by employing a
Applying intelligent statistical methods on biometric systems
Betschart, Willie
2005-01-01
This master's thesis work was performed at Optimum Biometric Labs, OBL, located in Karlskrona, Sweden. Optimum Biometric Labs performs independent scenario evaluations for companies that develop biometric devices. The company has a product, Optimum preConTM, which is a surveillance and diagnosis tool for biometric systems. This thesis work's objective was to develop a conceptual model and implement it as an additional layer above the biometric layer with intelligence about the biometric users. The l...
Statistical mechanics of complex neural systems and high dimensional data
International Nuclear Information System (INIS)
Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya
2013-01-01
Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks. (paper)
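As a concrete illustration of the message-passing idea reviewed above, here is a sum-product (belief propagation) pass on a small binary chain model, checked against brute-force enumeration. The potentials are invented for the example; the result is exact only because the graph is a tree.

```python
import itertools

# Pairwise chain model: p(x) ∝ Π_i phi_i(x_i) · Π_i psi(x_i, x_{i+1}), x_i ∈ {0,1}.
# Illustrative potentials (assumed for the example, not from the review):
n = 5
phi = [[1.0, 0.5 + 0.1 * i] for i in range(n)]   # unary potentials
psi = [[1.2, 0.8], [0.8, 1.2]]                   # pairwise coupling

def bp_marginals(phi, psi):
    """Sum-product message passing on a chain; exact for tree graphs."""
    n = len(phi)
    fwd = [[1.0, 1.0] for _ in range(n)]   # message into node i from the left
    bwd = [[1.0, 1.0] for _ in range(n)]   # message into node i from the right
    for i in range(1, n):
        fwd[i] = [sum(phi[i - 1][a] * fwd[i - 1][a] * psi[a][b] for a in (0, 1))
                  for b in (0, 1)]
    for i in range(n - 2, -1, -1):
        bwd[i] = [sum(phi[i + 1][b] * bwd[i + 1][b] * psi[a][b] for b in (0, 1))
                  for a in (0, 1)]
    marg = []
    for i in range(n):
        w = [phi[i][x] * fwd[i][x] * bwd[i][x] for x in (0, 1)]
        s = sum(w)
        marg.append([v / s for v in w])
    return marg

def brute_marginals(phi, psi):
    """Exact marginals by summing over all 2^n configurations."""
    n = len(phi)
    marg = [[0.0, 0.0] for _ in range(n)]
    for x in itertools.product((0, 1), repeat=n):
        w = 1.0
        for i in range(n):
            w *= phi[i][x[i]]
        for i in range(n - 1):
            w *= psi[x[i]][x[i + 1]]
        for i in range(n):
            marg[i][x[i]] += w
    return [[a / (a + b), b / (a + b)] for a, b in marg]

bp = bp_marginals(phi, psi)
exact = brute_marginals(phi, psi)
```

On loopy graphs the same message updates give only approximate marginals, which is exactly the regime where the replica and cavity analyses discussed in the review become relevant.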
The Statistical Mechanics of Dilute, Disordered Systems
Blackburn, Roger Michael
Available from UMI in association with The British Library. Requires signed TDF. A graph partitioning problem with variable inter -partition costs is studied by exploiting its mapping on to the Ashkin-Teller spin glass. The cavity method is used to derive the TAP equations and free energy for both extensively connected and dilute systems. Unlike Ising and Potts spin glasses, the self-consistent equation for the distribution of effective fields does not have a solution solely made up of delta functions. Numerical integration is used to find the stable solution, from which the ground state energy is calculated. Simulated annealing is used to test the results. The retrieving activity distribution for networks of boolean functions trained as associative memories for optimal capacity is derived. For infinite networks, outputs are shown to be frozen, in contrast to dilute asymmetric networks trained with the Hebb rule. For finite networks, a steady leaking to the non-retrieving attractor is demonstrated. Simulations of quenched networks are reported which show a departure from this picture: some configurations remain frozen for all time, while others follow cycles of small periods. An estimate of the critical capacity from the simulations is found to be in broad agreement with recent analytical results. The existing theory is extended to include noise on recall, and the behaviour is found to be robust to noise up to order 1/c^2 for networks with connectivity c.
International Nuclear Information System (INIS)
2005-01-01
For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown on the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity in GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes, precautionary stock fees and oil pollution fees
International Nuclear Information System (INIS)
2001-01-01
For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
1999-01-01
For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
Statistical Relations for Yield Degradation in Inertial Confinement Fusion
Woo, K. M.; Betti, R.; Patel, D.; Gopalaswamy, V.
2017-10-01
In inertial confinement fusion (ICF), the yield-over-clean (YOC) is a quantity commonly used to assess the performance of an implosion with respect to the degradation caused by asymmetries. The YOC also determines the Lawson parameter used to identify the onset of ignition and the level of alpha heating in ICF implosions. In this work, we show that the YOC is a unique function of the residual kinetic energy in the compressed shell (with respect to the 1-D case) regardless of the asymmetry spectrum. This result is derived using a simple model of the deceleration phase as well as through an extensive set of 3-D radiation-hydrodynamics simulations using the code DEC3D. The latter has been recently upgraded to include a 3-D spherical moving mesh, the HYPRE solver for 3-D radiation transport and piecewise-parabolic method for robust shock-capturing hydrodynamic simulations. DEC3D is used to build a synthetic single-mode database to study the behavior of yield degradation caused by Rayleigh-Taylor instabilities in the deceleration phase. The relation between YOC and residual kinetic energy is compared with the result in an adiabatic implosion model. The statistical expression of YOC is also applied to the ignition criterion in the presence of multidimensional nonuniformities. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944.
Haldane's statistical interactions and universal properties of anyon systems
International Nuclear Information System (INIS)
Protogenov, A.
1995-03-01
The exclusion principle of fractional statistics proposed by Haldane is applied to systems with internal degrees of freedom. The symmetry of these systems is included in the statistical interaction matrix, which contains the Cartan matrix of Lie algebras. The solutions of the equations for the statistical weights, which coincide with the thermodynamic Bethe ansatz equations, are determined in the high-temperature limit by the squares of q-deformed dimensions of irreducible representations. The entropy and other thermodynamic properties of anyon systems in this limit are completely characterized by the algebraic structure of symmetry in the universal form. (author). 39 refs
Flow Equation Approach to the Statistics of Nonlinear Dynamical Systems
Marston, J. B.; Hastings, M. B.
2005-03-01
The probability distribution function of non-linear dynamical systems is governed by a linear framework that resembles quantum many-body theory, in which stochastic forcing and/or averaging over initial conditions play the role of a non-zero Planck constant. Besides the well-known Fokker-Planck approach, there is a related Hopf functional method [U. Frisch, Turbulence: The Legacy of A. N. Kolmogorov (Cambridge University Press, 1995), chapter 9.5]; in both formalisms, zero modes of linear operators describe the stationary non-equilibrium statistics. To access the statistics, we investigate the method of continuous unitary transformations [S. D. Glazek and K. G. Wilson, Phys. Rev. D 48, 5863 (1993); Phys. Rev. D 49, 4214 (1994)], also known as the flow equation approach [F. Wegner, Ann. Phys. 3, 77 (1994)], suitably generalized to the diagonalization of non-Hermitian matrices. Comparison to the more traditional cumulant expansion method is illustrated with low-dimensional attractors. The treatment of high-dimensional dynamical systems is also discussed.
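The flow equation (continuous unitary transformation) method can be illustrated on a small Hermitian example. The sketch below integrates Wegner's flow dH/dl = [eta, H] with eta = [diag(H), H] by forward Euler; the matrix, step size and step count are arbitrary demonstration choices, and the non-Hermitian generalization discussed in the abstract is not attempted.

```python
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def commutator(A, B):
    AB, BA = matmul(A, B), matmul(B, A)
    n = len(A)
    return [[AB[i][j] - BA[i][j] for j in range(n)] for i in range(n)]

def wegner_flow(H, dl=5e-3, steps=4000):
    """Forward-Euler integration of Wegner's flow dH/dl = [eta, H] with
    eta = [diag(H), H]; for a real symmetric H this drives the
    off-diagonal elements to zero while preserving the spectrum."""
    n = len(H)
    H = [row[:] for row in H]
    for _ in range(steps):
        Hd = [[H[i][j] if i == j else 0.0 for j in range(n)] for i in range(n)]
        eta = commutator(Hd, H)
        dH = commutator(eta, H)
        for i in range(n):
            for j in range(n):
                H[i][j] += dl * dH[i][j]
    return H

# arbitrary real symmetric test matrix (illustrative only)
H0 = [[1.0, 0.4, 0.1],
      [0.4, 2.0, 0.3],
      [0.1, 0.3, 3.0]]
Hf = wegner_flow(H0)
offdiag = max(abs(Hf[i][j]) for i in range(3) for j in range(3) if i != j)
```

After the flow, the diagonal of `Hf` approximates the eigenvalues of `H0`; the off-diagonal element between levels with gap dE decays roughly like exp(-dE^2 * l), which is why Wegner's generator diagonalizes widely separated levels first.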
Statistical mechanics for a system with imperfections: pt. 1
International Nuclear Information System (INIS)
Choh, S.T.; Kahng, W.H.; Um, C.I.
1982-01-01
Statistical mechanics is extended to treat a system where parts of the Hamiltonian are randomly varying. As the starting point of the theory, the statistical correlation among energy levels is neglected, allowing use of the central limit theorem of the probability theory. (Author)
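The central limit theorem invoked above can be illustrated numerically: summing many independent random contributions to an energy level (here uniform perturbations, an assumption made purely for the toy example) yields an approximately Gaussian total whose variance is the sum of the individual variances.

```python
import random

random.seed(1)
n_terms, n_samples = 100, 5000

# total shift of one level: a sum of independent U[-1, 1] perturbations
samples = [sum(random.uniform(-1.0, 1.0) for _ in range(n_terms))
           for _ in range(n_samples)]

mean = sum(samples) / n_samples
var = sum((s - mean) ** 2 for s in samples) / n_samples

# CLT prediction: mean ~ 0 and variance ~ n_terms * Var(U[-1,1]) = n_terms/3,
# with roughly 68% of the mass within one standard deviation
sigma = (n_terms / 3.0) ** 0.5
frac_1sigma = sum(1 for s in samples if abs(s) < sigma) / n_samples
```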
International Nuclear Information System (INIS)
2003-01-01
For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity in GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees on energy products
International Nuclear Information System (INIS)
2004-01-01
For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity in GWh, Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
Glushak, P. A.; Markiv, B. B.; Tokarchuk, M. V.
2018-01-01
We present a generalization of Zubarev's nonequilibrium statistical operator method based on the principle of maximum Renyi entropy. In the framework of this approach, we obtain transport equations for the basic set of parameters of the reduced description of nonequilibrium processes in a classical system of interacting particles using Liouville equations with fractional derivatives. For a classical system of particles in a medium with a fractal structure, we obtain a non-Markovian diffusion equation with fractional spatial derivatives. For a concrete model of the frequency dependence of a memory function, we obtain a generalized Cattaneo-type diffusion equation with the spatial and temporal fractality taken into account. We present a generalization of nonequilibrium thermofield dynamics in Zubarev's nonequilibrium statistical operator method in the framework of Renyi statistics.
National Vital Statistics System (NVSS) - National Cardiovascular Disease Surveillance Data
U.S. Department of Health & Human Services — 2000 forward. NVSS is a secure, web-based data management system that collects and disseminates the Nation's official vital statistics. Indicators from this data...
HEALTH CARE SYSTEM AS AN OBJECT OF STATISTICAL RESEARCH
Directory of Open Access Journals (Sweden)
Pavel A. Smelov
2015-01-01
The article describes the health care system of the Russian Federation as an object of statistical analysis and considers the features of statistical accounting in the Russian health system. The article highlights the key aspects of the health system that characterize it as fully as possible as an object of study.
Statistical physics of complex systems a concise introduction
Bertin, Eric
2016-01-01
This course-tested primer provides graduate students and non-specialists with a basic understanding of the concepts and methods of statistical physics and demonstrates their wide range of applications to interdisciplinary topics in the field of complex system sciences, including selected aspects of theoretical modeling in biology and the social sciences. Generally speaking, the goals of statistical physics may be summarized as follows: on the one hand to study systems composed of a large number of interacting units, and on the other to predict the macroscopic, collective behavior of the system considered from the perspective of the microscopic laws governing the dynamics of the individual entities. These two goals are essentially also shared by what is now called 'complex systems science', and as such, systems studied in the framework of statistical physics may be considered to be among the simplest examples of complex systems – while also offering a rather well developed mathematical treatment. The second ...
Statistical analysis of natural disasters and related losses
Pisarenko, VF
2014-01-01
The study of disaster statistics and disaster occurrence is a complicated interdisciplinary field involving the interplay of new theoretical findings from several scientific fields like mathematics, physics, and computer science. Statistical studies on the mode of occurrence of natural disasters largely rely on fundamental findings in the statistics of rare events, which were derived in the 20th century. With regard to natural disasters, it is not so much the fact that the importance of this problem for mankind was recognized during the last third of the 20th century - the myths one encounters in ancient civilizations show that the problem of disasters has always been recognized - rather, it is the fact that mankind now possesses the necessary theoretical and practical tools to effectively study natural disasters, which in turn supports effective, major practical measures to minimize their impact. All the above factors have resulted in considerable progress in natural disaster research. Substantial accrued ma...
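The rare-event machinery alluded to above can be sketched with a block-maxima example. Under standard extreme-value assumptions, maxima of light-tailed losses follow a Gumbel law; the snippet below draws from that limit law directly (the parameters are invented for the illustration), recovers them by the method of moments, and converts them into a return level.

```python
import random, math

random.seed(3)
mu_true, beta_true = 10.0, 2.0   # illustrative Gumbel location and scale

# draw 5000 block maxima from the Gumbel limit law via the inverse CDF
maxima = [mu_true - beta_true * math.log(-math.log(random.random()))
          for _ in range(5000)]

n = len(maxima)
mean = sum(maxima) / n
sd = (sum((x - mean) ** 2 for x in maxima) / n) ** 0.5

# method of moments for Gumbel: sd = pi*beta/sqrt(6), mean = mu + gamma*beta
beta_hat = sd * math.sqrt(6.0) / math.pi
mu_hat = mean - 0.5772156649 * beta_hat   # gamma = Euler-Mascheroni constant

def return_level(mu, beta, T):
    """Level exceeded on average once every T blocks."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

r100 = return_level(mu_hat, beta_hat, 100.0)
```

In disaster statistics the same logic runs in reverse: annual loss maxima are observed, the extreme-value parameters are estimated, and return levels quantify the magnitude of the "100-year" event.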
Exploration of unique relation among industrial fungi by statistical analysis
Directory of Open Access Journals (Sweden)
Asma Siddique
2012-12-01
This work was carried out to explore the relation among thermophilic cellulolytic fungi, which are of industrial importance. There was no report found about the genetic relationship of the fungi used to produce industrial enzymes, so the aim of the study was to observe the similarity among different cellulolytic fungi at the genetic level, which provides the background to understand the correlation among the cellulase-producing systems of these fungi. Eleven (11) fungi were studied for genetic diversity using Random Amplified Polymorphic DNA (RAPD), a PCR-based molecular marker system. In this regard, twenty universal decamers used for RAPD resulted in 1527 bands observed during comparison of all wild strains. Maximum polymorphism was generated with GLA-07. The average number of bands per 20 primers was 65-72. An interesting feature of the study was the similarity of Humicola insolens with Torula thermophila, more than with the other members of the Humicola family. This genetic pattern affects the physical structure of the fungi. Spores of Torula thermophila are more related to Humicola insolens than to its own family. Similarity between the two was found to be 57.8%, whereas between Humicola lanuginosa (Thermomyces lanuginosus) and Humicola grisea it was 57.3%. Apart from this, similarity between Talaromyces dupontii and Rhizomucor pusillus was 51.5%. The least similarity was found between Rhizomucor pusillus and Humicola grisea (18.7%), and between Chaetomium thermophile and Sporotrichum thermophile (18.3%). A genetic similarity matrix was constructed on the basis of Nei and Li's index.
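Nei and Li's index mentioned in the last sentence is a simple band-sharing formula, S = 2*N_ab / (N_a + N_b), where N_ab is the number of RAPD bands shared by two strains and N_a, N_b are the band counts of each strain. A minimal sketch, with hypothetical band profiles rather than the study's actual data:

```python
def nei_li(bands_a, bands_b):
    """Nei & Li similarity: S = 2*shared / (total_a + total_b).

    bands_a, bands_b: sets of band identifiers scored as present.
    """
    shared = len(bands_a & bands_b)
    return 2.0 * shared / (len(bands_a) + len(bands_b))

# hypothetical band profiles (primer_size labels), not the study's RAPD data
a = {"p1_300bp", "p1_550bp", "p2_420bp", "p3_700bp"}
b = {"p1_300bp", "p2_420bp", "p3_900bp", "p4_150bp"}
similarity = nei_li(a, b)   # 2 shared bands out of 4 + 4 scored
```

Computing this value for every pair of strains yields the genetic similarity matrix from which percentages like 57.8% are read off.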
Optimal Design and Related Areas in Optimization and Statistics
Pronzato, Luc
2009-01-01
This edited volume, dedicated to Henry P. Wynn, reflects his broad range of research interests, focusing in particular on the applications of optimal design theory in optimization and statistics. It covers algorithms for constructing optimal experimental designs, general gradient-type algorithms for convex optimization, majorization and stochastic ordering, algebraic statistics, Bayesian networks and nonlinear regression. Written by leading specialists in the field, each chapter contains a survey of the existing literature along with substantial new material. This work will appeal to both the
Analysis of neutron flux measurement systems using statistical functions
International Nuclear Information System (INIS)
Pontes, Eduardo Winston
1997-01-01
This work develops an integrated analysis for neutron flux measurement systems using the concepts of cumulants and spectra. Its major contribution is the generalization of Campbell's theorem in the form of spectra in the frequency domain, and its application to the analysis of neutron flux measurement systems. Campbell's theorem, in its generalized form, constitutes an important tool, not only to find the nth-order frequency spectra of the radiation detector, but also in the system analysis. The radiation detector, an ionization chamber for neutrons, is modeled for cylindrical, plane and spherical geometries. The detector current pulses are characterized by a vector of random parameters, and the associated charges, statistical moments and frequency spectra of the resulting current are calculated. A computer program is developed for application of the proposed methodology. In order for the analysis to integrate the associated electronics, the signal processor is studied, considering analog and digital configurations. The analysis is unified by developing the concept of equivalent systems that can be used to describe the cumulants and spectra in analog or digital systems. The noise in the signal processor input stage is analysed in terms of second order spectrum. Mathematical expressions are presented for cumulants and spectra up to fourth order, for important cases of filter positioning relative to detector spectra. Unbiased conventional estimators for cumulants are used, and, to evaluate systems precision and response time, expressions are developed for their variances. Finally, some possibilities for obtaining neutron radiation flux as a function of cumulants are discussed. In summary, this work proposes some analysis tools which make possible important decisions in the design of better neutron flux measurement systems. (author)
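Campbell's theorem, central to the analysis above, states that for a Poisson pulse train of rate r with pulse shape f(t), the mean of the summed current is r∫f dt and its variance is r∫f² dt. The sketch below checks these first two cumulants by Monte Carlo for rectangular pulses; all numerical values are arbitrary illustration choices, not detector parameters from the thesis.

```python
import random, bisect

random.seed(7)
rate = 50.0               # pulse rate (assumed, pulses per unit time)
width, height = 0.02, 1.0 # rectangular pulse shape f(t)
T = 2000.0                # observation window

# Poisson arrival times via exponential inter-arrival gaps
t, arrivals = 0.0, []
while t < T:
    t += random.expovariate(rate)
    arrivals.append(t)

def current(ts):
    """I(ts) = sum of pulses covering ts, i.e. arrivals in (ts - width, ts]."""
    lo = bisect.bisect_right(arrivals, ts - width)
    hi = bisect.bisect_right(arrivals, ts)
    return height * (hi - lo)

# sample at instants spaced well beyond the pulse width (≈ independent)
samples = [current(10.0 + 0.197 * i) for i in range(10000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
# Campbell: mean = rate*height*width = 1.0, var = rate*height**2*width = 1.0
```

The generalized form used in the thesis extends this to higher-order cumulants and to frequency spectra, but the mean/variance check already shows the mechanism: cumulants of the current are rate-weighted integrals of powers of the pulse shape.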
Statistical quasi-particle theory for open quantum systems
Zhang, Hou-Dao; Xu, Rui-Xue; Zheng, Xiao; Yan, YiJing
2018-04-01
This paper presents a comprehensive account of the recently developed dissipaton-equation-of-motion (DEOM) theory. This is a statistical quasi-particle theory for quantum dissipative dynamics. It accurately describes the influence of bulk environments with a small number of quasi-particles, the dissipatons. The novel dissipaton algebra then follows, which readily bridges the Schrödinger equation to the DEOM theory. As a fundamental theory of quantum mechanics in open systems, DEOM characterizes both the stationary and dynamic properties of system-and-bath interferences. It treats not only the quantum dissipative systems of primary interest, but also the hybrid environment dynamics that could be experimentally measurable. Examples are the linear or nonlinear Fano interferences and the Herzberg-Teller vibronic couplings in optical spectroscopies. This review covers the DEOM construction, the underlying dissipaton algebra and theorems, the physical meanings of dynamical variables, the possible identifications of dissipatons, and some recent advancements in efficient DEOM evaluations on various problems. The relations of the present theory to other nonperturbative methods are also critically presented.
Discrete changes of current statistics in periodically driven stochastic systems
International Nuclear Information System (INIS)
Chernyak, Vladimir Y; Sinitsyn, N A
2010-01-01
We demonstrate that the counting statistics of currents in periodically driven ergodic stochastic systems can show sharp changes of some of its properties in response to continuous changes of the driving protocol. To describe this effect, we introduce a new topological phase factor in the evolution of the moment generating function which is akin to the topological geometric phase in the evolution of a periodically driven quantum mechanical system with time-reversal symmetry. This phase leads to the prediction of a sign change for the difference of the probabilities to find even and odd numbers of particles transferred in a stochastic system in response to cyclic evolution of control parameters. The driving protocols that lead to this sign change should enclose specific degeneracy points in the space of control parameters. The relation between the topology of the paths in the control parameter space and the sign changes can be described in terms of the first Stiefel–Whitney class of topological invariants. (letter)
Data analysis using the Gnu R system for statistical computation
Energy Technology Data Exchange (ETDEWEB)
Simone, James; /Fermilab
2011-07-01
R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.
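As a flavor of the chi-square minimization fits mentioned in the report (there applied to lattice n-pt correlation functions, in R), here is a small self-contained sketch in Python. The single-exponential correlator model, its parameter values, and the uncorrelated 2% errors are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical two-point correlator C(t) = A * exp(-m * t) with 2% relative
# errors, standing in for lattice correlation-function data.
A_true, m_true = 2.5, 0.4
t = np.arange(1, 16)
err = 0.02 * A_true * np.exp(-m_true * t)
data = A_true * np.exp(-m_true * t) + rng.normal(0.0, err)

def chi2(A, m):
    model = A * np.exp(-m * t)
    return np.sum(((data - model) / err) ** 2)

# The amplitude enters the model linearly, so profile it out analytically
# at each m and scan a one-dimensional grid over the mass.
best = None
for m in np.linspace(0.1, 0.8, 701):
    f = np.exp(-m * t)
    A = np.sum(data * f / err**2) / np.sum(f**2 / err**2)
    c2 = chi2(A, m)
    if best is None or c2 < best[0]:
        best = (c2, A, m)
chi2_min, A_fit, m_fit = best
```

A real lattice fit would additionally use the full covariance matrix of the correlator in the chi-square, since correlator values at different t are strongly correlated.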
A Concise Introduction to the Statistical Physics of Complex Systems
Bertin, Eric
2012-01-01
This concise primer (based on lectures given at summer schools on complex systems and on a masters degree course in complex systems modeling) will provide graduate students and newcomers to the field with the basic knowledge of the concepts and methods of statistical physics and its potential for application to interdisciplinary topics. Indeed, in recent years, statistical physics has begun to attract the interest of a broad community of researchers in the field of complex system sciences, ranging from biology to the social sciences, economics and computer science. More generally, a growing number of graduate students and researchers feel the need to learn some basic concepts and questions originating in other disciplines without necessarily having to master all of the corresponding technicalities and jargon. Generally speaking, the goals of statistical physics may be summarized as follows: on the one hand to study systems composed of a large number of interacting ‘entities’, and on the other to predict...
Statistical properties of chaotic dynamical systems which exhibit strange attractors
International Nuclear Information System (INIS)
Jensen, R.V.; Oberman, C.R.
1981-07-01
A path integral method is developed for the calculation of the statistical properties of turbulent dynamical systems. The method is applicable to conservative systems which exhibit a transition to stochasticity as well as dissipative systems which exhibit strange attractors. A specific dissipative mapping is considered in detail which models the dynamics of a Brownian particle in a wave field with a broad frequency spectrum. Results are presented for the low order statistical moments for three turbulent regimes which exhibit strange attractors corresponding to strong, intermediate, and weak collisional damping
Call for civil registration and vital statistics systems experts | IDRC ...
International Development Research Centre (IDRC) Digital Library (Canada)
2017-06-30
Jun 30, 2017 ... This is a call for experts in civil registration, information technology, public health, statistics, law, ... digitization (including IT systems design, and system integration) ... socio-cultural and anthropological research; and public health. ...
Multivariate statistical pattern recognition system for reactor noise analysis
International Nuclear Information System (INIS)
Gonzalez, R.C.; Howington, L.C.; Sides, W.H. Jr.; Kryter, R.C.
1976-01-01
A multivariate statistical pattern recognition system for reactor noise analysis was developed. The basis of the system is a transformation for decoupling correlated variables and algorithms for inferring probability density functions. The system is adaptable to a variety of statistical properties of the data, and it has learning, tracking, and updating capabilities. System design emphasizes control of the false-alarm rate. The ability of the system to learn normal patterns of reactor behavior and to recognize deviations from these patterns was evaluated by experiments at the ORNL High-Flux Isotope Reactor (HFIR). Power perturbations of less than 0.1 percent of the mean value in selected frequency ranges were detected by the system
Multivariate statistical pattern recognition system for reactor noise analysis
International Nuclear Information System (INIS)
Gonzalez, R.C.; Howington, L.C.; Sides, W.H. Jr.; Kryter, R.C.
1975-01-01
A multivariate statistical pattern recognition system for reactor noise analysis was developed. The basis of the system is a transformation for decoupling correlated variables and algorithms for inferring probability density functions. The system is adaptable to a variety of statistical properties of the data, and it has learning, tracking, and updating capabilities. System design emphasizes control of the false-alarm rate. The ability of the system to learn normal patterns of reactor behavior and to recognize deviations from these patterns was evaluated by experiments at the ORNL High-Flux Isotope Reactor (HFIR). Power perturbations of less than 0.1 percent of the mean value in selected frequency ranges were detected by the system. 19 references
Statistical inference for noisy nonlinear ecological dynamic systems.
Wood, Simon N
2010-08-26
Chaotic ecological dynamic systems defy conventional statistical analysis. Systems with near-chaotic dynamics are little better. Such systems are almost invariably driven by endogenous dynamic processes plus demographic and environmental process noise, and are only observable with error. Their sensitivity to history means that minute changes in the driving noise realization, or the system parameters, will cause drastic changes in the system trajectory. This sensitivity is inherited and amplified by the joint probability density of the observable data and the process noise, rendering it useless as the basis for obtaining measures of statistical fit. Because the joint density is the basis for the fit measures used by all conventional statistical methods, this is a major theoretical shortcoming. The inability to make well-founded statistical inferences about biological dynamic models in the chaotic and near-chaotic regimes, other than on an ad hoc basis, leaves dynamic theory without the methods of quantitative validation that are essential tools in the rest of biological science. Here I show that this impasse can be resolved in a simple and general manner, using a method that requires only the ability to simulate the observed data on a system from the dynamic model about which inferences are required. The raw data series are reduced to phase-insensitive summary statistics, quantifying local dynamic structure and the distribution of observations. Simulation is used to obtain the mean and the covariance matrix of the statistics, given model parameters, allowing the construction of a 'synthetic likelihood' that assesses model fit. This likelihood can be explored using a straightforward Markov chain Monte Carlo sampler, but one further post-processing step returns pure likelihood-based inference. I apply the method to establish the dynamic nature of the fluctuations in Nicholson's classic blowfly experiments.
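A stripped-down sketch of the synthetic-likelihood recipe described above, using a noisy Ricker map with Poisson observation error as the dynamic model. The parameter values, the particular summary statistics, and the one-dimensional grid search below are illustrative assumptions, not Wood's exact setup.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate(log_r, n=500, reps=1, sigma=0.3, phi=10.0):
    # Noisy Ricker map: N_{t+1} = r N_t exp(-N_t + e_t), e_t ~ N(0, sigma^2),
    # observed through Poisson counts y_t ~ Poisson(phi * N_t).
    N = np.ones(reps)
    y = np.empty((reps, n))
    for t in range(n):
        N = np.exp(log_r) * N * np.exp(-N + sigma * rng.normal(size=reps))
        y[:, t] = rng.poisson(phi * N)
    return y

def summaries(y):
    # Phase-insensitive summaries of each series (one row per replicate).
    m = y.mean(axis=1)
    s = y.std(axis=1)
    d = y - m[:, None]
    acf1 = (d[:, 1:] * d[:, :-1]).mean(axis=1) / (s**2 + 1e-12)
    zeros = (y == 0).mean(axis=1)
    return np.column_stack([m, s, acf1, zeros])

y_obs = simulate(3.8)                 # "observed" data at the true parameter
s_obs = summaries(y_obs)[0]

def synthetic_loglik(log_r, reps=200):
    S = summaries(simulate(log_r, reps=reps))
    mu = S.mean(axis=0)
    cov = np.cov(S.T) + 1e-8 * np.eye(S.shape[1])   # ridge guards singularity
    diff = s_obs - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (diff @ np.linalg.solve(cov, diff) + logdet)

grid = np.linspace(3.0, 4.6, 17)
lls = np.array([synthetic_loglik(g) for g in grid])
best = grid[np.argmax(lls)]
```

In practice the synthetic likelihood would be explored with a Markov chain Monte Carlo sampler over all model parameters rather than a grid over one.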
Phase flow and statistical structure of Galton-board systems
International Nuclear Information System (INIS)
Lue, A.; Brenner, H.
1993-01-01
Galton boards, found in museum exhibits devoted to science and technology, are often used to demonstrate visually the ubiquity of so-called ''laws of probability'' via an experimental realization of normal distributions. A detailed theoretical study of Galton-board phase-space dynamics and statistical behavior is presented. The study is based on a simple inelastic-collision model employing a particle falling through a spatially periodic lattice of rigid, convex scatterers. We show that such systems exhibit indeterminate behavior through the presence of strange attractors or strange repellers in phase space; nevertheless, we also show that these systems exhibit regular and predictable behavior under specific circumstances. Phase-space strange attractors, periodic attractors, and strange repellers are present in numerical simulations, confirming results anticipated from geometric analysis. The system's geometry (dictated by lattice geometry and density as well as the direction of gravity) is observed to play a dominant role in stability, phase-flow topology, and statistical observations. Smale horseshoes appear to exist in the low-lattice-density limit and may exist in other regimes. These horseshoes are generated by homoclinic orbits whose existence is dictated by system characteristics. The horseshoes lead directly to deterministic chaos in the system. Strong evidence exists for ergodicity in all attractors. Phase-space complexities are manifested at all observed levels, particularly statistical ones. Consequently, statistical observations are critically dependent upon system details. Under well-defined circumstances, these observations display behavior which does not constitute a realization of the ''laws of probability.''
Statistical Mechanics of Disordered Systems: A Mathematical Perspective
Bovier, Anton
2006-06-01
Our mathematical understanding of the statistical mechanics of disordered systems is going through a period of stunning progress. This self-contained book is a graduate-level introduction for mathematicians and for physicists interested in the mathematical foundations of the field, and can be used as a textbook for a two-semester course on mathematical statistical mechanics. It assumes only basic knowledge of classical physics and, on the mathematics side, a good working knowledge of graduate-level probability theory. The book starts with a concise introduction to statistical mechanics, proceeds to disordered lattice spin systems, and concludes with a presentation of the latest developments in the mathematical understanding of mean-field spin glass models. In particular, recent progress towards a rigorous understanding of the replica symmetry-breaking solutions of the Sherrington-Kirkpatrick spin glass models, due to Guerra, Aizenman-Sims-Starr and Talagrand, is reviewed in some detail. Comprehensive introduction to an active and fascinating area of research Clear exposition that builds to the state of the art in the mathematics of spin glasses Written by a well-known and active researcher in the field
Experience and Sentence Processing: Statistical Learning and Relative Clause Comprehension
Wells, Justine B.; Christiansen, Morten H.; Race, David S.; Acheson, Daniel J.; MacDonald, Maryellen C.
2009-01-01
Many explanations of the difficulties associated with interpreting object relative clauses appeal to the demands that object relatives make on working memory. MacDonald and Christiansen [MacDonald, M. C., & Christiansen, M. H. (2002). "Reassessing working memory: Comment on Just and Carpenter (1992) and Waters and Caplan (1996)." "Psychological…
Spectral statistics of chaotic many-body systems
International Nuclear Information System (INIS)
Dubertrand, Rémy; Müller, Sebastian
2016-01-01
We derive a trace formula that expresses the level density of chaotic many-body systems as a smooth term plus a sum over contributions associated to solutions of the nonlinear Schrödinger (or Gross–Pitaevski) equation. Our formula applies to bosonic systems with discretised positions, such as the Bose–Hubbard model, in the semiclassical limit as well as in the limit where the number of particles is taken to infinity. We use the trace formula to investigate the spectral statistics of these systems, by studying interference between solutions of the nonlinear Schrödinger equation. We show that in the limits taken the statistics of fully chaotic many-particle systems becomes universal and agrees with predictions from the Wigner–Dyson ensembles of random matrix theory. The conditions for Wigner–Dyson statistics involve a gap in the spectrum of the Frobenius–Perron operator, leaving the possibility of different statistics for systems with weaker chaotic properties. (paper)
Effective control of complex turbulent dynamical systems through statistical functionals.
Majda, Andrew J; Qi, Di
2017-05-30
Turbulent dynamical systems characterized by both a high-dimensional phase space and a large number of instabilities are ubiquitous among complex systems in science and engineering, including climate, material, and neural science. Control of these complex systems is a grand challenge, for example, in mitigating the effects of climate change or safe design of technology with fully developed shear turbulence. Control of flows in the transition to turbulence, where there is a small dimension of instabilities about a basic mean state, is an important and successful discipline. In complex turbulent dynamical systems, it is impossible to track and control the large dimension of instabilities, which strongly interact and exchange energy, and new control strategies are needed. The goal of this paper is to propose an effective statistical control strategy for complex turbulent dynamical systems based on a recent statistical energy principle and statistical linear response theory. We illustrate the potential practical efficiency and verify this effective statistical control strategy on the 40D Lorenz 1996 model in forcing regimes with various types of fully turbulent dynamics with nearly one-half of the phase space unstable.
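The 40-dimensional Lorenz 1996 model used as the verification test bed above is straightforward to simulate. This minimal sketch (forcing values and integration settings are arbitrary choices, not the paper's) illustrates the kind of statistical functional, here the climatological mean energy, whose response to forcing such a statistical control strategy targets.

```python
import numpy as np

def l96_rhs(u, F):
    # Right-hand side of the Lorenz 1996 model:
    # du_i/dt = (u_{i+1} - u_{i-2}) u_{i-1} - u_i + F, with periodic indices.
    return (np.roll(u, -1) - np.roll(u, 2)) * np.roll(u, 1) - u + F

def mean_energy(F, T=50.0, dt=0.01, J=40, seed=3):
    rng = np.random.default_rng(seed)
    u = F + 0.01 * rng.normal(size=J)      # small perturbation off u = F
    energies = []
    for k in range(int(T / dt)):
        # classical fourth-order Runge-Kutta step
        k1 = l96_rhs(u, F)
        k2 = l96_rhs(u + 0.5 * dt * k1, F)
        k3 = l96_rhs(u + 0.5 * dt * k2, F)
        k4 = l96_rhs(u + dt * k3, F)
        u = u + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        if k * dt > 10.0:                  # discard the transient
            energies.append(0.5 * np.dot(u, u))
    return np.mean(energies)

E8, E16 = mean_energy(8.0), mean_energy(16.0)
```

The climatological mean energy grows with the forcing F, which is the kind of mean statistical response (rather than any individual unstable trajectory) that the proposed control strategy acts on.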
Statistical fluctuations and correlations in hadronic equilibrium systems
Energy Technology Data Exchange (ETDEWEB)
Hauer, Michael
2010-06-17
This thesis is dedicated to the study of fluctuation and correlation observables of hadronic equilibrium systems. The statistical hadronization model of high energy physics, in its ideal, i.e. non-interacting, gas approximation is investigated in different ensemble formulations. The hypothesis of thermal and chemical equilibrium in high energy interaction is tested against qualitative and quantitative predictions. (orig.)
Statistical fluctuations and correlations in hadronic equilibrium systems
International Nuclear Information System (INIS)
Hauer, Michael
2010-01-01
This thesis is dedicated to the study of fluctuation and correlation observables of hadronic equilibrium systems. The statistical hadronization model of high energy physics, in its ideal, i.e. non-interacting, gas approximation is investigated in different ensemble formulations. The hypothesis of thermal and chemical equilibrium in high energy interaction is tested against qualitative and quantitative predictions. (orig.)
Statistics of resonances in one-dimensional continuous systems
Indian Academy of Sciences (India)
Pramana – journal of physics, Vol. 73, No. 3, September 2009, pp. 565–572. Statistics of resonances in one-dimensional continuous systems. Joshua Feinberg, Physics Department, University of Haifa at Oranim, Tivon 36006, Israel.
Prototyping a Distributed Information Retrieval System That Uses Statistical Ranking.
Harman, Donna; And Others
1991-01-01
Built using a distributed architecture, this prototype distributed information retrieval system uses statistical ranking techniques to provide better service to the end user. Distributed architecture was shown to be a feasible alternative to centralized or CD-ROM information retrieval, and user testing of the ranking methodology showed both…
Statistics of Shared Components in Complex Component Systems
Mazzolini, Andrea; Gherardi, Marco; Caselle, Michele; Cosentino Lagomarsino, Marco; Osella, Matteo
2018-04-01
Many complex systems are modular. Such systems can be represented as "component systems," i.e., sets of elementary components, such as LEGO bricks in LEGO sets. The bricks found in a LEGO set reflect a target architecture, which can be built following a set-specific list of instructions. In other component systems, instead, the underlying functional design and constraints are not obvious a priori, and their detection is often a challenge of both scientific and practical importance, requiring a clear understanding of component statistics. Importantly, some quantitative invariants appear to be common to many component systems, most notably a common broad distribution of component abundances, which often resembles the well-known Zipf's law. Such "laws" affect in a general and nontrivial way the component statistics, potentially hindering the identification of system-specific functional constraints or generative processes. Here, we specifically focus on the statistics of shared components, i.e., the distribution of the number of components shared by different system realizations, such as the common bricks found in different LEGO sets. To account for the effects of component heterogeneity, we consider a simple null model, which builds system realizations by random draws from a universe of possible components. Under general assumptions on abundance heterogeneity, we provide analytical estimates of component occurrence, which quantify exhaustively the statistics of shared components. Surprisingly, this simple null model can positively explain important features of empirical component-occurrence distributions obtained from large-scale data on bacterial genomes, LEGO sets, and book chapters. Specific architectural features and functional constraints can be detected from occurrence patterns as deviations from these null predictions, as we show for the illustrative case of the "core" genome in bacteria.
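The null model described above can be sketched in a few lines. The Zipf-like abundances, universe size, and draws-with-replacement scheme below are simplifying assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simplified null model: a universe of N component types with Zipf-like
# abundances p_j ~ 1/j; each "realization" (e.g. a LEGO set) is M random draws.
N, M, R = 1000, 200, 1000
p = 1.0 / np.arange(1, N + 1)
p /= p.sum()

# Boolean occurrence matrix: does component j appear in realization r?
draws = rng.choice(N, size=(R, M), p=p)
present = np.zeros((R, N), dtype=bool)
present[np.repeat(np.arange(R), M), draws.ravel()] = True

# Occurrence: fraction of realizations containing component j, versus the
# analytic estimate 1 - (1 - p_j)^M for M draws with replacement.
occ_empirical = present.mean(axis=0)
occ_theory = 1.0 - (1.0 - p) ** M
```

Deviations of empirical occurrence curves from such null predictions are what would signal system-specific functional constraints, as the paper illustrates for the bacterial "core" genome.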
Statistics of Shared Components in Complex Component Systems
Directory of Open Access Journals (Sweden)
Andrea Mazzolini
2018-04-01
Full Text Available Many complex systems are modular. Such systems can be represented as “component systems,” i.e., sets of elementary components, such as LEGO bricks in LEGO sets. The bricks found in a LEGO set reflect a target architecture, which can be built following a set-specific list of instructions. In other component systems, instead, the underlying functional design and constraints are not obvious a priori, and their detection is often a challenge of both scientific and practical importance, requiring a clear understanding of component statistics. Importantly, some quantitative invariants appear to be common to many component systems, most notably a common broad distribution of component abundances, which often resembles the well-known Zipf’s law. Such “laws” affect in a general and nontrivial way the component statistics, potentially hindering the identification of system-specific functional constraints or generative processes. Here, we specifically focus on the statistics of shared components, i.e., the distribution of the number of components shared by different system realizations, such as the common bricks found in different LEGO sets. To account for the effects of component heterogeneity, we consider a simple null model, which builds system realizations by random draws from a universe of possible components. Under general assumptions on abundance heterogeneity, we provide analytical estimates of component occurrence, which quantify exhaustively the statistics of shared components. Surprisingly, this simple null model can positively explain important features of empirical component-occurrence distributions obtained from large-scale data on bacterial genomes, LEGO sets, and book chapters. Specific architectural features and functional constraints can be detected from occurrence patterns as deviations from these null predictions, as we show for the illustrative case of the “core” genome in bacteria.
Method of statistical estimation of temperature minimums in binary systems
International Nuclear Information System (INIS)
Mireev, V.A.; Safonov, V.V.
1985-01-01
On the basis of statistical processing of literature data, a technique is developed for evaluating temperature minima on liquidus curves in binary systems, with common-ion chloride systems taken as an example. The systems are formed by 48 chlorides of 45 chemical elements, including alkali, alkaline-earth, rare-earth and transition metals as well as Cd, In and Th. It is shown that the calculation error in determining minimum melting points depends on the topology of the phase diagram. A comparison of calculated and experimental data for several previously unstudied systems is given
China’s Statist Energy Relations with Turkmenistan and Kazakhstan
Amineh, M.P.; van Driel, M.
2018-01-01
During the last decade, China’s diplomatic, economic, security and multilateral relations with Resource-Rich Countries (RRCs) in general, and with Central Asia and the Caspian Region (CACR) in particular, created a regional web of complementarity connecting states and societies. This trend reflects
Statistical analysis of the effects of relative humidity and temperature ...
African Journals Online (AJOL)
Meteorological data from the Department of Satellite Application Facility on Climate Monitoring (CMSAF), DWD Germany have been used to study and investigate the effect of relative humidity and temperature on refractivity in twenty-six locations grouped into four climatic regions aloft Nigeria (Coastal, Guinea savannah, ...
Statistical thermodynamics understanding the properties of macroscopic systems
Fai, Lukong Cornelius
2012-01-01
Contents: Basic Principles of Statistical Physics; Microscopic and Macroscopic Description of States; Basic Postulates; Gibbs Ergodic Assumption; Gibbsian Ensembles; Experimental Basis of Statistical Mechanics; Definition of Expectation Values; Ergodic Principle and Expectation Values; Properties of Distribution Function; Relative Fluctuation of an Additive Macroscopic Parameter; Liouville Theorem; Gibbs Microcanonical Ensemble; Microcanonical Distribution in Quantum Mechanics; Density Matrix; Density Matrix in Energy Representation; Entropy; Thermodynamic Functions; Temperature; Adiabatic Processes; Pressure; Thermodynamic Identity; Laws of Th...
Statistical analysis of the Ft. Calhoun reactor coolant pump system
International Nuclear Information System (INIS)
Heising, Carolyn D.
1998-01-01
In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve plant safety. As a demonstration of such an approach to plant maintenance and control, a specific system is analyzed: the reactor coolant pumps (RCPs) of the Ft. Calhoun nuclear power plant. This research uses capability analysis, Shewhart X-bar, R-charts, canonical correlation methods, and design of experiments to analyze the process for the state of statistical control. The results obtained show that six out of ten parameters are under control specification limits and four parameters are not in the state of statistical control. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with ample time to respond to possible emergency situations and thus improve plant safety and reliability. (author)
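A minimal sketch of the Shewhart X-bar and R chart computation used in such studies; the data, the subgroup size, and the parameter being monitored are hypothetical, not taken from the Ft. Calhoun analysis.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical in-control measurements of one RCP parameter (e.g. a
# vibration amplitude), sampled in 30 subgroups of n = 5 readings.
n_subgroups, n = 30, 5
data = rng.normal(10.0, 0.5, size=(n_subgroups, n))

xbar = data.mean(axis=1)                  # subgroup means
R = data.max(axis=1) - data.min(axis=1)   # subgroup ranges
xbarbar, Rbar = xbar.mean(), R.mean()

# Standard Shewhart chart constants for subgroup size n = 5
A2, D3, D4 = 0.577, 0.0, 2.114

ucl_x, lcl_x = xbarbar + A2 * Rbar, xbarbar - A2 * Rbar   # X-bar chart limits
ucl_r, lcl_r = D4 * Rbar, D3 * Rbar                       # R chart limits

out_of_control = (xbar > ucl_x) | (xbar < lcl_x)
```

A sustained shift in the monitored parameter pushes subgroup means outside [lcl_x, ucl_x] long before a fixed alarm setpoint would trip, which is the early-warning behavior such studies report.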
Statistical analysis of the Ft. Calhoun reactor coolant pump system
International Nuclear Information System (INIS)
Patel, Bimal; Heising, C.D.
1997-01-01
In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve plant safety. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps (RCPs) of the Ft. Calhoun nuclear power plant. This research uses capability analysis, Shewhart X-bar, R charts, canonical correlation methods, and design of experiments to analyze the process for the state of statistical control. The results obtained show that six out of ten parameters are under control specification limits and four parameters are not in the state of statistical control. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with ample time to respond to possible emergency situations and thus improve plant safety and reliability. (Author)
Statistical Relational Learning and Script Induction for Textual Inference
2017-12-01
... compensate for parser errors. We replace deterministic conjunction by an average combiner, which encodes causal independence. Our framework was the ... sentence similarity (STS) and sentence paraphrasing, but not Textual Entailment, where deeper inferences are required. As the formula for conjunction ... When combined, our algorithm learns to rely on systems that not just agree on an output but also on the provenance of this output in conjunction with the ...
Automated Material Accounting Statistics System at Rockwell Hanford Operations
International Nuclear Information System (INIS)
Eggers, R.F.; Giese, E.W.; Kodman, G.P.
1986-01-01
The Automated Material Accounting Statistics System (AMASS) was developed under the sponsorship of the U.S. Nuclear Regulatory Commission. The AMASS was developed when it was realized that classical methods of error propagation, based only on measured quantities, did not properly control false alarm rate and that errors other than measurement errors affect inventory differences. The classical assumptions that (1) the mean value of the inventory difference (ID) for a particular nuclear material processing facility is zero, and (2) the variance of the inventory difference is due only to errors in measured quantities are overly simplistic. The AMASS provides a valuable statistical tool for estimating the true mean value and variance of the ID data produced by a particular material balance area. In addition it provides statistical methods of testing both individual and cumulative sums of IDs, taking into account the estimated mean value and total observed variance of the ID
Statistical physics of networks, information and complex systems
Energy Technology Data Exchange (ETDEWEB)
Ecke, Robert E [Los Alamos National Laboratory
2009-01-01
In this project we explore the mathematical methods and concepts of statistical physics that are finding abundant applications across the scientific and technological spectrum from soft condensed matter systems and bio-informatics to economic and social systems. Our approach exploits the considerable similarity of concepts between statistical physics and computer science, allowing for a powerful multi-disciplinary approach that draws its strength from cross-fertilization and multiple interactions of researchers with different backgrounds. The work on this project takes advantage of the newly appreciated connection between computer science and statistics and addresses important problems in data storage, decoding, optimization, the information processing properties of the brain, the interface between quantum and classical information science, the verification of large software programs, modeling of complex systems including disease epidemiology, resource distribution issues, and the nature of highly fluctuating complex systems. Common themes that the project has been emphasizing are (i) neural computation, (ii) network theory and its applications, and (iii) a statistical physics approach to information theory. The project's efforts focus on the general problem of optimization and variational techniques, algorithm development and information theoretic approaches to quantum systems. These efforts are responsible for fruitful collaborations and the nucleation of science efforts that span multiple divisions such as EES, CCS, D, T, ISR and P. This project supports the DOE mission in Energy Security and Nuclear Non-Proliferation by developing novel information science tools for communication, sensing, and interacting complex networks such as the internet or energy distribution system. The work also supports programs in Threat Reduction and Homeland Security.
On some problems related to feasibility of statistical mechanics
International Nuclear Information System (INIS)
Bogolyubov, N.N.; Kazaryan, A.R.; Kurbatov, A.M.
1981-01-01
A principal generalization of the kinetic Boltzmann equation method is developed in the theory of electrons moving in a crystal and interacting both with lattice vibrations and with an external electric field. A scheme for constructing the kinetic equation from the exact evolution equation is discussed in the framework of the averaged lattice pulse approximation. A kinetic description of the helical polaron motion is performed on the basis of the Boltzmann equation obtained. The above approach permits a generalization of the Thornber-Feynman formulae applied in the transport theory in crystals. The Green function equations are derived for electron-phonon systems
Nonequilibrium statistical mechanics and stochastic thermodynamics of small systems
International Nuclear Information System (INIS)
Tu Zhanchun
2014-01-01
Thermodynamics is an old subject. The research objects in conventional thermodynamics are macroscopic systems with a huge number of particles. In the last 30 years, thermodynamics of small systems has become a frontier topic in physics. Here we introduce nonequilibrium statistical mechanics and stochastic thermodynamics of small systems. As a case study, we construct a Carnot-like cycle of a stochastic heat engine with a single particle controlled by a time-dependent harmonic potential. We find that the efficiency at maximum power is 1 − √(Tc/Th), where Tc and Th are the temperatures of the cold bath and the hot bath, respectively. (author)
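The quoted efficiency at maximum power (the Curzon-Ahlborn form) is easy to check numerically against the Carnot bound; the bath temperatures below are arbitrary illustrative values.

```python
import math

# Efficiency at maximum power quoted in the abstract, eta* = 1 - sqrt(Tc/Th),
# compared with the Carnot efficiency eta_C = 1 - Tc/Th.
def eta_max_power(tc, th):
    return 1.0 - math.sqrt(tc / th)

def eta_carnot(tc, th):
    return 1.0 - tc / th

tc, th = 300.0, 600.0    # illustrative cold/hot bath temperatures (K)
eta_star, eta_c = eta_max_power(tc, th), eta_carnot(tc, th)
```

For any 0 < Tc < Th, 1 − √(Tc/Th) lies strictly between zero and the Carnot efficiency 1 − Tc/Th, since √x > x on (0, 1).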
FRAMES Software System: Linking to the Statistical Package R
Energy Technology Data Exchange (ETDEWEB)
Castleton, Karl J.; Whelan, Gene; Hoopes, Bonnie L.
2006-12-11
This document provides requirements, design, data-file specifications, test plan, and Quality Assurance/Quality Control protocol for the linkage between the statistical package R and the Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) Versions 1.x and 2.0. The requirements identify the attributes of the system. The design describes how the system will be structured to meet those requirements. The specification presents the specific modifications to FRAMES to meet the requirements and design. The test plan confirms that the basic functionality listed in the requirements (black box testing) actually functions as designed, and QA/QC confirms that the software meets the client’s needs.
Development of modelling algorithm of technological systems by statistical tests
Shemshura, E. A.; Otrokov, A. V.; Chernyh, V. G.
2018-03-01
The paper tackles the problem of economic assessment of design efficiency for various technological systems at the stage of their operation. The modelling algorithm of a technological system, built on statistical tests and taking the reliability index into account, allows estimating the level of machinery technical excellence and evaluating design reliability against performance. The economic feasibility of its application is determined on the basis of the service quality of the technological system, with further forecasting of the volumes and range of spare parts supply.
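The abstract does not give the algorithm itself, but the general idea of assessing a reliability index by statistical tests can be sketched as a Monte Carlo estimate of the survival probability. The exponential lifetime model, the failure rate and the mission time below are all assumptions for illustration, not taken from the paper:

```python
import math
import random

def simulate_reliability(failure_rate, mission_time, n_trials, seed=1):
    """Monte Carlo estimate of R(t) = P(lifetime > t), assuming an
    exponentially distributed time to failure (a hypothetical model)."""
    rng = random.Random(seed)
    survived = sum(1 for _ in range(n_trials)
                   if rng.expovariate(failure_rate) > mission_time)
    return survived / n_trials

rate, t = 1e-4, 2000.0       # assumed failure rate (1/h) and mission time (h)
estimate = simulate_reliability(rate, t, n_trials=20000)
exact = math.exp(-rate * t)  # closed form for the exponential model
print(estimate, exact)
```

With enough trials the statistical estimate converges to the closed-form value, which is the basic mechanism behind reliability assessment by statistical tests.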
Quantum statistical Monte Carlo methods and applications to spin systems
International Nuclear Information System (INIS)
Suzuki, M.
1986-01-01
A short review is given concerning the quantum statistical Monte Carlo method based on the equivalence theorem that d-dimensional quantum systems are mapped onto (d+1)-dimensional classical systems. The convergence property of this approximate transformation is discussed in detail. Some applications of this general approach to quantum spin systems are reviewed. A new Monte Carlo method, ''thermo field Monte Carlo method,'' is presented, which is an extension of the projection Monte Carlo method at zero temperature to that at finite temperatures
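The quantum-classical mapping reviewed above rests on the Trotter decomposition, exp(A+B) ≈ (exp(A/n) exp(B/n))^n for non-commuting A and B. A self-contained numerical sketch of its convergence, using toy 2x2 matrices (the specific matrices and the Taylor-series exponential are illustrative choices, not the paper's):

```python
def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_add(a, b):
    return [[a[i][j] + b[i][j] for j in range(2)] for i in range(2)]

def mat_scale(a, s):
    return [[a[i][j] * s for j in range(2)] for i in range(2)]

def expm(a, terms=40):
    """exp(a) by Taylor series; adequate for the small 2x2 matrices used here."""
    result = [[1.0, 0.0], [0.0, 1.0]]
    term = [[1.0, 0.0], [0.0, 1.0]]
    for k in range(1, terms):
        term = mat_scale(mat_mul(term, a), 1.0 / k)
        result = mat_add(result, term)
    return result

def trotter(a, b, n):
    """(exp(a/n) exp(b/n))^n, the first-order Trotter approximation to exp(a+b)."""
    step = mat_mul(expm(mat_scale(a, 1.0 / n)), expm(mat_scale(b, 1.0 / n)))
    out = [[1.0, 0.0], [0.0, 1.0]]
    for _ in range(n):
        out = mat_mul(out, step)
    return out

def max_err(a, b):
    return max(abs(a[i][j] - b[i][j]) for i in range(2) for j in range(2))

A = [[0.0, 0.3], [0.3, 0.0]]   # 0.3 * sigma_x (toy non-commuting pair)
B = [[0.3, 0.0], [0.0, -0.3]]  # 0.3 * sigma_z
exact = expm(mat_add(A, B))
errors = {n: max_err(trotter(A, B, n), exact) for n in (2, 4, 8, 16)}
print(errors)  # the error shrinks roughly like 1/n
```

The steady shrinkage of the error with the Trotter number n is the convergence property the review discusses; in the quantum Monte Carlo context n plays the role of the extra (d+1)-th dimension.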
Phase difference statistics related to sensor and forest parameters
Lopes, A.; Mougin, E.; Beaudoin, A.; Goze, S.; Nezry, E.; Touzi, R.; Karam, M. A.; Fung, A. K.
1992-01-01
The information content of ordinary synthetic aperture radar (SAR) data is principally contained in the radiometric polarization channels, i.e., the four Ihh, Ivv, Ihv and Ivh backscattered intensities. In the case of clutter, polarimetric information is given by the four complex degrees of coherence, from which the mean polarization phase differences (PPD), correlation coefficients or degrees of polarization can be deduced. For radiometric features, the polarimetric parameters are corrupted by multiplicative speckle noise and by some sensor effects. The PPD distribution is related to the sensor, speckle and terrain properties. Experimental results are given for the variation of the terrain hh/vv mean phase difference and magnitude of the degree of coherence observed on bare soil and on different pine forest stands.
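The complex degree of coherence and the mean polarization phase difference mentioned above can be estimated directly from samples of the hh and vv channels. A minimal sketch on synthetic speckle data; the coherence magnitude 0.8 and the 0.5 rad phase offset are assumed values for illustration, not measurements from the paper:

```python
import cmath
import math
import random

def complex_gauss(rng):
    """Unit-variance circular complex Gaussian sample (simple speckle model)."""
    return complex(rng.gauss(0, math.sqrt(0.5)), rng.gauss(0, math.sqrt(0.5)))

def degree_of_coherence(hh, vv):
    """Sample estimate of gamma = <hh vv*> / sqrt(<|hh|^2> <|vv|^2>)."""
    num = sum(h * v.conjugate() for h, v in zip(hh, vv))
    den = math.sqrt(sum(abs(h) ** 2 for h in hh) * sum(abs(v) ** 2 for v in vv))
    return num / den

# Synthetic hh/vv channels with assumed coherence 0.8 and hh/vv phase 0.5 rad.
rng = random.Random(42)
rho, phi = 0.8, 0.5
hh, vv = [], []
for _ in range(20000):
    s = complex_gauss(rng)
    n = complex_gauss(rng)
    hh.append(s)
    vv.append((rho * s + math.sqrt(1 - rho ** 2) * n) * cmath.exp(-1j * phi))

gamma = degree_of_coherence(hh, vv)
print(abs(gamma), cmath.phase(gamma))  # close to 0.8 and +0.5
```

The magnitude of gamma recovers the assumed correlation and its argument recovers the mean phase difference, which is the quantity the abstract relates to sensor, speckle and terrain properties.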
Energy-level statistics and time relaxation in quantum systems
International Nuclear Information System (INIS)
Gruver, J.L.; Cerdeira, H.A.; Aliaga, J.; Mello, P.A.; Proto, A.N.
1997-05-01
We study a quantum-mechanical system, prepared, at t = 0, in a model state, that subsequently decays into a sea of other states whose energy levels form a discrete spectrum with given statistical properties. An important quantity is the survival probability P(t), defined as the probability, at time t, to find the system in the original model state. Our main purpose is to analyze the influence of the discreteness and statistical properties of the spectrum on the behavior of P(t). Since P(t) itself is a statistical quantity, we restrict our attention to its ensemble average ⟨P(t)⟩, which is calculated analytically using random-matrix techniques, within certain approximations discussed in the text. We find, for ⟨P(t)⟩, an exponential decay, followed by a revival, governed by the two-point structure of the statistical spectrum, thus giving a nonzero asymptotic value for large t's. The analytic result compares well with a number of computer simulations, over a time range discussed in the text. (author). 17 refs, 1 fig
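The survival probability described above is straightforward to compute numerically for a toy version of the model. The sketch below uses a picket-fence sea spectrum and a constant coupling, both illustrative assumptions (the paper draws spectra from random-matrix ensembles and averages over them):

```python
import numpy as np

# Model state |0> at energy 0, coupled to a "sea" of N equally spaced levels.
N, spacing, v = 400, 0.05, 0.05   # all parameter values are illustrative
energies = spacing * (np.arange(N) - N / 2)
H = np.zeros((N + 1, N + 1))
H[1:, 1:] = np.diag(energies)
H[0, 1:] = H[1:, 0] = v           # constant coupling to every sea state

vals, vecs = np.linalg.eigh(H)
c0 = vecs[0, :]                   # overlaps of the eigenstates with the model state
times = np.linspace(0.0, 40.0, 200)
# Survival probability P(t) = |<0| exp(-iHt) |0>|^2
P = np.abs((c0 ** 2 * np.exp(-1j * np.outer(times, vals))).sum(axis=1)) ** 2
print(P[0], P[20], P[-1])         # starts at 1, then decays
```

On this time window the decay is roughly exponential with the golden-rule width 2π v²/spacing; revivals governed by the discreteness of the spectrum appear only on much longer (Heisenberg-time) scales.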
[Stomatological problems related to pregnancy. A statistical study].
Masoni, S; Panattoni, E; Rolla, P; Rossi, M; Giuca, M R; Gabriele, M
1991-12-01
Pregnancy is related to particular dental issues, such as the increased incidence of diseases (gingivitis, caries, epulis), the fluoride supplementation, and the limits of diagnostics and therapy. Moreover, the mysterious halo surrounding pregnancy often makes the dentist uneasy. In order to objectively evaluate the implications of pregnancy in dentistry, we distributed a form to 100 pregnant women. The results of the form showed that 53 of them had gingival bleeding, 22 had toothache, 19 had caries but that just 12 of them had gone to the dentist because of dental troubles while 54 had not gone at all. Among the pluri-gravidae, all the women with dental diseases in their previous pregnancies had them again in their current pregnancy but nonetheless only some had undergone a dental check-up. The dentists did not show any uneasiness, as they performed tooth extractions in 5 women, endodontics in 2 women and fillings in 11 women. Just 4 out of 100 women had taken a fluoride supplementation. We deem advisable a stronger collaboration between physician, gynecologist and dentist in order to resolve specialist problems and to make pregnant women more aware of the need for dental follow-ups and fluoride supplementations.
Applied systems ecology: models, data, and statistical methods
Energy Technology Data Exchange (ETDEWEB)
Eberhardt, L L
1976-01-01
In this report, systems ecology is largely equated to mathematical or computer simulation modelling. The need for models in ecology stems from the necessity to have an integrative device for the diversity of ecological data, much of which is observational, rather than experimental, as well as from the present lack of a theoretical structure for ecology. Different objectives in applied studies require specialized methods. The best predictive devices may be regression equations, often non-linear in form, extracted from much more detailed models. A variety of statistical aspects of modelling, including sampling, are discussed. Several aspects of population dynamics and food-chain kinetics are described, and it is suggested that the two presently separated approaches should be combined into a single theoretical framework. It is concluded that future efforts in systems ecology should emphasize actual data and statistical methods, as well as modelling.
Research and Development on Food Nutrition Statistical Analysis Software System
Du Li; Ke Yun
2013-01-01
Designing and developing a set of food nutrition component statistical analysis software can realize the automation of nutrition calculation, improve the nutrition professional's working efficiency and achieve the informatization of nutrition propaganda and education. In the software development process, the software engineering method and database technology are used to calculate the human daily nutritional intake and the intelligent system is used to evaluate the user's hea...
DEFF Research Database (Denmark)
Steffen, J.H.; Ford, E.B.; Rowe, J.F.
2012-01-01
We analyze the deviations of transit times from a linear ephemeris for the Kepler Objects of Interest (KOI) through quarter six of science data. We conduct two statistical tests for all KOIs and a related statistical test for all pairs of KOIs in multi-transiting systems. These tests identify...... several systems which show potentially interesting transit timing variations (TTVs). Strong TTV systems have been valuable for the confirmation of planets and their mass measurements. Many of the systems identified in this study should prove fruitful for detailed TTV studies....
International Nuclear Information System (INIS)
Steffen, Jason H.; Ford, Eric B.; Rowe, Jason F.; Borucki, William J.; Bryson, Steve; Caldwell, Douglas A.; Jenkins, Jon M.; Koch, David G.; Sanderfer, Dwight T.; Seader, Shawn; Twicken, Joseph D.; Fabrycky, Daniel C.; Holman, Matthew J.; Welsh, William F.; Batalha, Natalie M.; Ciardi, David R.; Kjeldsen, Hans; Prša, Andrej
2012-01-01
We analyze the deviations of transit times from a linear ephemeris for the Kepler Objects of Interest (KOI) through quarter six of science data. We conduct two statistical tests for all KOIs and a related statistical test for all pairs of KOIs in multi-transiting systems. These tests identify several systems which show potentially interesting transit timing variations (TTVs). Strong TTV systems have been valuable for the confirmation of planets and their mass measurements. Many of the systems identified in this study should prove fruitful for detailed TTV studies.
The system for statistical analysis of logistic information
Directory of Open Access Journals (Sweden)
Khayrullin Rustam Zinnatullovich
2015-05-01
Full Text Available A current problem for managers in logistics and trading companies is improving operational business performance and developing logistics support for sales. The development of logistics sales supposes the development and implementation of a set of works on the existing warehouse facilities, including both a detailed description of the work performed and the timing of its implementation. Logistics engineering of a warehouse complex includes such tasks as: determining the number and types of technological zones, calculating the required number of loading-unloading places, developing storage structures, developing pre-sales preparation zones, developing specifications of storage types, selecting loading-unloading equipment, detailed planning of the warehouse logistics system, creating architectural-planning decisions, selecting information-processing equipment, etc. The currently used ERP and WMS systems do not allow solving the full list of logistics engineering problems. In this regard, the development of specialized software products taking into account the specifics of warehouse logistics, and the subsequent integration of this software with ERP and WMS systems, is a current task. In this paper we suggest a system for statistical analysis of logistics information, designed to meet the challenges of logistics engineering and planning. The proposed specialized software is designed to improve the efficiency of the operating business and the development of logistics support of sales. The system is based on the methods of statistical data processing, the methods of assessment and prediction of logistics performance, the methods for determining and calculating the data required for registration, storage and processing of metal products, as well as the methods for planning the reconstruction and development
A statistical view of uncertainty in expert systems
International Nuclear Information System (INIS)
Spiegelhalter, D.J.
1986-01-01
The constructors of expert systems interpret ''uncertainty'' in a wide sense and have suggested a variety of qualitative and quantitative techniques for handling the concept, such as the theory of ''endorsements,'' fuzzy reasoning, and belief functions. After a brief selective review of procedures that do not adhere to the laws of probability, it is argued that a subjectivist Bayesian view of uncertainty, if flexibly applied, can provide many of the features demanded by expert systems. This claim is illustrated with a number of examples of probabilistic reasoning, and a connection drawn with statistical work on the graphical representation of multivariate distributions. Possible areas of future research are outlined
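The subjectivist Bayesian view argued for above reduces, in its simplest form, to propagating probabilities with Bayes' rule. A minimal sketch of one such update; the diagnostic-style numbers are hypothetical, chosen only to illustrate the mechanics:

```python
def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Single Bayes-rule update: P(H | E) from P(H) and the two likelihoods."""
    joint_h = prior * p_evidence_given_h
    joint_not_h = (1 - prior) * p_evidence_given_not_h
    return joint_h / (joint_h + joint_not_h)

# Hypothetical numbers: a 1% prior, a test with 90% sensitivity and a
# 5% false-positive rate.
p = posterior(prior=0.01, p_evidence_given_h=0.90, p_evidence_given_not_h=0.05)
print(round(p, 4))  # 0.1538: a positive finding raises the 1% prior to ~15%
```

Chaining such updates along the edges of a graph is the connection to the graphical representation of multivariate distributions mentioned in the abstract.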
Nonequilibrium statistical mechanics of systems with long-range interactions
Energy Technology Data Exchange (ETDEWEB)
Levin, Yan, E-mail: levin@if.ufrgs.br; Pakter, Renato, E-mail: pakter@if.ufrgs.br; Rizzato, Felipe B., E-mail: rizzato@if.ufrgs.br; Teles, Tarcísio N., E-mail: tarcisio.teles@fi.infn.it; Benetti, Fernanda P.C., E-mail: fbenetti@if.ufrgs.br
2014-02-01
Systems with long-range (LR) forces, for which the interaction potential decays with the interparticle distance with an exponent smaller than the dimensionality of the embedding space, remain an outstanding challenge to statistical physics. The internal energy of such systems lacks extensivity and additivity. Although the extensivity can be restored by scaling the interaction potential with the number of particles, the non-additivity still remains. Lack of additivity leads to inequivalence of statistical ensembles. Before relaxing to thermodynamic equilibrium, isolated systems with LR forces become trapped in out-of-equilibrium quasi-stationary states (qSSs), the lifetime of which diverges with the number of particles. Therefore, in the thermodynamic limit LR systems will not relax to equilibrium. The qSSs are attained through the process of collisionless relaxation. Density oscillations lead to particle–wave interactions and excitation of parametric resonances. The resonant particles escape from the main cluster to form a tenuous halo. Simultaneously, this cools down the core of the distribution and dampens out the oscillations. When all the oscillations die out the ergodicity is broken and a qSS is born. In this report, we will review a theory which allows us to quantitatively predict the particle distribution in the qSS. The theory is applied to various LR interacting systems, ranging from plasmas to self-gravitating clusters and kinetic spin models.
Energy Technology Data Exchange (ETDEWEB)
Kančev, Duško, E-mail: dusko.kancev@ec.europa.eu [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Duchac, Alexander; Zerger, Benoit [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Maqua, Michael [Gesellschaft für Anlagen-und-Reaktorsicherheit (GRS) mbH, Schwetnergasse 1, 50667 Köln (Germany); Wattrelos, Didier [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP 17 - 92262 Fontenay-aux-Roses Cedex (France)
2014-07-01
Highlights: • Analysis of operating experience related to emergency diesel generator events at NPPs. • Four abundant operating experience databases screened. • Delineating important insights and conclusions based on the operating experience. - Abstract: This paper studies the operating experience related to emergency diesel generator (EDG) events at nuclear power plants collected over the past 20 years. Events related to EDG failures and/or unavailability, as well as to all the supporting equipment, are the focus of the analysis. The selected operating experience was analyzed in detail in order to identify the types of failures and the attributes that contributed to them, identify failure modes, potential or real, discuss risk relevance, summarize important lessons learned, and provide recommendations. The study in this paper is closely tied to the statistical analysis of the operating experience. For the purpose of this study, EDG failure is defined as a failure of an EDG to function on demand (i.e. failure to start, failure to run) or during testing, or an unavailability of an EDG, except unavailability due to regular maintenance. The Gesellschaft für Anlagen und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases, as well as the operating experience contained in the IAEA/NEA International Reporting System for Operating Experience and the U.S. Licensee Event Reports, were screened. The screening methodology applied for each of the four databases is presented. Further on, analyses aimed at delineating the causes, root causes, contributing factors and consequences are performed. A statistical analysis was performed related to the chronology of events, types of failures, the operational circumstances of detection of the failure and the affected components/subsystems. The conclusions and results of the statistical analysis are discussed. The main findings concerning the testing
International Nuclear Information System (INIS)
Kančev, Duško; Duchac, Alexander; Zerger, Benoit; Maqua, Michael; Wattrelos, Didier
2014-01-01
Highlights: • Analysis of operating experience related to emergency diesel generator events at NPPs. • Four abundant operating experience databases screened. • Delineating important insights and conclusions based on the operating experience. - Abstract: This paper studies the operating experience related to emergency diesel generator (EDG) events at nuclear power plants collected over the past 20 years. Events related to EDG failures and/or unavailability, as well as to all the supporting equipment, are the focus of the analysis. The selected operating experience was analyzed in detail in order to identify the types of failures and the attributes that contributed to them, identify failure modes, potential or real, discuss risk relevance, summarize important lessons learned, and provide recommendations. The study in this paper is closely tied to the statistical analysis of the operating experience. For the purpose of this study, EDG failure is defined as a failure of an EDG to function on demand (i.e. failure to start, failure to run) or during testing, or an unavailability of an EDG, except unavailability due to regular maintenance. The Gesellschaft für Anlagen und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases, as well as the operating experience contained in the IAEA/NEA International Reporting System for Operating Experience and the U.S. Licensee Event Reports, were screened. The screening methodology applied for each of the four databases is presented. Further on, analyses aimed at delineating the causes, root causes, contributing factors and consequences are performed. A statistical analysis was performed related to the chronology of events, types of failures, the operational circumstances of detection of the failure and the affected components/subsystems. The conclusions and results of the statistical analysis are discussed. The main findings concerning the testing
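One elementary building block of the kind of statistical analysis described above is estimating a per-demand failure probability from event counts. A simplified sketch with a normal-approximation upper bound; the counts are hypothetical and the paper's actual statistical treatment is more detailed:

```python
import math

def failure_on_demand_estimate(failures, demands, z=1.96):
    """Point estimate and approximate 95% upper bound for the per-demand
    failure probability, using the normal approximation to the binomial."""
    p_hat = failures / demands
    half_width = z * math.sqrt(p_hat * (1 - p_hat) / demands)
    return p_hat, p_hat + half_width

# Hypothetical counts: 8 EDG start failures observed in 1000 start demands.
p_hat, upper = failure_on_demand_estimate(8, 1000)
print(p_hat, upper)
```

For rare events like EDG failures, exact (e.g. Clopper-Pearson) intervals are usually preferred in practice; the normal approximation is used here only to keep the sketch self-contained.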
A Review of Modeling Bioelectrochemical Systems: Engineering and Statistical Aspects
Directory of Open Access Journals (Sweden)
Shuai Luo
2016-02-01
Full Text Available Bioelectrochemical systems (BES) are promising technologies to convert organic compounds in wastewater to electrical energy through a series of complex physical-chemical, biological and electrochemical processes. Representative BES such as microbial fuel cells (MFCs) have been studied and advanced for energy recovery. Substantial experimental and modeling efforts have been made to investigate the processes involved in electricity generation toward the improvement of BES performance for practical applications. However, there are many parameters that can potentially affect these processes, making the optimization of system performance hard to achieve. Mathematical models, including engineering models and statistical models, are powerful tools to help understand the interactions among the parameters in BES and to optimize BES configuration and operation. This review paper introduces and discusses recent developments in BES modeling from engineering and statistical aspects, including analysis of model structure, description of application cases and sensitivity analysis of various parameters. It is expected to serve as a compass for integrating engineering and statistical modeling strategies to improve model accuracy for BES development.
Statistics in a Trilinear Interacting Stokes-Antistokes Boson System
Tänzler, W.; Schütte, F.-J.
The statistics of a system of four boson modes is treated with simultaneous Stokes-Antistokes interaction taking place. The time evolution is calculated in a fully quantum manner but in the short-time approximation. Mean photon numbers and second-order correlations are calculated. Antibunching can be found both in the laser mode and in the system of Stokes and Antistokes modes.
Applying incomplete statistics to nonextensive systems with different q indices
International Nuclear Information System (INIS)
Nivanen, L.; Pezeril, M.; Wang, Q.A.; Mehaute, A. Le
2005-01-01
The nonextensive statistics based on the q-entropy S_q = -Σ_{i=1..v} (p_i - p_i^q)/(1-q) has so far been applied to systems in which the q value is uniformly distributed. For systems containing different q's, the applicability of the theory is still a matter of investigation. The difficulty is that the class of systems to which the theory can be applied is actually limited by the usual nonadditivity rule of entropy, which is no longer valid when the systems contain a non-uniform distribution of q values. In this paper, within the framework of the so-called incomplete information theory, we propose a more general nonadditivity rule of entropy prescribed by the zeroth law of thermodynamics. This new nonadditivity generalizes the usual one in a simple way and can be proved to lead uniquely to the q-entropy
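For context, the "usual nonadditivity rule" the abstract refers to is the standard composition law for the q-entropy of two independent subsystems with a single common q; the generalized rule proposed in the paper is not reproduced here:

```latex
% Usual nonadditivity rule for the q-entropy of two independent
% subsystems A and B (k is Boltzmann's constant). It reduces to ordinary
% additivity as q -> 1, and it is this rule that breaks down when A and B
% carry different q values:
S_q(A+B) \;=\; S_q(A) + S_q(B) + \frac{(1-q)}{k}\, S_q(A)\, S_q(B)
```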
Statistical reliability assessment of software-based systems
International Nuclear Information System (INIS)
Korhonen, J.; Pulkkinen, U.; Haapanen, P.
1997-01-01
Plant vendors nowadays propose software-based systems even for the most critical safety functions. The reliability estimation of safety-critical software-based systems is difficult since the conventional modeling techniques do not necessarily apply to the analysis of these systems, and the quantification seems to be impossible. Due to the lack of operational experience and due to the nature of software faults, the conventional reliability estimation methods cannot be applied. New methods are therefore needed for the safety assessment of software-based systems. In the research project Programmable automation systems in nuclear power plants (OHA), financed jointly by the Finnish Centre for Radiation and Nuclear Safety (STUK), the Ministry of Trade and Industry and the Technical Research Centre of Finland (VTT), various safety assessment methods and tools for software-based systems are developed and evaluated. This volume in the OHA report series deals with the statistical reliability assessment of software-based systems on the basis of dynamic test results and qualitative evidence from the system design process. Other reports to be published later in the OHA report series will handle the diversity requirements in safety-critical software-based systems, generation of test data from operational profiles and handling of programmable automation in plant PSA studies. (orig.) (25 refs.)
Management system of occupational diseases in Korea: statistics, report and monitoring system.
Rhee, Kyung Yong; Choe, Seong Weon
2010-12-01
The management system of occupational diseases in Korea can be assessed from the perspective of a surveillance system. Workers' compensation insurance reports are used to produce official statistics on occupational diseases in Korea. National working conditions surveys are used to monitor the magnitude of work-related symptoms and signs in the labor force. A health examination program was introduced to detect occupational diseases through both selective and mass screening programs. The Working Environment Measurement Institution assesses workers' exposure to hazards in the workplace. Government regulates that the employer should do health examinations and working conditions measurement through contracted private agencies and following the Occupational Safety and Health Act. It is hoped that these institutions may be able to effectively detect and monitor occupational diseases and hazards in the workplace. In view of this, the occupational management system in Korea is well designed, except for the national survey system. In the future, national surveys for detection of hazards and ill-health outcomes in workers should be developed. The existing surveillance system for occupational disease can be improved by providing more refined information through statistical analysis of surveillance data.
Indian Academy of Sciences (India)
D. Bhattacharyya
2018-02-09
Feb 9, 2018 ... SMBH than that of the nearby stars. The relation of the SMBHs to their host galaxies can be seen from the strong correlation between the mass of the SMBH and the velocity dispersion σ of the stars in the galaxy. This is somewhat surprising because the stars are too far from the SMBH for the velocity dispersion to ...
Statistical analysis of complex systems with nonclassical invariant measures
Fratalocchi, Andrea
2011-02-28
I investigate the problem of finding a statistical description of a complex many-body system whose invariant measure cannot be constructed stemming from classical thermodynamics ensembles. By taking solitons as a reference system and by employing a general formalism based on the Ablowitz-Kaup-Newell-Segur scheme, I demonstrate how to build an invariant measure and, within a one-dimensional phase space, how to develop a suitable thermodynamics. A detailed example is provided with a universal model of wave propagation, with reference to a transparent potential sustaining gray solitons. The system shows a rich thermodynamic scenario, with a free-energy landscape supporting phase transitions and controllable emergent properties. I finally discuss the origin of such behavior, trying to identify common denominators in the area of complex dynamics.
Statistical characterization of discrete conservative systems: The web map
Ruiz, Guiomar; Tirnakli, Ugur; Borges, Ernesto P.; Tsallis, Constantino
2017-10-01
We numerically study the two-dimensional, area-preserving web map. When the map is governed by ergodic behavior it is, as expected, correctly described by Boltzmann-Gibbs statistics, based on the additive entropic functional S_BG[p(x)] = -k ∫ dx p(x) ln p(x). In contrast, possible ergodicity breakdown and transitory sticky dynamical behavior drag the map into the realm of generalized q-statistics, based on the nonadditive entropic functional S_q[p(x)] = k (1 - ∫ dx [p(x)]^q)/(q - 1) (q ∈ R; S_1 = S_BG). We statistically describe the system (probability distribution of the sum of successive iterates, sensitivity to the initial condition, and entropy production per unit time) for typical values of the parameter that controls the ergodicity of the map. For small (large) values of the external parameter K, we observe q-Gaussian distributions with q = 1.935… (Gaussian distributions), as for the standard map. In contrast, for intermediate values of K we observe a different scenario, due to the fractal structure of the trajectories embedded in the chaotic sea. Long-standing non-Gaussian distributions are characterized in terms of the kurtosis and the box-counting dimension of the chaotic sea.
Non-statistical behavior of coupled optical systems
International Nuclear Information System (INIS)
Perez, G.; Pando Lambruschini, C.; Sinha, S.; Cerdeira, H.A.
1991-10-01
We study globally coupled chaotic maps modeling an optical system, and find clear evidence of non-statistical behavior: the mean square deviation (MSD) of the mean field saturates with respect to increase in the number of elements coupled, after a critical value, and its distribution is clearly non-Gaussian. We also find that the power spectrum of the mean field displays well defined peaks, indicating a subtle coherence among different elements, even in the ''turbulent'' phase. This system is a physically realistic model that may be experimentally realizable. It is also a higher dimensional example (as each individual element is given by a complex map). Its study confirms that the phenomena observed in a wide class of coupled one-dimensional maps are present here as well. This gives more evidence to believe that such non-statistical behavior is probably generic in globally coupled systems. We also investigate the influence of parametric fluctuations on the MSD. (author). 10 refs, 7 figs, 1 tab
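The mean-field mean square deviation (MSD) discussed above is easy to measure in the simplest globally coupled setting. The sketch below uses real logistic maps rather than the complex maps of the paper, and its parameter values are illustrative assumptions; a purely statistical mean field would give an MSD scaling like 1/N:

```python
import random

def mean_field_msd(n_maps, a=1.99, eps=0.1, steps=2000, transient=1000, seed=3):
    """Iterate N globally coupled logistic maps
        x_i(t+1) = (1 - eps) * f(x_i(t)) + eps * h(t),   f(x) = 1 - a x^2,
    with mean field h(t) = (1/N) sum_j f(x_j(t)), and return the mean
    square deviation of h over the post-transient iterates."""
    rng = random.Random(seed)
    x = [rng.uniform(-1, 1) for _ in range(n_maps)]
    fields = []
    for t in range(steps):
        fx = [1 - a * xi * xi for xi in x]
        h = sum(fx) / n_maps
        x = [(1 - eps) * f + eps * h for f in fx]
        if t >= transient:
            fields.append(h)
    mean = sum(fields) / len(fields)
    return sum((h - mean) ** 2 for h in fields) / len(fields)

msd = {n: mean_field_msd(n) for n in (16, 256, 4096)}
print(msd)
```

Comparing the measured MSD across N against the 1/N reference line is the basic diagnostic for the non-statistical (saturating) behavior reported in the abstract.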
Level and width statistics for a decaying chaotic system
International Nuclear Information System (INIS)
Mizutori, S.; Zelevinsky, V.G.
1993-01-01
The random matrix ensemble of discretized effective non-hermitian hamiltonians is used for studying local correlations and fluctuations of energies and widths in a quantum system where intrinsic levels are coupled to the continuum via a common decay channel. With the use of analytical estimates and numerical simulations, generic properties of statistical observables are obtained for the regimes of weak and strong continuum coupling as well as for the transitional region. Typical signals of the transition (width collectivization, disappearance of level repulsion at small spacings and violation of uniformity along the energy axis) are discussed quantitatively. (orig.)
Foundations of Complex Systems Nonlinear Dynamics, Statistical Physics, and Prediction
Nicolis, Gregoire
2007-01-01
Complexity is emerging as a post-Newtonian paradigm for approaching a large body of phenomena of concern at the crossroads of physical, engineering, environmental, life and human sciences from a unifying point of view. This book outlines the foundations of modern complexity research as it arose from the cross-fertilization of ideas and tools from nonlinear science, statistical physics and numerical simulation. It is shown how these developments lead to an understanding, both qualitative and quantitative, of the complex systems encountered in nature and in everyday experience and, conversely, h
Reliability assessment for safety critical systems by statistical random testing
International Nuclear Information System (INIS)
Mills, S.E.
1995-11-01
In this report we present an overview of reliability assessment for software and focus on some basic aspects of assessing reliability for safety critical systems by statistical random testing. We also discuss possible deviations from some essential assumptions on which the general methodology is based. These deviations appear quite likely in practical applications. We present and discuss possible remedies and adjustments and then undertake applying this methodology to a portion of the SDS1 software. We also indicate shortcomings of the methodology and possible avenues to follow to address these problems. (author). 128 refs., 11 tabs., 31 figs
Reliability assessment for safety critical systems by statistical random testing
Energy Technology Data Exchange (ETDEWEB)
Mills, S E [Carleton Univ., Ottawa, ON (Canada). Statistical Consulting Centre
1995-11-01
In this report we present an overview of reliability assessment for software and focus on some basic aspects of assessing reliability for safety critical systems by statistical random testing. We also discuss possible deviations from some essential assumptions on which the general methodology is based. These deviations appear quite likely in practical applications. We present and discuss possible remedies and adjustments and then undertake applying this methodology to a portion of the SDS1 software. We also indicate shortcomings of the methodology and possible avenues to follow to address these problems. (author). 128 refs., 11 tabs., 31 figs.
Directory of Open Access Journals (Sweden)
D. R. Novog
2008-01-01
Full Text Available This paper provides a novel and robust methodology for determination of nuclear reactor trip setpoints which accounts for uncertainties in input parameters and models, as well as accounting for the variations in operating states that periodically occur. Further it demonstrates that in performing best estimate and uncertainty calculations, it is critical to consider the impact of all fuel channels and instrumentation in the integration of these uncertainties in setpoint determination. This methodology is based on the concept of a true trip setpoint, which is the reactor setpoint that would be required in an ideal situation where all key inputs and plant responses were known, such that during the accident sequence a reactor shutdown will occur which just prevents the acceptance criteria from being exceeded. Since this true value cannot be established, the uncertainties in plant simulations and plant measurements as well as operational variations which lead to time changes in the true value of initial conditions must be considered. This paper presents the general concept used to determine the actuation setpoints considering the uncertainties and changes in initial conditions, and allowing for safety systems instrumentation redundancy. The results demonstrate unique statistical behavior with respect to both fuel and instrumentation uncertainties which has not previously been investigated.
Becchi, Carlo Maria
2007-01-01
These notes are designed as a textbook for a course on modern physics theory for undergraduate students. The purpose is to provide a rigorous and self-contained presentation of the simplest theoretical framework using elementary mathematical tools. A number of examples of relevant applications and an appropriate list of exercises and answered questions are also given. The first part is devoted to special relativity, concerning in particular space-time relativity and relativistic kinematics. The second part deals with Schroedinger's formulation of quantum mechanics. The presentation concerns mainly one-dimensional problems, in particular the tunnel effect, discrete energy levels and band spectra. The third part concerns the application of Gibbs statistical methods to quantum systems, in particular to Bose and Fermi gases.
International Nuclear Information System (INIS)
Heising, C.D.; Grenzebach, W.S.
1990-01-01
In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve safety. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps of the St. Lucie Unit 2 nuclear power plant in Florida. A 30-day history of the four pumps prior to a plant shutdown caused by pump failure and a related fire within the containment was analyzed. Statistical quality control charts of recorded variables were constructed for each pump, and these charts were shown to go out of statistical control many days before the plant trip. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators.
Spectral statistics in chiral-orthogonal disordered systems
International Nuclear Information System (INIS)
Evangelou, S N; Katsanos, D E
2003-01-01
We describe the singularities in the averaged density of states and the corresponding statistics of the energy levels in two- (2D) and three-dimensional (3D) chiral symmetric and time-reversal invariant disordered systems, realized in bipartite lattices with real off-diagonal disorder. For off-diagonal disorder of zero mean, we obtain a singular density of states in 2D which becomes much less pronounced in 3D, while the level statistics can be described by a semi-Poisson distribution with mostly critical fractal states in 2D and by the Wigner surmise with mostly delocalized states in 3D. For logarithmic off-diagonal disorder of large strength, we find behaviour indistinguishable from ordinary disorder with strong localization in any dimension, but in addition one-dimensional 1/|E| Dyson-like asymptotic spectral singularities. The off-diagonal disorder is also shown to enhance the propagation of two interacting particles, similarly to systems with diagonal disorder. Although disordered models with chiral symmetry differ from non-chiral ones due to the presence of spectral singularities, both share the same qualitative localization properties except at the chiral symmetry point E=0, which is critical.
Characterization of a Compton suppression system and the applicability of Poisson statistics
International Nuclear Information System (INIS)
Nicholson, G.; Landsberger, S.; Welch, L.
2008-01-01
The Compton suppression system (CSS) has been thoroughly characterized at the University of Texas' Nuclear Engineering Teaching Laboratory (NETL). Effects of dead-time, sample displacement from primary detector, and primary energy detector position relative to the active shield detector have been measured and analyzed. Also, the applicability of Poisson counting statistics to Compton suppression spectroscopy has been evaluated. (author)
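One common way to evaluate the applicability of Poisson counting statistics, as examined in the study above, is to check the variance-to-mean ratio (Fano factor) of repeated counts, which should be close to 1 for a Poisson process. The sketch below is a generic illustration using simulated counts, not the NETL procedure:

```python
import math
import random
import statistics

def poisson_sample(lam, rng):
    """Draw one Poisson(lam) variate using Knuth's multiplication method."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def fano_factor(counts):
    """Variance-to-mean ratio: close to 1.0 for Poisson-distributed counts."""
    return statistics.variance(counts) / statistics.mean(counts)

rng = random.Random(42)
# 5000 repeated counting intervals with a mean of 100 counts each
counts = [poisson_sample(100.0, rng) for _ in range(5000)]
fano = fano_factor(counts)
print(round(fano, 2))  # a value near 1 supports the Poisson assumption
```

A Fano factor well above 1 (overdispersion) would signal, for example, dead-time or pile-up effects distorting the counting statistics.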
Sex differences in discriminative power of volleyball game-related statistics.
João, Paulo Vicente; Leite, Nuno; Mesquita, Isabel; Sampaio, Jaime
2010-12-01
To identify sex differences in volleyball game-related statistics, the game-related statistics of several World Championships in 2007 (N=132) were analyzed using the software VIS from the International Volleyball Federation. Discriminant analysis was used to identify the game-related statistics which best discriminated performances by sex. The analysis yielded an emphasis on fault serves (SC = -.40), shot spikes (SC = .40), and reception digs (SC = .31). Considerable variability was evident in the game-related statistics profiles: men's volleyball games were better associated with terminal actions (errors of service), whereas women's volleyball games were characterized by continuous actions (in defense and attack). These differences may be related to the anthropometric and physiological differences between women and men and their influence on performance profiles.
A Statistical Graphical Model of the California Reservoir System
Taeb, A.; Reager, J. T.; Turmon, M.; Chandrasekaran, V.
2017-11-01
The recent California drought has highlighted the potential vulnerability of the state's water management infrastructure to multiyear dry intervals. Due to the high complexity of the network, dynamic storage changes in California reservoirs on a state-wide scale have previously been difficult to model using either traditional statistical or physical approaches. Indeed, although there is a significant line of research on exploring models for single (or a small number of) reservoirs, these approaches are not amenable to a system-wide modeling of the California reservoir network due to the spatial and hydrological heterogeneities of the system. In this work, we develop a state-wide statistical graphical model to characterize the dependencies among a collection of 55 major California reservoirs across the state; this model is defined with respect to a graph in which the nodes index reservoirs and the edges specify the relationships or dependencies between reservoirs. We obtain and validate this model in a data-driven manner based on reservoir volumes over the period 2003-2016. A key feature of our framework is a quantification of the effects of external phenomena that influence the entire reservoir network. We further characterize the degree to which physical factors (e.g., state-wide Palmer Drought Severity Index (PDSI), average temperature, snow pack) and economic factors (e.g., consumer price index, number of agricultural workers) explain these external influences. As a consequence of this analysis, we obtain a system-wide health diagnosis of the reservoir network as a function of PDSI.
The Norwegian research and innovation system - statistics and indicators 2003
International Nuclear Information System (INIS)
2003-01-01
This is the fourth report in a series from the Research Council of Norway. The report shows the extent of resource use in research, development and innovation and presents the results of these activities. The R and D and innovation statistics for 2001 are used as a basis, along with other statistics and analyses. The report contains time series and international comparisons. The aim of the report is to present a collective survey of the state and development of activities in Norway within research, innovation, science and technology. This includes data regarding costs and financing of R and D work, human resources, cooperation relations and results from R and D and innovation activities, including publishing and citations, patenting and trade balances. The report opens with a research policy article about research as a basis for new business. Furthermore, several "focus boxes" are included that indicate the development of science and technology indicators within various themes. In the 2003 report, the EU's central indicator pairs for benchmarking are included for the first time, and a survey is made of public investigations, white papers and parliamentary proposals within research, higher education and innovation. For the second time, a short English version is included.
He, Ping
2012-01-01
The long-standing puzzle surrounding the statistical mechanics of self-gravitating systems has not yet been solved successfully. We formulate a systematic theoretical framework of entropy-based statistical mechanics for spherically symmetric collisionless self-gravitating systems. We use an approach that is very different from that of the conventional statistical mechanics of short-range interaction systems. We demonstrate that the equilibrium states of self-gravitating systems consist of both mechanical and statistical equilibria, with the former characterized by a series of velocity-moment equations and the latter by statistical equilibrium equations, which should be derived from the entropy principle. The velocity-moment equations of all orders are derived from the steady-state collisionless Boltzmann equation. We point out that the ergodicity is invalid for the whole self-gravitating system, but it can be re-established locally. Based on the local ergodicity, using Fermi-Dirac-like statistics, with the non-degenerate condition and the spatial independence of the local microstates, we rederive the Boltzmann-Gibbs entropy. This is consistent with the validity of the collisionless Boltzmann equation, and should be the correct entropy form for collisionless self-gravitating systems. Apart from the usual constraints of mass and energy conservation, we demonstrate that the series of moment or virialization equations must be included as additional constraints on the entropy functional when performing the variational calculus; this is an extension to the original prescription by White & Narayan. Any possible velocity distribution can be produced by the statistical-mechanical approach that we have developed with the extended Boltzmann-Gibbs/White-Narayan statistics. Finally, we discuss the questions of negative specific heat and ensemble inequivalence for self-gravitating systems.
Statistical mechanics of lattice systems a concrete mathematical introduction
Friedli, Sacha
2017-01-01
This motivating textbook gives a friendly, rigorous introduction to fundamental concepts in equilibrium statistical mechanics, covering a selection of specific models, including the Curie–Weiss and Ising models, the Gaussian free field, O(n) models, and models with Kać interactions. Using classical concepts such as Gibbs measures, pressure, free energy, and entropy, the book exposes the main features of the classical description of large systems in equilibrium, in particular the central problem of phase transitions. It treats such important topics as the Peierls argument, the Dobrushin uniqueness, Mermin–Wagner and Lee–Yang theorems, and develops from scratch such workhorses as correlation inequalities, the cluster expansion, Pirogov–Sinai Theory, and reflection positivity. Written as a self-contained course for advanced undergraduate or beginning graduate students, the detailed explanations, large collection of exercises (with solutions), and appendix of mathematical results and concepts also make i...
Statistical Outlier Detection for Jury Based Grading Systems
DEFF Research Database (Denmark)
Thompson, Mary Kathryn; Clemmensen, Line Katrine Harder; Rosas, Harvey
2013-01-01
This paper presents an algorithm that was developed to identify statistical outliers from the scores of grading jury members in a large project-based first-year design course. The background and requirements for the outlier detection system are presented. The outlier detection algorithm and the follow-up procedures for score validation and appeals are described in detail. Finally, the impact of various elements of the outlier detection algorithm, their interactions, and the sensitivity of their numerical values are investigated. It is shown that the difference in the mean score produced by a grading jury before and after a suspected outlier is removed from the mean is the single most effective criterion for identifying potential outliers, but that all of the criteria included in the algorithm have an effect on the outlier detection process.
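The leave-one-out mean-shift criterion described above can be sketched as follows. This is an illustrative reconstruction, and the `threshold` parameter is a made-up value, not the course's actual cutoff:

```python
def mean_shift_outliers(scores, threshold=0.5):
    """Flag jury scores whose removal shifts the jury mean by more than
    `threshold` points (leave-one-out mean-shift criterion)."""
    n = len(scores)
    total = sum(scores)
    full_mean = total / n
    flagged = []
    for i, s in enumerate(scores):
        loo_mean = (total - s) / (n - 1)  # jury mean with score i removed
        if abs(full_mean - loo_mean) > threshold:
            flagged.append(i)
    return flagged

# A jury of five where one member scores far from the consensus:
print(mean_shift_outliers([7.0, 7.5, 8.0, 7.2, 2.0]))  # -> [4]
```

A flagged score would then go through the validation and appeals procedures rather than being discarded automatically.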
Statistical characterization of speckle noise in coherent imaging systems
Yaroslavsky, Leonid; Shefler, A.
2003-05-01
Speckle noise imposes a fundamental limitation on image quality in coherent-radiation-based imaging and optical metrology systems. Speckle noise phenomena are associated with the property of objects to diffusely scatter irradiation and with the fact that, in recording the wave field, a number of signal distortions inevitably occur due to technical limitations inherent to hologram sensors. The statistical theory of speckle noise was developed with regard only to the limited resolving power of coherent imaging devices. It is valid only asymptotically, insofar as the central limit theorem of probability theory can be applied, and in applications this assumption does not always hold. Moreover, in treating the speckle noise problem one should also consider other sources of hologram deterioration. In this paper, the statistical properties of speckle due to the limitation of hologram size, dynamic range and hologram signal quantization are studied by Monte Carlo simulation for holograms recorded in the near and far diffraction zones. The simulation experiments have shown that, for limited resolving power of the imaging system, the widely accepted opinion that speckle contrast is equal to one holds only for a rather severe level of hologram size limitation. For moderate limitations, speckle contrast changes gradually from zero for no limitation to one for limitation to less than about 20% of the hologram size. The results obtained for the limitation of the hologram sensor's dynamic range and hologram signal quantization reveal that speckle noise due to these hologram signal distortions is not multiplicative and is directly associated with the severity of the limitation and quantization. On the basis of the simulation results, analytical models are suggested.
Asymptotic expansion and statistical description of turbulent systems
International Nuclear Information System (INIS)
Hagan, W.K. III.
1986-01-01
A new approach to studying turbulent systems is presented in which an asymptotic expansion of the general dynamical equations is performed prior to the application of statistical methods for describing the evolution of the system. This approach has been applied to two specific systems: anomalous drift wave turbulence in plasmas and homogeneous, isotropic turbulence in fluids. For the plasma case, the time and length scales of the turbulent state result in the asymptotic expansion of the Vlasov/Poisson equations taking the form of nonlinear gyrokinetic theory. Questions regarding this theory and modern Hamiltonian perturbation methods are discussed and resolved. A new alternative Hamiltonian method is described. The Eulerian Direct Interaction Approximation (EDIA) is slightly reformulated and applied to the equations of nonlinear gyrokinetic theory. Using a similarity transformation technique, expressions for the thermal diffusivity are derived from the EDIA equations for various geometries, including a tokamak. In particular, the unique result for generalized geometry may be of use in evaluating fusion reactor designs and theories of anomalous thermal transport in tokamaks. Finally, a new and useful property of the EDIA is pointed out. For the fluid case, an asymptotic expansion is applied to the Navier-Stokes equation and the results lead to the speculation that such an approach may resolve the problem of predicting the Kolmogorov inertial range energy spectrum for homogeneous, isotropic turbulence. 45 refs., 3 figs
Statistically validated network of portfolio overlaps and systemic risk.
Gualdi, Stanislao; Cimini, Giulio; Primicerio, Kevin; Di Clemente, Riccardo; Challet, Damien
2016-12-21
Common asset holding by financial institutions (portfolio overlap) is nowadays regarded as an important channel for financial contagion with the potential to trigger fire sales and severe losses at the systemic level. We propose a method to assess the statistical significance of the overlap between heterogeneously diversified portfolios, which we use to build a validated network of financial institutions where links indicate potential contagion channels. The method is implemented on a historical database of institutional holdings ranging from 1999 to the end of 2013, but can be applied to any bipartite network. We find that the proportion of validated links (i.e. of significant overlaps) increased steadily before the 2007-2008 financial crisis and reached a maximum when the crisis occurred. We argue that the nature of this measure implies that systemic risk from fire sales liquidation was maximal at that time. After a sharp drop in 2008, systemic risk resumed its growth in 2009, with a notable acceleration in 2013. We finally show that market trends tend to be amplified in the portfolios identified by the algorithm, such that it is possible to have an informative signal about institutions that are about to suffer (enjoy) the most significant losses (gains).
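As a rough illustration of testing the significance of a portfolio overlap, one can use a hypergeometric null model in which two portfolios are drawn uniformly at random from the asset universe. This is a generic sketch; the paper's actual null model for heterogeneously diversified portfolios is more elaborate:

```python
from math import comb

def overlap_pvalue(n_assets, d1, d2, overlap):
    """P(X >= overlap) for the number of shared assets X when two
    portfolios of sizes d1 and d2 are drawn uniformly at random from a
    universe of n_assets securities (hypergeometric null model)."""
    total = comb(n_assets, d1)
    p = 0.0
    for k in range(overlap, min(d1, d2) + 1):
        p += comb(d2, k) * comb(n_assets - d2, d1 - k) / total
    return p

# Two funds holding 10 of 1000 securities each and sharing 3 of them:
p = overlap_pvalue(1000, 10, 10, 3)
print(p < 0.01)  # -> True: such an overlap is unlikely to arise by chance
```

In a validated-network construction, a link between two institutions would be kept only when this p-value falls below a (multiple-testing-corrected) significance level.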
Correcting the Count: Improving Vital Statistics Data Regarding Deaths Related to Obesity.
McCleskey, Brandi C; Davis, Gregory G; Dye, Daniel W
2017-11-15
Obesity can involve any organ system and compromise the overall health of an individual, including causing premature death. Despite the increased risk of death associated with being obese, obesity itself is infrequently indicated on the death certificate. We performed an audit of our records to identify how often "obesity" was listed on the death certificate in order to determine how our practices affected national mortality data collection regarding obesity-related mortality. Over a span of nearly 25 years, obesity was cited as causing or contributing to death in 0.2% of deaths. Over the course of 5 years, 96% of selected natural deaths were likely underreported as being associated with obesity. We present an algorithm for certifiers to use to determine whether obesity should be listed on the death certificate and guidelines for certifying cases in which this is appropriate. Use of this algorithm will improve vital statistics concerning the role of obesity in causing or contributing to death. © 2017 American Academy of Forensic Sciences.
Goyal, Shrigopal; Balhara, Yatan Pal Singh; Khandelwal, S K
2012-07-01
Two of the most commonly used nosological systems, the International Statistical Classification of Diseases and Related Health Problems (ICD-10) and the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV), are under revision. This process has generated a lot of interesting debates with regard to the future of the current diagnostic categories. In fact, the status of the categorical approach in the upcoming versions of ICD and DSM is also being debated. The current article focuses on the debate with regard to eating disorders. The existing classification of eating disorders has been criticized for its limitations. A host of new diagnostic categories have been recommended for inclusion in the upcoming revisions, and the structure of the existing categories has also been put under scrutiny.
A new Markov-chain-related statistical approach for modelling synthetic wind power time series
International Nuclear Information System (INIS)
Pesch, T; Hake, J F; Schröders, S; Allelein, H J
2015-01-01
The integration of rising shares of volatile wind power in the generation mix is a major challenge for the future energy system. To address the uncertainties involved in wind power generation, models analysing and simulating the stochastic nature of this energy source are becoming increasingly important. One statistical approach that has been frequently used in the literature is the Markov chain approach. Recently, the method was identified as being of limited use for generating wind time series with time steps shorter than 15–40 min as it is not capable of reproducing the autocorrelation characteristics accurately. This paper presents a new Markov-chain-related statistical approach that is capable of solving this problem by introducing a variable second lag. Furthermore, additional features are presented that allow for the further adjustment of the generated synthetic time series. The influences of the model parameter settings are examined by meaningful parameter variations. The suitability of the approach is demonstrated by an application analysis with the example of the wind feed-in in Germany. It shows that—in contrast to conventional Markov chain approaches—the generated synthetic time series do not systematically underestimate the required storage capacity to balance wind power fluctuation. (paper)
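The core idea of a Markov-chain generator conditioned on two lags can be sketched as follows. This is a toy illustration with a hypothetical discretisation of the power level into three states, not the paper's model:

```python
import random
from collections import defaultdict

def fit_second_order(series):
    """Count transitions conditioned on the last two states (two lags)."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b, c in zip(series, series[1:], series[2:]):
        counts[(a, b)][c] += 1
    return counts

def generate(counts, start, length, rng):
    """Sample a synthetic series from the fitted second-order chain."""
    out = list(start)
    while len(out) < length:
        dist = counts[(out[-2], out[-1])]
        if not dist:  # state pair never observed with a successor
            break
        states = list(dist)
        weights = [dist[s] for s in states]
        out.append(rng.choices(states, weights=weights)[0])
    return out

rng = random.Random(0)
# Hypothetical discretised wind-power states: 0 = low, 1 = medium, 2 = high.
train = [rng.choice([0, 1, 1, 2]) for _ in range(2000)]
model = fit_second_order(train)
synthetic = generate(model, train[:2], 500, rng)
print(len(synthetic), sorted(set(synthetic)))
```

Conditioning on two lags rather than one is what lets such a chain reproduce short-time autocorrelation better than a first-order model; a real application would fit the discretisation and lag structure to measured feed-in data.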
Statistically qualified neuro-analytic failure detection method and system
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
2002-03-02
An apparatus and method for monitoring a process involve the development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation and stochastic modification of the deterministic model. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation-error minimization technique. Stochastic modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates the measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. As an illustration of the method and apparatus, the method is applied to a peristaltic pump system.
A Statistic-Based Calibration Method for TIADC System
Directory of Open Access Journals (Sweden)
Kuojun Yang
2015-01-01
Time-interleaved operation is widely used to increase the sampling rate of analog-to-digital converters (ADCs). However, channel mismatches degrade the performance of a time-interleaved ADC (TIADC). Therefore, a statistic-based calibration method for TIADCs is proposed in this paper. The average value of the sampling points is utilized to calculate the offset error, and the summation of the sampling points is used to calculate the gain error. After the offset and gain errors are obtained, they are calibrated by offset and gain adjustment elements in the ADC. Timing skew is calibrated by an iterative method, with the product of the sampling points of two adjacent sub-channels used as a metric for calibration. The proposed method is employed to calibrate mismatches in a four-channel 5 GS/s TIADC system. Simulation results show that the proposed method can estimate mismatches accurately in a wide frequency range, and that an accurate estimation can be obtained even if the signal-to-noise ratio (SNR) of the input signal is 20 dB. Furthermore, the results obtained from a real four-channel 5 GS/s TIADC system demonstrate the effectiveness of the proposed method: the spectral spurs due to mismatches are effectively eliminated after calibration.
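The offset- and gain-estimation idea, estimating the offset from channel means and the gain from signal magnitudes, can be sketched for two channels as follows. This is a simplified illustration assuming a zero-mean input signal; the paper's iterative timing-skew calibration is omitted:

```python
import math

def estimate_offset_gain(channel, reference):
    """Offset error: difference of channel means (zero-mean input assumed).
    Gain error: ratio of mean absolute values after offset removal."""
    mean_c = sum(channel) / len(channel)
    mean_r = sum(reference) / len(reference)
    offset = mean_c - mean_r
    mav_c = sum(abs(x - mean_c) for x in channel) / len(channel)
    mav_r = sum(abs(x - mean_r) for x in reference) / len(reference)
    gain = mav_c / mav_r
    return offset, gain

# The reference channel samples a sine; the second channel sees the same
# waveform with a 0.1 offset error and a 1.05 gain error applied.
n = 4096
ref = [math.sin(2 * math.pi * 37 * i / n) for i in range(n)]
ch = [1.05 * x + 0.1 for x in ref]
offset, gain = estimate_offset_gain(ch, ref)
print(round(offset, 3), round(gain, 3))  # -> 0.1 1.05
```

Once estimated, these errors would be removed by the ADC's offset and gain adjustment elements before the timing-skew iteration runs.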
Statistical and dynamical remastering of classic exoplanet systems
Nelson, Benjamin Earl
The most powerful constraints on planet formation will come from characterizing the dynamical state of complex multi-planet systems. Unfortunately, with that complexity comes a number of factors that make analyzing these systems a computationally challenging endeavor: the sheer number of model parameters, an awkwardly shaped posterior distribution, and hundreds to thousands of time-series measurements. In this dissertation, I review our efforts to improve the statistical analyses of radial velocity (RV) data and their applications to some renowned, dynamically complex exoplanet systems. In the first project (Chapters 2 and 4), we develop a differential evolution Markov chain Monte Carlo (RUN DMC) algorithm to tackle the aforementioned difficult aspects of data analysis. We test the robustness of the algorithm with regard to the number of modeled planets (model dimensionality) and increasing dynamical strength. We apply RUN DMC to a couple of classic multi-planet systems and one highly debated system from radial velocity surveys. In the second project (Chapter 5), we analyze RV data of 55 Cancri, a wide binary system known to harbor five planets orbiting the primary. We find the innermost planet "e" must be coplanar to within 40 degrees of the outer planets, otherwise Kozai-like perturbations will cause the planet to enter the stellar photosphere through its periastron passage. We find the orbits of planets "b" and "c" are apsidally aligned and librating with low to median amplitude (50+/-6 10 degrees), but they are not orbiting in a mean-motion resonance. In the third project (Chapters 3, 4, 6), we analyze RV data of Gliese 876, a four-planet system with three planets participating in a multi-body resonance, i.e., a Laplace resonance. From a combined observational and statistical analysis computing Bayes factors, we find a four-planet model is favored over one with three planets. Conditioned on this preferred model, we meaningfully constrain the three-dimensional orbital
Statistical data processing with automatic system for environmental radiation monitoring
International Nuclear Information System (INIS)
Zarkh, V.G.; Ostroglyadov, S.V.
1986-01-01
The practice of statistical data processing for radiation monitoring is illustrated and some of the results obtained are presented. Experience in the practical application of mathematical statistics to radiation monitoring data made it possible to develop a concrete statistical processing algorithm implemented on an M-6000 minicomputer. The algorithm is divided into three parts: parametric data processing and hypothesis testing, pair correlation analysis, and multiple correlation analysis. The statistical processing programs operate in dialogue mode. The algorithm was used to process observation data from a radioactive waste disposal control region, and results of the processing of surface water monitoring data are presented
Parallelization of the Physical-Space Statistical Analysis System (PSAS)
Larson, J. W.; Guo, J.; Lyster, P. M.
1999-01-01
Atmospheric data assimilation is a method of combining observations with model forecasts to produce a more accurate description of the atmosphere than the observations or forecast alone can provide. Data assimilation plays an increasingly important role in the study of climate and atmospheric chemistry. The NASA Data Assimilation Office (DAO) has developed the Goddard Earth Observing System Data Assimilation System (GEOS DAS) to create assimilated datasets. The core computational components of the GEOS DAS include the GEOS General Circulation Model (GCM) and the Physical-space Statistical Analysis System (PSAS). The need for timely validation of scientific enhancements to the data assimilation system poses computational demands that are best met by distributed parallel software. PSAS is implemented in Fortran 90 using object-based design principles. The analysis portions of the code solve two equations. The first of these is the "innovation" equation, which is solved on the unstructured observation grid using a preconditioned conjugate gradient (CG) method. The "analysis" equation is a transformation from the observation grid back to a structured grid, and is solved by a direct matrix-vector multiplication. Use of a factored-operator formulation reduces the computational complexity of both the CG solver and the matrix-vector multiplication, rendering the matrix-vector multiplications as a successive product of operators on a vector. Sparsity is introduced to these operators by partitioning the observations using an icosahedral decomposition scheme. PSAS builds a large (approx. 128MB) run-time database of parameters used in the calculation of these operators. Implementing a message passing parallel computing paradigm into an existing yet developing computational system as complex as PSAS is nontrivial. One of the technical challenges is balancing the requirements for computational reproducibility with the need for high performance. The problem of computational
Statistical Physics of Neural Systems with Nonadditive Dendritic Coupling
Directory of Open Access Journals (Sweden)
David Breuer
2014-03-01
How neurons process their inputs crucially determines the dynamics of biological and artificial neural networks. In such neural and neural-like systems, synaptic input is typically considered to be merely transmitted linearly or sublinearly by the dendritic compartments. Yet, single-neuron experiments report pronounced supralinear dendritic summation of sufficiently synchronous and spatially close-by inputs. Here, we provide a statistical physics approach to study the impact of such nonadditive dendritic processing on single-neuron responses and the performance of associative-memory tasks in artificial neural networks. First, we compute the effect of random input to a neuron incorporating nonlinear dendrites. This approach is independent of the details of the neuronal dynamics. Second, we use those results to study the impact of dendritic nonlinearities on the network dynamics in a paradigmatic model for associative memory, both numerically and analytically. We find that dendritic nonlinearities maintain network convergence and increase the robustness of memory performance against noise. Interestingly, an intermediate number of dendritic branches is optimal for memory functionality.
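The contrast between linear and supralinear dendritic summation can be illustrated with a toy neuron. The threshold-plus-boost nonlinearity below is a common schematic simplification, not the paper's model, and `theta` and `boost` are made-up parameters:

```python
def linear_neuron(inputs):
    """Classic point neuron: output is the plain sum of synaptic inputs."""
    return sum(inputs)

def dendritic_neuron(branches, theta=2.0, boost=1.5):
    """Each dendritic branch sums its own inputs; a branch whose local sum
    exceeds `theta` fires a dendritic spike that boosts its contribution
    (supralinear summation of synchronous, close-by inputs)."""
    total = 0.0
    for branch in branches:
        s = sum(branch)
        total += boost * s if s > theta else s
    return total

# The same four unit inputs: scattered across branches vs clustered on one.
scattered = [[1.0], [1.0], [1.0], [1.0]]
clustered = [[1.0, 1.0, 1.0, 1.0]]
print(dendritic_neuron(scattered), dendritic_neuron(clustered))  # -> 4.0 6.0
```

The clustered input pattern produces a larger response than the same total input spread across branches, which is the nonadditive effect the abstract describes.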
Decision Support Systems: Applications in Statistics and Hypothesis Testing.
Olsen, Christopher R.; Bozeman, William C.
1988-01-01
Discussion of the selection of appropriate statistical procedures by educators highlights a study conducted to investigate the effectiveness of decision aids in facilitating the use of appropriate statistics. Experimental groups and a control group using a printed flow chart, a computer-based decision aid, and a standard text are described. (11…
Bridging Weighted Rules and Graph Random Walks for Statistical Relational Models
Directory of Open Access Journals (Sweden)
Seyed Mehran Kazemi
2018-02-01
The aim of statistical relational learning is to learn statistical models from relational or graph-structured data. Three main statistical relational learning paradigms include weighted rule learning, random walks on graphs, and tensor factorization. These paradigms have been mostly developed and studied in isolation for many years, with few works attempting at understanding the relationship among them or combining them. In this article, we study the relationship between the path ranking algorithm (PRA), one of the most well-known relational learning methods in the graph random walk paradigm, and relational logistic regression (RLR), one of the recent developments in weighted rule learning. We provide a simple way to normalize relations and prove that relational logistic regression using normalized relations generalizes the path ranking algorithm. This result provides a better understanding of relational learning, especially for the weighted rule learning and graph random walk paradigms. It opens up the possibility of using the more flexible RLR rules within PRA models and even generalizing both by including normalized and unnormalized relations in the same model.
Current statistical tools, systems and bodies concerned with safety and accident statistics.
Koornstra, M.J.
1996-01-01
There are a wide range of differences in the methods used nationally to classify and record road accidents. The current use of road safety information systems and the few systems available for international use are discussed. Recommendations are made for a more efficient, less costly, and improved
Statistical mechanics of homogeneous partly pinned fluid systems.
Krakoviack, Vincent
2010-12-01
The homogeneous partly pinned fluid systems are simple models of a fluid confined in a disordered porous matrix obtained by arresting randomly chosen particles in a one-component bulk fluid or one of the two components of a binary mixture. In this paper, their configurational properties are investigated. It is shown that a peculiar complementarity exists between the mobile and immobile phases, which originates from the fact that the solid is prepared in the presence of, and in equilibrium with, the adsorbed fluid. Simple identities follow, which connect different types of configurational averages, either relative to the fluid-matrix system or to the bulk fluid from which it is prepared. Crucial simplifications result for the computation of important structural quantities, both in computer simulations and in theoretical approaches. Finally, possible applications of the model in the field of dynamics in confinement or in strongly asymmetric mixtures are suggested.
Nonequilibrium statistical mechanics in the general theory of relativity. I. A general formalism
International Nuclear Information System (INIS)
Israel, W.; Kandrup, H.E.
1984-01-01
This is the first in a series of papers, the overall objective of which is the formulation of a new covariant approach to nonequilibrium statistical mechanics in classical general relativity. The object here is the development of a tractable theory for self-gravitating systems. It is argued that the ''state'' of an N-particle system may be characterized by an N-particle distribution function, defined in an 8N-dimensional phase space, which satisfies a collection of N conservation equations. By mapping the true physics onto a fictitious ''background'' spacetime, which may be chosen to satisfy some ''average'' field equations, one then obtains a useful covariant notion of ''evolution'' in response to a fluctuating ''gravitational force.'' For many cases of practical interest, one may suppose (i) that these fluctuating forces satisfy linear field equations and (ii) that they may be modeled by a direct interaction. In this case, one can use a relativistic projection operator formalism to derive exact closed equations for the evolution of such objects as an appropriately defined reduced one-particle distribution function. By capturing, in a natural way, the notion of a dilute gas, or impulse, approximation, one is then led to a comparatively simple equation for the one-particle distribution. If, furthermore, one treats the effects of the fluctuating forces as ''localized'' in space and time, one obtains a tractable kinetic equation which reduces, in the Newtonian limit, to the standard Landau equation.
Abdullah, Rusli; Samah, Bahaman Abu; Bolong, Jusang; D'Silva, Jeffrey Lawrence; Shaffril, Hayrol Azril Mohamed
2014-09-01
Today, teaching and learning (T&L) using technology as a tool is becoming more important, especially in the field of statistics as a subject in the higher-education environment. Even though there are many types of statistical learning tool (SLT) technologies that can be used to support and enhance the T&L environment, there is a lack of a common standard knowledge-management portal for guidance, especially regarding the infrastructure requirements of SLT in serving the community of users (CoU) such as educators, students and other parties interested in using this technology as a tool for their T&L. Therefore, a common standard infrastructure requirement for a knowledge portal is needed to help the CoU manage statistical knowledge through acquiring, storing, disseminating and applying it for their specific purposes. Furthermore, having this infrastructure-requirement model of a knowledge portal for SLT as a guide for promoting knowledge of best practice among the CoU can also enhance the quality and productivity of their work towards excellence in the application of statistical knowledge in the education-system environment.
STATLIB, Interactive Statistics Program Library of Tutorial System
International Nuclear Information System (INIS)
Anderson, H.E.
1986-01-01
1 - Description of program or function: STATLIB is a conversational statistical program library developed in conjunction with a Sandia National Laboratories applied statistics course intended for practicing engineers and scientists. STATLIB is a group of 15 interactive, argument-free, statistical routines. Included are analysis of sensitivity tests; sample statistics for the normal, exponential, hypergeometric, Weibull, and extreme value distributions; three models of multiple regression analysis; x-y data plots; exact probabilities for RxC tables; n sets of m permuted integers in the range 1 to m; simple linear regression and correlation; K different random integers in the range m to n; and Fisher's exact test of independence for a 2 by 2 contingency table. Forty-five other subroutines in the library support the basic 15 routines.
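STATLIB itself is an interactive Fortran-era library, but one routine it lists, Fisher's exact test of independence for a 2 by 2 contingency table, is easy to sketch with modern tools. The counts below are invented for illustration, and `scipy.stats.fisher_exact` merely stands in for the STATLIB routine.

```python
from scipy.stats import fisher_exact

# Hypothetical pass/fail counts for two production lots (not STATLIB data).
table = [[8, 2],
         [1, 9]]

# Two-sided Fisher exact test of independence for the 2x2 table.
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(odds_ratio, round(p_value, 4))
```

The exact test is preferable to a chi-squared approximation here because several cell counts are small.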
Thermal and statistical properties of nuclei and nuclear systems
International Nuclear Information System (INIS)
Moretto, L.G.; Wozniak, G.J.
1989-07-01
The terms statistical decay, statistical or thermodynamic equilibrium, thermalization, temperature, etc., have been used in nuclear physics since the introduction of the compound nucleus (CN) concept, and they are still used, perhaps even more frequently, in the context of intermediate- and high-energy heavy-ion reactions. Unfortunately, the increased popularity of these terms has not made them any clearer, and more often than not one encounters sweeping statements about the alleged statisticity of a nuclear process where the ''statistical'' connotation is a more apt description of the state of the speaker's mind than of the nuclear reaction. It is our goal, in this short set of lectures, to set at least some ideas straight on this broad and beautiful subject, on the one hand by clarifying some fundamental concepts, on the other by presenting some interesting applications to actual physical cases. 74 refs., 38 figs.
MANAGERIAL DECISION IN INNOVATIVE EDUCATION SYSTEMS STATISTICAL SURVEY BASED ON SAMPLE THEORY
Directory of Open Access Journals (Sweden)
Gheorghe SĂVOIU
2012-12-01
Full Text Available Before formulating the statistical hypotheses and the econometric testing itself, a breakdown of some of the technical issues is required. These relate to managerial decision-making in innovative educational systems and to the educational managerial phenomenon tested through statistical and mathematical methods, namely the significant differences in the perception of current and desirable qualities, knowledge, experience, behaviour and health, obtained through a questionnaire applied at the end to a stratified population in the educational environment, consisting of staff with either educational activities only or with simultaneous managerial and educational activities. The details of the research, which focused on survey theory and turned the questionnaires, and the statistical data processed from them, into working tools, are summarized below.
Statistical estimation Monte Carlo for unreliability evaluation of highly reliable system
International Nuclear Information System (INIS)
Xiao Gang; Su Guanghui; Jia Dounan; Li Tianduo
2000-01-01
Based on analog Monte Carlo simulation, statistical estimation Monte Carlo methods for the unreliability evaluation of highly reliable systems are constructed, including a direct statistical estimation Monte Carlo method and a weighted statistical estimation Monte Carlo method. The basic elements are given, and the statistical estimation Monte Carlo estimators are derived. The direct Monte Carlo simulation method, the bounding-sampling method, the forced-transitions Monte Carlo method, and the direct and weighted statistical estimation Monte Carlo methods are used to evaluate the unreliability of the same system. By comparison, the weighted statistical estimation Monte Carlo estimator has the smallest variance and the highest computational efficiency.
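The contrast between analog (direct) and weighted estimators can be illustrated on a toy rare-event problem. The sketch below is not the paper's method, only a generic importance-sampling analogue: it estimates the failure probability P(T > a) for an exponential lifetime, where direct sampling almost never scores a failure but a reweighted (biased) sampling distribution does.

```python
import numpy as np

rng = np.random.default_rng(0)
a, n = 15.0, 100_000            # failure threshold; true P(T > a) = exp(-15)

# Direct (analog) Monte Carlo: the rare event is essentially never sampled.
p_direct = np.mean(rng.exponential(1.0, n) > a)

# Weighted estimator: sample from a heavier exponential (rate lam < 1)
# and multiply each score by the likelihood ratio f(x)/q(x).
lam = 1.0 / a
x = rng.exponential(1.0 / lam, n)
weights = np.exp(-x) / (lam * np.exp(-lam * x))
p_weighted = np.mean(weights * (x > a))

print(p_direct, p_weighted, np.exp(-a))
```

With these parameters the weighted estimate lands within a few percent of exp(-15) ≈ 3.1e-7, while the direct estimate is typically zero, which mirrors the variance advantage reported for the weighted estimator.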
Recent Advances in System Reliability Signatures, Multi-state Systems and Statistical Inference
Frenkel, Ilia
2012-01-01
Recent Advances in System Reliability discusses developments in modern reliability theory such as signatures, multi-state systems and statistical inference. It describes the latest achievements in these fields, and covers the application of these achievements to reliability engineering practice. The chapters cover a wide range of new theoretical subjects and have been written by leading experts in reliability theory and its applications. The topics include: concepts and different definitions of signatures (D-spectra), their properties and applications to reliability of coherent systems and network-type structures; Lz-transform of Markov stochastic process and its application to multi-state system reliability analysis; methods for cost-reliability and cost-availability analysis of multi-state systems; optimal replacement and protection strategy; and statistical inference. Recent Advances in System Reliability presents many examples to illustrate the theoretical results. Real world multi-state systems...
International Nuclear Information System (INIS)
Lim, Gyeong Hui
2008-03-01
This book consists of 15 chapters, which are basic conception and meaning of statistical thermodynamics, Maxwell-Boltzmann's statistics, ensemble, thermodynamics function and fluctuation, statistical dynamics with independent particle system, ideal molecular system, chemical equilibrium and chemical reaction rate in ideal gas mixture, classical statistical thermodynamics, ideal lattice model, lattice statistics and nonideal lattice model, imperfect gas theory on liquid, theory on solution, statistical thermodynamics of interface, statistical thermodynamics of a high molecule system and quantum statistics
Ranganathan, Priya; Pramesh, C. S.; Aggarwal, Rakesh
2016-01-01
In the previous article in this series on common pitfalls in statistical analysis, we looked at the difference between risk and odds. Risk, which refers to the probability of occurrence of an event or outcome, can be defined in absolute or relative terms. Understanding what these measures represent is essential for the accurate interpretation of study results. PMID:26952180
Wali, F.; Knotter, D. Martin; Wortelboer, Ronald; Mud, Auke
2007-01-01
Ultra-pure water supplied inside the fab is used in different tools at different stages of processing. Data on the particles measured in the ultra-pure water were compared with the defect density on wafers processed in these tools, and a statistical relation was found. Keywords: yield, defect density,
Directory of Open Access Journals (Sweden)
Yolanda Escalante
2012-09-01
Full Text Available The aims of this study were (i) to compare women's water polo game-related statistics by match outcome (winning and losing teams) and phase (preliminary, classificatory, and semi-final/bronze medal/gold medal), and (ii) to identify characteristics that discriminate performances in each phase. The game-related statistics of the 124 women's matches played in five International Championships (World and European Championships) were analyzed. Differences between winning and losing teams in each phase were determined using the chi-squared test. A discriminant analysis was then performed according to context in each of the three phases. It was found that the game-related statistics differentiate the winning from the losing teams in each phase of an international championship. The differentiating variables were both offensive (centre goals, power-play goals, counterattack goals, assists, offensive fouls, steals, blocked shots, and won sprints) and defensive (goalkeeper-blocked shots, goalkeeper-blocked inferiority shots, and goalkeeper-blocked 5-m shots). The discriminant analysis showed the game-related statistics to discriminate performance in all phases: preliminary, classificatory, and final (92%, 90%, and 83%, respectively). Two variables were discriminatory by match outcome (winning or losing teams) in all three phases: goals and goalkeeper-blocked shots.
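As a rough illustration of the chi-squared comparison used in the study (not its actual data), the sketch below tests whether goal-type counts are independent of match outcome with `scipy.stats.chi2_contingency`; every count is invented.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Invented goal counts by type for winning vs losing teams.
#                 centre  power-play  counterattack
table = np.array([[60,     45,         30],    # winners
                  [35,     40,         12]])   # losers

chi2, p, dof, expected = chi2_contingency(table)
print(round(chi2, 2), round(p, 3), dof)
```

A small p-value would indicate that the distribution of goal types differs between winners and losers, which is the kind of difference the paper reports phase by phase.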
Statistical language learning in neonates revealed by event-related brain potentials
Directory of Open Access Journals (Sweden)
Näätänen Risto
2009-03-01
Full Text Available Background: Statistical learning is a candidate for one of the basic prerequisites underlying the expeditious acquisition of spoken language. Infants from 8 months of age exhibit this form of learning to segment fluent speech into distinct words. To test statistical learning skills at birth, we recorded event-related brain responses of sleeping neonates while they were listening to a stream of syllables containing statistical cues to word boundaries. Results: We found evidence that sleeping neonates are able to automatically extract statistical properties of the speech input and thus detect the word boundaries in a continuous stream of syllables containing no morphological cues. Syllable-specific event-related brain responses found in two separate studies demonstrated that the neonatal brain treated the syllables differently according to their position within pseudowords. Conclusion: These results demonstrate that neonates can efficiently learn transitional probabilities or frequencies of co-occurrence between different syllables, enabling them to detect word boundaries and in this way isolate single words out of fluent natural speech. The ability to adopt statistical structures from speech may play a fundamental role as one of the earliest prerequisites of language acquisition.
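The transitional-probability cue the abstract describes can be sketched in a few lines: concatenate invented pseudowords into a stream and measure how often each syllable predicts the next. The pseudowords below are hypothetical, not the study's stimuli.

```python
import random
from collections import Counter

# Invented pseudowords made of two-letter "syllables".
words = ["tupiro", "golabu", "bidaku"]

def syllables(w):
    return [w[i:i + 2] for i in range(0, len(w), 2)]

random.seed(1)
stream = []
for _ in range(300):                      # concatenate words at random
    stream += syllables(random.choice(words))

# Transitional probability TP(x -> y) = count(x, y) / count(x).
pairs = Counter(zip(stream, stream[1:]))
firsts = Counter(stream[:-1])
tp = {pair: c / firsts[pair[0]] for pair, c in pairs.items()}

print(tp[("tu", "pi")])                   # within-word transition
print(tp.get(("ro", "go"), 0.0))          # across a word boundary
```

Within-word transitions have TP = 1.0 by construction, while boundary transitions sit near 1/3; that dip is exactly the statistical cue a learner can exploit to segment the stream.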
Statistical cluster analysis and diagnosis of nuclear system level performance
International Nuclear Information System (INIS)
Teichmann, T.; Levine, M.M.; Samanta, P.K.; Kato, W.Y.
1985-01-01
The complexity of individual nuclear power plants and the importance of maintaining reliable and safe operations makes it desirable to complement the deterministic analyses of these plants by corresponding statistical surveys and diagnoses. Based on such investigations, one can then explore, statistically, the anticipation, prevention, and when necessary, the control of such failures and malfunctions. This paper, and the accompanying one by Samanta et al., describe some of the initial steps in exploring the feasibility of setting up such a program on an integrated and global (industry-wide) basis. The conceptual statistical and data framework was originally outlined in BNL/NUREG-51609, NUREG/CR-3026, and the present work aims at showing how some important elements might be implemented in a practical way (albeit using hypothetical or simulated data)
The star-triangle relation and the inversion relation in statistical mechanics
International Nuclear Information System (INIS)
Maillard, J.M.
1983-10-01
The plan of this paper is the following: we give a definition of the star-triangle relation (S.T.R.); we also define another very simple relation which occurs simultaneously with the S.T.R. for the two-dimensional (2-d) exact models, the inversion relation (I.R.); we study the connection between the S.T.R. and the I.R., seeing that the S.T.R. is deeply connected to the I.R. but that, on the contrary, the I.R. can exist even when no S.T.R. exists, as we show for the 2-d anisotropic Potts model by exhibiting an inverse functional equation satisfied by the partition function; having recognized the I.R. as an interesting concept, we use it by looking at its analytical consequences; and, at last, we come back to the S.T.R., examining some consequences of the I.R. on the S.T.R.
International Nuclear Information System (INIS)
Huang Zhifu; Lin Bihong; Chen Jincan
2009-01-01
In order to overcome the limitations of the original expression of the probability distribution appearing in the literature on Incomplete Statistics, a new expression of the probability distribution is derived, where the Lagrange multiplier β introduced here is proved to be identical with that introduced in the second and third choices for the internal energy constraint in Tsallis' statistics and to be just equal to the physical inverse temperature. It is expounded that the probability distribution described by the new expression is invariant under uniform translation of the energy spectrum. Moreover, several fundamental thermodynamic relations are given and the relationship between the new and the original expressions of the probability distribution is discussed.
Statistical Modeling of Large Wind Plant System's Generation - A Case Study
International Nuclear Information System (INIS)
Sabolic, D.
2014-01-01
This paper presents simple yet very accurate descriptive statistical models of various static and dynamic parameters of the energy output from a large system of wind plants operated by the Bonneville Power Administration (BPA), USA. The system's size at the end of 2013 was 4515 MW of installed capacity. The 5-minute readings from the beginning of 2007 to the end of 2013, recorded and published by BPA, were used to derive a number of experimental distributions, which were then used to devise theoretical statistical models with merely one or two parameters. In spite of their simplicity, they reproduced the experimental data with great accuracy, which was checked by rigorous goodness-of-fit tests. Statistical distribution functions were obtained for the following wind generation-related quantities: total generation as a percentage of total installed capacity; change in total generation power in 5, 10, 15, 20, 25, 30, 45, and 60 minutes as a percentage of total installed capacity; and duration of intervals with total generated power, expressed as a percentage of total installed capacity, lower than certain pre-specified levels. The limitation of total installed wind plant capacity, when it is determined by the regulation demand on wind plants, is also discussed. The models presented here can be utilized in analyses related to power system economics and policy, which is also briefly discussed in the paper. (author).
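The paper's one- or two-parameter distribution models and goodness-of-fit checks can be mimicked on synthetic data. The sketch below invents "generation" readings, fits a Weibull shape with `scipy.stats.weibull_min.fit`, and computes a Kolmogorov-Smirnov statistic as a fit check; none of the numbers come from the BPA data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic stand-in for 5-minute generation readings (fraction of capacity).
data = np.clip(rng.weibull(1.8, 2000) * 0.35, 0.0, 1.0)

# Fit a two-parameter Weibull (location pinned at zero).
shape, loc, scale = stats.weibull_min.fit(data, floc=0)

# Kolmogorov-Smirnov statistic as a simple goodness-of-fit check.
ks_stat, p_value = stats.kstest(data, "weibull_min", args=(shape, loc, scale))
print(round(shape, 2), round(scale, 2), round(ks_stat, 3))
```

Note that using the same sample both to fit the parameters and to compute the KS statistic biases the test's p-value; the statistic itself still gives a useful measure of discrepancy.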
Nonequilibrium work relation in a macroscopic system
International Nuclear Information System (INIS)
Sughiyama, Yuki; Ohzeki, Masayuki
2013-01-01
We reconsider a well-known relationship between the fluctuation theorem and the second law of thermodynamics by evaluating stochastic evolution of the density field (probability measure valued process). In order to establish a bridge between microscopic and macroscopic behaviors, we must take the thermodynamic limit of a stochastic dynamical system following the standard procedure in statistical mechanics. The thermodynamic path characterizing a dynamical behavior in the macroscopic scale can be formulated as an infimum of the action functional for the stochastic evolution of the density field. In our formulation, the second law of thermodynamics can be derived only by symmetry of the action functional without recourse to the Jarzynski equality. Our formulation leads to a nontrivial nonequilibrium work relation for metastable (quasi-stationary) states, which are peculiar in the macroscopic system. We propose a prescription for computing the free energy for metastable states based on the resultant work relation. (paper)
Statistical projection effects in a hydrodynamic pilot-wave system
Sáenz, Pedro J.; Cristea-Platon, Tudor; Bush, John W. M.
2018-03-01
Millimetric liquid droplets can walk across the surface of a vibrating fluid bath, self-propelled through a resonant interaction with their own guiding or `pilot' wave fields. These walking droplets, or `walkers', exhibit several features previously thought to be peculiar to the microscopic, quantum realm. In particular, walkers confined to circular corrals manifest a wave-like statistical behaviour reminiscent of that of electrons in quantum corrals. Here we demonstrate that localized topological inhomogeneities in an elliptical corral may lead to resonant projection effects in the walker's statistics similar to those reported in quantum corrals. Specifically, we show that a submerged circular well may drive the walker to excite specific eigenmodes in the bath that result in drastic changes in the particle's statistical behaviour. The well tends to attract the walker, leading to a local peak in the walker's position histogram. By placing the well at one of the foci, a mode with maxima near the foci is preferentially excited, leading to a projection effect in the walker's position histogram towards the empty focus, an effect strongly reminiscent of the quantum mirage. Finally, we demonstrate that the mean pilot-wave field has the same form as the histogram describing the walker's statistics.
Game Related Statistics Which Discriminate Between Winning and Losing Under-16 Male Basketball Games
Lorenzo, Alberto; Gómez, Miguel Ángel; Ortega, Enrique; Ibáñez, Sergio José; Sampaio, Jaime
2010-01-01
The aim of the present study was to identify the game-related statistics which discriminate between winning and losing teams in under-16 years old male basketball games. The sample gathered all 122 games in the 2004 and 2005 Under-16 European Championships. The game-related statistics analysed were the free-throws (both successful and unsuccessful), 2- and 3-point field-goals (both successful and unsuccessful), offensive and defensive rebounds, blocks, assists, fouls, turnovers and steals. The winning teams exhibited lower ball possessions per game and better offensive and defensive efficacy coefficients than the losing teams. Results from the discriminant analysis were statistically significant and allowed several structure coefficients (SC) to be highlighted. In close games (final score differences below 9 points), the discriminant variables were the turnovers (SC = -0.47) and the assists (SC = 0.33). In balanced games (final score differences between 10 and 29 points), the variables that discriminated between the groups were the successful 2-point field-goals (SC = -0.34) and defensive rebounds (SC = -0.36); and in unbalanced games (final score differences above 30 points) the variable that best discriminated both groups was the successful 2-point field-goals (SC = 0.37). These results show that these players' specific characteristics result in a different game-related statistical profile and help to point out the importance of the perceptive and decision-making process in practice and in competition. Key points: The players' game-related statistical profile varied according to game type, game outcome and formative category in basketball. The results of this work help to point out the differences in player performance in U-16 men's basketball teams compared with senior and professional men's basketball teams. The results obtained enhance the importance of the perceptive and decision-making process in practice and in competition. PMID
Non-equilibrium statistical physics with application to disordered systems
Cáceres, Manuel Osvaldo
2017-01-01
This textbook is the result of the enhancement of several courses on non-equilibrium statistics, stochastic processes, stochastic differential equations, anomalous diffusion and disorder. The target audience includes students of physics, mathematics, biology, chemistry, and engineering at undergraduate and graduate level with a grasp of the basic elements of mathematics and physics of the fourth year of a typical undergraduate course. The little-known physical and mathematical concepts are described in sections and specific exercises throughout the text, as well as in appendices. Physical-mathematical motivation is the main driving force for the development of this text. It presents the academic topics of probability theory and stochastic processes as well as new educational aspects in the presentation of non-equilibrium statistical theory and stochastic differential equations. In particular it discusses the problem of irreversibility in that context and Fokker-Planck dynamics. An introduction on fluc...
Statistical physics of black holes as quantum-mechanical systems
Giddings, Steven B.
2013-01-01
Some basic features of black-hole statistical mechanics are investigated, assuming that black holes respect the principles of quantum mechanics. Care is needed in defining an entropy S_bh corresponding to the number of microstates of a black hole, given that the black hole interacts with its surroundings. An open question is then the relationship between this entropy and the Bekenstein-Hawking entropy S_BH. For a wide class of models with interactions needed to ensure unitary quantum evolutio...
Performance Monitoring System: Summary of Lock Statistics. Revision 1.
1985-12-01
[Table: per-lock traffic statistics (tonnage and lockage counts), including Lock and Dam 2 auxiliary ("no data recorded for this lock") and Arkansas River / Forrell Lock upbound statistics; the tabular data is garbled in extraction.]
2015 QuickCompass of Sexual Assault-Related Responders: Statistical Methodology Report
2016-02-01
Reporting categories included age group (18 to 24, 25 to 30, 31 to 34, 35 to 40, and 41 years and older) and gender (male, female). 2015 QuickCompass of Sexual Assault Prevention and Response-Related Responders: Statistical Methodology Report. DMDC Report No. 2015-039, February 2016. Additional copies of this report may be obtained from http://www.dtic.mil/ (ask for report ADA630235).
Statistical representation of sound textures in the impaired auditory system
DEFF Research Database (Denmark)
McWalter, Richard Ian; Dau, Torsten
2015-01-01
Many challenges exist when it comes to understanding and compensating for hearing impairment. Traditional methods, such as pure tone audiometry and speech intelligibility tests, offer insight into the deficiencies of a hearing-impaired listener, but can only partially reveal the mechanisms...... that underlie the hearing loss. An alternative approach is to investigate the statistical representation of sounds for hearing-impaired listeners along the auditory pathway. Using models of the auditory periphery and sound synthesis, we aimed to probe hearing-impaired perception for sound textures – temporally...
Siddiqi, Ariba; Arjunan, Sridhar P; Kumar, Dinesh K
2016-08-01
Age-associated changes in the surface electromyogram (sEMG) of the Tibialis Anterior (TA) muscle can be attributed to neuromuscular alterations that precede strength loss. We have used our sEMG model of the Tibialis Anterior to interpret the age-related changes and compared them with the experimental sEMG. Eighteen young (20-30 years) and 18 older (60-85 years) participants performed isometric dorsiflexion at 6 different percentage levels of maximum voluntary contraction (MVC), and their sEMG from the TA muscle was recorded. Six different age-related changes in the neuromuscular system were simulated using the sEMG model at the same MVCs as in the experiment. The maximal power of the spectrum and the Gaussianity and linearity test statistics were computed from the simulated and experimental sEMG. A correlation analysis at α = 0.05 was performed between the simulated and experimental age-related changes in the sEMG features. The results show that the loss of motor units was distinguished by the Gaussianity and linearity test statistics, while the maximal power of the PSD distinguished between the muscular factors. The simulated condition of 40% loss of motor units with half the number of fast fibers correlated best with the age-related changes observed in the higher-order statistical features of the experimental sEMG. The simulated aging condition found by this study corresponds to the moderate motor unit remodelling and negligible strength loss reported in the literature for cohorts aged 60-70 years.
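One of the higher-order statistics mentioned, a Gaussianity measure, can be sketched with excess kurtosis: a signal built from many superposed independent sources looks Gaussian, while a sparser, heavier-tailed signal does not. The Laplace surrogate below is an illustrative stand-in, not the authors' sEMG model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
dense = rng.normal(0.0, 1.0, 5000)    # many superposed sources: Gaussian-like
sparse = rng.laplace(0.0, 1.0, 5000)  # fewer, larger events: heavy-tailed

k_dense = stats.kurtosis(dense)       # excess kurtosis, near 0 for Gaussian
k_sparse = stats.kurtosis(sparse)     # near 3 for a Laplace signal

_, p_sparse = stats.normaltest(sparse)  # D'Agostino's K^2 Gaussianity test
print(round(k_dense, 2), round(k_sparse, 2), p_sparse < 0.01)
```

A drop in motor unit count makes the sEMG sparser and hence less Gaussian, which is why such statistics can flag motor unit loss before strength declines.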
Discriminatory power of water polo game-related statistics at the 2008 Olympic Games.
Escalante, Yolanda; Saavedra, Jose M; Mansilla, Mirella; Tella, Victor
2011-02-01
The aims of this study were (1) to compare water polo game-related statistics by context (winning and losing teams) and sex (men and women), and (2) to identify characteristics discriminating the performances for each sex. The game-related statistics of the 64 matches (44 men's and 20 women's) played in the final phase of the Olympic Games held in Beijing in 2008 were analysed. Unpaired t-tests compared winners and losers and men and women, and confidence intervals and effect sizes of the differences were calculated. The results were subjected to a discriminant analysis to identify the differentiating game-related statistics of the winning and losing teams. The results showed the differences between winning and losing men's teams to be in both defence and offence, whereas in women's teams they were only in offence. In men's games, passing (assists), aggressive play (exclusions), centre position effectiveness (centre shots), and goalkeeper defence (goalkeeper-blocked 5-m shots) predominated, whereas in women's games the play was more dynamic (possessions). The variable that most discriminated performance in men was goalkeeper-blocked shots, and in women shooting effectiveness (shots). These results should help coaches when planning training and competition.
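The winner-versus-loser comparison described above, an unpaired t-test plus an effect size, can be sketched as follows; the per-match goal totals are invented, and Cohen's d with a pooled standard deviation stands in for the paper's effect-size measure.

```python
import numpy as np
from scipy import stats

# Invented goals-per-match samples for winning and losing teams.
winners = np.array([11, 12, 9, 14, 10, 13, 12, 11])
losers = np.array([7, 8, 6, 9, 8, 7, 9, 6])

# Welch's unpaired t-test (no equal-variance assumption).
t, p = stats.ttest_ind(winners, losers, equal_var=False)

# Cohen's d with a pooled standard deviation as the effect size.
pooled_sd = np.sqrt((winners.std(ddof=1) ** 2 + losers.std(ddof=1) ** 2) / 2)
d = (winners.mean() - losers.mean()) / pooled_sd
print(round(t, 2), round(p, 4), round(d, 2))
```

Reporting the effect size alongside the p-value, as the study does with confidence intervals, distinguishes a practically large difference from one that is merely statistically detectable.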
Research and Development of Statistical Analysis Software System of Maize Seedling Experiment
Hui Cao
2014-01-01
In this study, software engineering methods were used to develop a software system for the statistics and analysis of maize seedling experiments. During development, a B/S-structure software design method was used and a set of statistical indicators for maize seedling evaluation was established. The experimental results indicated that this software system could handle the statistics and analysis for maize seedlings very well. The development of this software system explored a...
Designing a Course in Statistics for a Learning Health Systems Training Program
Samsa, Gregory P.; LeBlanc, Thomas W.; Zaas, Aimee; Howie, Lynn; Abernethy, Amy P.
2014-01-01
The core pedagogic problem considered here is how to effectively teach statistics to physicians who are engaged in a "learning health system" (LHS). This is a special case of a broader issue--namely, how to effectively teach statistics to academic physicians for whom research--and thus statistics--is a requirement for professional…
Röpke, G.
2018-01-01
One of the fundamental problems in physics that are not yet rigorously solved is the statistical mechanics of nonequilibrium processes. An important contribution to describing irreversible behavior starting from reversible Hamiltonian dynamics was given by D. N. Zubarev, who invented the method of the nonequilibrium statistical operator. We discuss this approach, in particular, the extended von Neumann equation, and as an example consider the electrical conductivity of a system of charged particles. We consider the selection of the set of relevant observables. We show the relation between kinetic theory and linear response theory. Using thermodynamic Green's functions, we present a systematic treatment of correlation functions, but the convergence needs investigation. We compare different expressions for the conductivity and list open questions.
Distribution function of excitations in systems with fractional statistics
International Nuclear Information System (INIS)
Protogenov, A.P.
1992-08-01
The distribution function of low-energy excitations in 2+1D systems has been considered. It is shown that in these systems the quantum distribution function differs from the usual one by having a finite value of the entropy of linked braids. (author). 47 refs
Nonequilibrium thermodynamics and fluctuation relations for small systems
International Nuclear Information System (INIS)
Cao Liang; Ke Pu; Qiao Li-Yan; Zheng Zhi-Gang
2014-01-01
In this review, we give a retrospect of the recent progress in nonequilibrium statistical mechanics and thermodynamics in small dynamical systems. For systems with only a small number of particles, fluctuations and nonlinearity become significant and contribute to the nonequilibrium behavior of the systems; hence their statistical properties and thermodynamics should be studied carefully. We review recent developments of this topic, starting from the Gallavotti-Cohen fluctuation theorem, and then moving to the Evans-Searles transient fluctuation theorem, the Jarzynski free-energy equality, and the Crooks fluctuation relation. We also investigate the nonequilibrium free-energy theorem for trajectories involving changes of the heat-bath temperature and propose a generalized free-energy relation. It should be noted that the non-Markovian property of the heat bath may lead to the violation of the free-energy relation. (topical review - statistical physics and complex systems)
Directory of Open Access Journals (Sweden)
Carlos Lago-Peñas
2010-06-01
The aim of the present study was to analyze men's football competitions, trying to identify which game-related statistics discriminate between winning, drawing, and losing teams. The sample corresponded to 380 games from the 2008-2009 season of the Spanish Men's Professional League. The game-related statistics gathered were: total shots, shots on goal, effectiveness, assists, crosses, offsides committed and received, corners, ball possession, crosses against, fouls committed and received, corners against, yellow and red cards, and venue. A univariate (t-test) and multivariate (discriminant) analysis of the data was performed. The results showed that winning teams had significantly higher averages for the following game statistics: total shots (p < 0.001), shots on goal (p < 0.01), effectiveness (p < 0.01), assists (p < 0.01), offsides committed (p < 0.01), and crosses against (p < 0.01). Losing teams had significantly higher averages for crosses (p < 0.01), offsides received (p < 0.01), and red cards (p < 0.01). The discriminant analysis led to the following conclusion: the variables that discriminate between winning, drawing, and losing teams were total shots, shots on goal, crosses, crosses against, ball possession, and venue. Coaches and players should be aware of these different profiles in order to increase knowledge about the cognitive and motor demands of the game and, therefore, to evaluate specificity in practice and game planning
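The univariate step described in this abstract (comparing winners' and losers' averages for a statistic with a t-test) can be sketched in Python. This is an illustration using a Welch t statistic, a common variant; the shot counts below are hypothetical, not the study's data:

```python
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical total-shots counts for six winning and six losing teams.
winners = [14, 17, 12, 19, 15, 16]
losers = [9, 11, 8, 12, 10, 9]
t = welch_t(winners, losers)  # a large positive t favors the winners
```

In practice the statistic would be referred to a t distribution for a p-value, and the multivariate step would apply a discriminant analysis over all variables jointly.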
International Nuclear Information System (INIS)
Land, C.E.; Pierce, D.A.
1983-01-01
Statistical theory and methodology provide the logical structure for scientific inference about the cancer risk associated with exposure to ionizing radiation. Although much is known about radiation carcinogenesis, the risk associated with low-level exposures is difficult to assess because it is too small to measure directly. Estimation must therefore depend upon mathematical models which relate observed risks at high exposure levels to risks at lower exposure levels. Extrapolated risk estimates obtained using such models are heavily dependent upon assumptions about the shape of the dose-response relationship, the temporal distribution of risk following exposure, and variation of risk according to variables such as age at exposure, sex, and underlying population cancer rates. Expanded statistical models, which make explicit certain assumed relationships between different data sets, can be used to strengthen inferences by incorporating relevant information from diverse sources. They also allow the uncertainties inherent in information from related data sets to be expressed in estimates which partially depend upon that information. To the extent that informed opinion is based upon a valid assessment of scientific data, the larger context of decision theory, which includes statistical theory, provides a logical framework for the incorporation into public policy decisions of the informational content of expert opinion
The relation between statistical power and inference in fMRI.
Directory of Open Access Journals (Sweden)
Henk R Cremers
Statistically underpowered studies can result in experimental failure even when all other experimental considerations have been addressed impeccably. In fMRI, the combination of a large number of dependent variables, a relatively small number of observations (subjects), and a need to correct for multiple comparisons can decrease statistical power dramatically. This problem has been clearly addressed yet remains controversial, especially with regard to the expected effect sizes in fMRI, and especially for between-subjects effects such as group comparisons and brain-behavior correlations. We aimed to clarify the power problem by considering and contrasting two simulated scenarios of such possible brain-behavior correlations: weak diffuse effects and strong localized effects. Sampling from these scenarios shows that, particularly in the weak diffuse scenario, common sample sizes (n = 20-30) display extremely low statistical power, poorly represent the actual effects in the full sample, and show large variation on subsequent replications. Empirical data from the Human Connectome Project resemble the weak diffuse scenario much more than the strong localized scenario, which underscores the extent of the power problem for many studies. Possible solutions to the power problem include increasing the sample size, using less stringent thresholds, or focusing on a region of interest. However, these approaches are not always feasible, and some have major drawbacks. The most prominent solutions that may help address the power problem include model-based (multivariate) prediction methods and meta-analyses with related synthesis-oriented approaches.
A statistical modeling approach to build expert credit risk rating systems
DEFF Research Database (Denmark)
Waagepetersen, Rasmus
2010-01-01
This paper presents an efficient method for extracting expert knowledge when building a credit risk rating system. Experts are asked to rate a sample of counterparty cases according to creditworthiness. Next, a statistical model is used to capture the relation between the characteristics...... of a counterparty and the expert rating. For any counterparty the model can identify the rating, which would be agreed upon by the majority of experts. Furthermore, the model can quantify the concurrence among experts. The approach is illustrated by a case study regarding the construction of an application score...
International Nuclear Information System (INIS)
Sewell, G.L.
1986-01-01
The author shows how the basic axioms of quantum field theory, general relativity and statistical thermodynamics lead, in a model-independent way, to a generalized Hawking-Unruh effect, whereby the gravitational fields carried by a class of space-time manifolds with event horizons thermalize ambient quantum fields. The author is concerned with a quantum field on a space-time X containing a submanifold X' bounded by event horizons. The objective is to show that, for a wide class of space-times, the global vacuum state of the field reduces, in X', to a thermal state whose temperature depends on the geometry. The statistical thermodynamical, geometrical, and quantum field theoretical ingredients essential for the reduction of the vacuum state are discussed
Synchronised laser chaos communication: statistical investigation of an experimental system
Lawrance, Anthony J.; Papamarkou, Theodore; Uchida, Atsushi
2017-01-01
The paper is concerned with analyzing data from an experimental antipodal laser-based chaos shift-keying communication system. Binary messages are embedded in a chaotically behaving laser wave which is transmitted through a fiber-optic cable and are decoded at the receiver using a second laser synchronized with the emitter laser. Instrumentation in the experimental system makes it particularly interesting to be able to empirically analyze both optical noise and synchronization error as well a...
Phenomenological approach to the statistics and dynamics of model systems
International Nuclear Information System (INIS)
Choi, M.Y.
1985-01-01
This thesis investigates the equilibrium and nonequilibrium properties of some model systems, and consists of two parts. Part 1 deals with phase transitions in frustrated xy models, which can serve as a model for coupled Josephson junction arrays. The Hubbard-Stratonovich transform is developed to construct the Landau-Ginzburg-Wilson Hamiltonians for uniformly frustrated xy models both on a square lattice and on a triangular lattice, which reflect the formation of various superlattices according to the frustration f. Near the critical point, the system with f equal to 1/4 on a triangular lattice is shown to belong to the same universality class as the fully frustrated system on a square lattice. By decomposing the two-mode systems into two coupled xy models and applying the Migdal-Kadanoff approximation, the possibilities of Ising-like or three-state Potts-like transitions are shown in addition to the Kosterlitz-Thouless-like ones. Part 2 considers the time evolution of model systems with retarded interactions. For such systems, a master equation with non-Markovian character is derived. It is shown that in higher dimensions, the interplay between interaction strength and delay can lead to complicated behavior
History by history statistical estimators in the BEAM code system
International Nuclear Information System (INIS)
Walters, B.R.B.; Kawrakow, I.; Rogers, D.W.O.
2002-01-01
A history by history method for estimating uncertainties has been implemented in the BEAMnrc and DOSXYZnrc codes replacing the method of statistical batches. This method groups scored quantities (e.g., dose) by primary history. When phase-space sources are used, this method groups incident particles according to the primary histories that generated them. This necessitated adding markers (negative energy) to phase-space files to indicate the first particle generated by a new primary history. The new method greatly reduces the uncertainty in the uncertainty estimate. The new method eliminates one dimension (which kept the results for each batch) from all scoring arrays, resulting in memory requirement being decreased by a factor of 2. Correlations between particles in phase-space sources are taken into account. The only correlations with any significant impact on uncertainty are those introduced by particle recycling. Failure to account for these correlations can result in a significant underestimate of the uncertainty. The previous method of accounting for correlations due to recycling by placing all recycled particles in the same batch did work. Neither the new method nor the batch method take into account correlations between incident particles when a phase-space source is restarted so one must avoid restarts
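The grouping idea above can be sketched as follows: contributions are summed per primary history before the variance is taken, so that correlated descendants of one primary history are not treated as independent samples. This is an illustrative Python reimplementation, not the BEAMnrc/DOSXYZnrc code itself, and the scores are invented:

```python
import math
from collections import defaultdict

def history_by_history_uncertainty(scores):
    """scores: list of (primary_history_id, contribution) pairs.
    Sums contributions per primary history, then estimates the
    standard error of the mean from the per-history totals."""
    per_history = defaultdict(float)
    for hid, value in scores:
        per_history[hid] += value  # all descendants share one history total
    x = list(per_history.values())
    n = len(x)
    mean = sum(x) / n
    mean_sq = sum(v * v for v in x) / n
    return mean, math.sqrt((mean_sq - mean ** 2) / (n - 1))

# Toy example: 4 primary histories, two of them scoring via several particles.
scores = [(1, 0.5), (1, 0.2), (2, 0.4), (3, 0.0), (4, 0.6), (4, 0.1)]
mean, err = history_by_history_uncertainty(scores)
```

Treating all six contributions as independent would understate the uncertainty whenever several contributions descend from the same primary history, which is exactly the correlation (e.g. from particle recycling) the abstract warns about.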
Statistical evaluation of major human errors during the development of new technological systems
International Nuclear Information System (INIS)
Campbell, G; Ott, K.O.
1979-01-01
Statistical procedures are presented to evaluate major human errors during the development of a new system, errors that have led or can lead to accidents or major failures. The first procedure aims at estimating the average residual occurrence rate for accidents or major failures after several have occurred. The procedure is based solely on the historical record. Certain idealizations are introduced that allow the application of a sound statistical evaluation procedure. These idealizations are realized in practice to a sufficient degree that the proposed estimation procedure yields meaningful results, even for situations with a sparse data base represented by very few accidents. Under the assumption that the possible human-error-related failure times have exponential distributions, the statistical technique of isotonic regression is proposed to estimate the failure rates due to human design error at the failure times of the system. The last value in the sequence of estimates gives the residual accident chance. In addition, the actual situation is tested against the hypothesis that the failure rate of the system remains constant over time. This test determines the chance that a decreasing failure rate is incidental, rather than an indication of an actual learning process. Both techniques can be applied not merely to a single system but to an entire series of similar systems that a technology would generate, enabling the assessment of technological improvement. For the purpose of illustration, the nuclear decay of isotopes was chosen as an example, since the assumptions of the model are rigorously satisfied in this case. This application shows satisfactory agreement of the estimated and actual failure rates (which are exactly known in this example), although the estimation was deliberately based on a sparse historical record
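The isotonic-regression step can be illustrated with the pool-adjacent-violators algorithm (PAVA), the standard algorithm for this kind of fit, here constrained to be non-increasing so that the last fitted value plays the role of the residual failure rate. The rate estimates below are invented for illustration, and this is a sketch of the general technique rather than the paper's exact procedure:

```python
def pava_nonincreasing(y, w=None):
    """Pool-adjacent-violators algorithm for a non-increasing
    least-squares fit (isotonic regression, decreasing constraint)."""
    if w is None:
        w = [1.0] * len(y)
    blocks = []  # each block: [fitted value, total weight, count]
    for yi, wi in zip(y, w):
        blocks.append([yi, wi, 1])
        # merge adjacent blocks while the non-increasing constraint is violated
        while len(blocks) > 1 and blocks[-2][0] < blocks[-1][0]:
            v2, w2, c2 = blocks.pop()
            v1, w1, c1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(v1 * w1 + v2 * w2) / wt, wt, c1 + c2])
    out = []
    for v, _, c in blocks:
        out.extend([v] * c)
    return out

# Naive rate estimates at successive failure times (hypothetical numbers);
# PAVA pools violators into a non-increasing sequence whose final value
# estimates the residual failure rate.
rates = [0.50, 0.30, 0.35, 0.20, 0.25]
fit = pava_nonincreasing(rates)
```

The monotone-decreasing constraint encodes the learning-process hypothesis: each observed failure should reduce, never increase, the estimated rate of the next one.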
Statistical Physics of Economic Systems: a Survey for Open Economies
Tao, Yong; Chen, Xun
2012-05-01
We extend the theoretical framework of an independent economy developed by Tao [Phys. Rev. E 82 (2010) 036118] so as to include multiple economies. Since the starting point of our framework is the theory of competitive markets from traditional economics, this framework should be suitable for any free market. Our study shows that integration of world economies can decrease trade friction among economic systems, but may also cause a global economic crisis whenever economic disequilibrium occurs in any one of these economic systems.
Quantifying fluctuations in economic systems by adapting methods of statistical physics
Stanley, H. E.; Gopikrishnan, P.; Plerou, V.; Amaral, L. A. N.
2000-12-01
The emerging subfield of econophysics explores the degree to which certain concepts and methods from statistical physics can be appropriately modified and adapted to provide new insights into questions that have been the focus of interest in the economics community. Here we give a brief overview of two examples of research topics that are receiving recent attention. A first topic is the characterization of the dynamics of stock price fluctuations. For example, we investigate the relation between trading activity, measured by the number of transactions N_Δt, and the price change G_Δt for a given stock over a time interval [t, t + Δt]. We relate the time-dependent standard deviation of price fluctuations (volatility) to two microscopic quantities: the number of transactions N_Δt in Δt and the variance W²_Δt of the price changes for all transactions in Δt. Our work indicates that while the pronounced tails in the distribution of price fluctuations arise from W_Δt, the long-range correlations found in |G_Δt| are largely due to N_Δt. We also investigate the relation between price fluctuations and the number of shares Q_Δt traded in Δt. We find that the distribution of Q_Δt is consistent with a stable Lévy distribution, suggesting a Lévy scaling relationship between Q_Δt and N_Δt, which would provide one explanation for volume-volatility co-movement. A second topic concerns cross-correlations between the price fluctuations of different stocks. We adapt a conceptual framework, random matrix theory (RMT), first used in physics to interpret statistical properties of nuclear energy spectra. RMT makes predictions for the statistical properties of matrices that are universal, that is, do not depend on the interactions between the elements comprising the system. In physical systems, deviations from the predictions of RMT provide clues regarding the mechanisms controlling the dynamics of a given system, so this framework can be of potential value if
van der Maas, H.L.J.; Newell, K.; Molenaar, P.C.M.
1998-01-01
Cognitive developmental psychology is faced with new developments in the mathematical theory of nonlinear dynamic systems and in psychometrics. This chapter addresses: the relation between the strategy concept in cognitive developmental psychology and the concept of attractor in nonlinear dynamic
Directory of Open Access Journals (Sweden)
Luciane Bastistella
2018-02-01
New experimental techniques, as well as modern variants on known methods, have recently been employed to investigate the fundamental reactions underlying the oxidation of biochar. The purpose of this paper was to experimentally and statistically study how the relative humidity of air, mass, and particle size of four biochars influenced the adsorption of water and the increase in temperature. A random factorial design was employed using the statistical software Xlstat. A simple linear regression model and an analysis of variance with pairwise comparison were performed. The experimental study was carried out on the wood of Quercus pubescens, Cyclobalanopsis glauca, Trigonostemon huangmosun, and Bambusa vulgaris, and involved five relative humidity conditions (22, 43, 75, 84, and 90%), two sample masses (0.1 and 1 g), and two particle sizes (powder and piece). Two response variables, water adsorption and temperature increase, were analyzed and discussed. The temperature did not increase linearly with the adsorption of water. Temperature was modeled by nine explanatory variables, while water adsorption was modeled by eight. Five variables, including factors and their interactions, were found to be common to the two models. Sample mass and relative humidity influenced both response variables, while particle size and biochar type only influenced the temperature.
Using Relative Statistics and Approximate Disease Prevalence to Compare Screening Tests.
Samuelson, Frank; Abbey, Craig
2016-11-01
Schatzkin et al. and other authors demonstrated that the ratios of some conditional statistics such as the true positive fraction are equal to the ratios of unconditional statistics, such as disease detection rates, and therefore we can calculate these ratios between two screening tests on the same population even if negative test patients are not followed with a reference procedure and the true and false negative rates are unknown. We demonstrate that this same property applies to an expected utility metric. We also demonstrate that while simple estimates of relative specificities and relative areas under ROC curves (AUC) do depend on the unknown negative rates, we can write these ratios in terms of disease prevalence, and the dependence of these ratios on a posited prevalence is often weak particularly if that prevalence is small or the performance of the two screening tests is similar. Therefore we can estimate relative specificity or AUC with little loss of accuracy, if we use an approximate value of disease prevalence.
Inferential, non-parametric statistics to assess the quality of probabilistic forecast systems
Maia, A.H.N.; Meinke, H.B.; Lennox, S.; Stone, R.C.
2007-01-01
Many statistical forecast systems are available to interested users. To be useful for decision making, these systems must be based on evidence of underlying mechanisms. Once causal connections between the mechanism and its statistical manifestation have been firmly established, the forecasts must
Kleijnen, J.P.C.
1995-01-01
This tutorial discusses what-if analysis and optimization of System Dynamics models. These problems are solved, using the statistical techniques of regression analysis and design of experiments (DOE). These issues are illustrated by applying the statistical techniques to a System Dynamics model for
Technical issues relating to the statistical parametric mapping of brain SPECT studies
International Nuclear Information System (INIS)
Hatton, R.L.; Cordato, N.; Hutton, B.F.; Lau, Y.H.; Evans, S.G.
2000-01-01
Full text: Statistical Parametric Mapping (SPM) is a software tool designed for the statistical analysis of functional neuroimages, specifically Positron Emission Tomography and functional Magnetic Resonance Imaging, and more recently SPECT. This review examines some problems associated with the analysis of SPECT. A comparison of a patient group with normal studies revealed factors that could influence results, some that commonly occur and others that require further exploration. To optimise the differences between two groups of subjects, both spatial variability and differences in global activity must be minimised. The choice and effectiveness of the coregistration method and the approach to normalisation of activity concentration can affect the optimisation. A small number of subject scans were identified as possessing truncated data, resulting in edge effects that could adversely influence the analysis. Other problems included unusual areas of significance possibly related to reconstruction methods and the geometry associated with nonparallel collimators. Areas of extracerebral significance are a point of concern and may result from scatter effects or misregistration. Difficulties in patient positioning, due to postural limitations, can lead to resolution differences. SPM has been used to assess areas of statistical significance arising from these technical factors, as opposed to areas of true clinical significance, when comparing subject groups. This contributes to a better understanding of the effects of technical factors so that these may be eliminated, minimised, or incorporated in the study design. Copyright (2000) The Australian and New Zealand Society of Nuclear Medicine Inc
International Nuclear Information System (INIS)
Vincent, C.H.
1982-01-01
Bayes' principle is applied to the differential counting measurement of a positive quantity in which the statistical errors are not necessarily small in relation to the true value of the quantity. The methods of estimation derived are found to give consistent results and to avoid the anomalous negative estimates sometimes obtained by conventional methods. One of the methods given provides a simple means of deriving the required estimates from conventionally presented results and appears to have wide potential applications. Both methods provide the actual posterior probability distribution of the quantity to be measured. A particularly important potential application is the correction of counts on low radioacitvity samples for background. (orig.)
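A numerical sketch of the Bayesian idea (not the paper's exact derivation): with a flat prior on a non-negative source contribution s and a known mean background b, the posterior is proportional to the Poisson likelihood of the observed gross count, and its mean is always non-negative, even when the conventional estimate n - b is negative. All numbers below are illustrative:

```python
import math

def posterior_mean_source(n_obs, b, s_max=50.0, steps=20000):
    """Posterior mean of a non-negative source count s, given an observed
    gross count n_obs with known background mean b.
    Flat prior on s >= 0; likelihood Poisson(n_obs; s + b).
    Midpoint-rule numerical integration on a grid."""
    ds = s_max / steps
    num = den = 0.0
    for i in range(steps):
        s = (i + 0.5) * ds
        lam = s + b
        log_like = n_obs * math.log(lam) - lam - math.lgamma(n_obs + 1)
        w = math.exp(log_like)
        num += s * w
        den += w
    return num / den

# Conventional estimate n - b = 2 - 5 = -3 is anomalously negative;
# the Bayes estimate stays positive.
est = posterior_mean_source(n_obs=2, b=5.0)
```

This reproduces the behaviour the abstract highlights: conventional background subtraction can return impossible negative values when the statistical errors are not small relative to the true quantity, while the posterior mean cannot.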
Statistical evidence about human influence on the climate system
Pierre Perron; Francisco Estrada; Benjamín Martínez-López
2012-01-01
We use recent methods for the analysis of time series data, in particular related to breaks in trends, to establish that human factors are the main contributors to the secular movements in observed global and hemispheric temperatures series. The most important feature documented is a marked increase in the growth rates of temperatures (purged from the Atlantic Multidecadal Oscillation) and anthropogenic greenhouse gases occurring for all series around 1955, which marks the start of sustained ...
Relational time in anyonic systems
Nikolova, A.; Brennen, G. K.; Osborne, T. J.; Milburn, G. J.; Stace, T. M.
2018-03-01
In a seminal paper [Phys. Rev. D 27, 2885 (1983), 10.1103/PhysRevD.27.2885], Page and Wootters suggest that time evolution could be described solely in terms of correlations between systems and clocks, as a means of dealing with the "problem of time" stemming from vanishing Hamiltonian dynamics in many theories of quantum gravity. Their approach seeks to identify relational dynamics given a Hamiltonian constraint on the physical states. Here we present a "state-centric" reformulation of the Page and Wootters model better suited to cases where the Hamiltonian constraint is satisfied, such as anyons emerging in Chern-Simons theories. We describe relational time by encoding logical "clock" qubits into topologically protected anyonic degrees of freedom. The minimum temporal increment of such anyonic clocks is determined by the universality of the anyonic braid group, with nonuniversal models naturally exhibiting discrete time. We exemplify this approach by using SU (2) 2 anyons and discuss generalizations to other states and models.
Low-cost data acquisition systems for photovoltaic system monitoring and usage statistics
Fanourakis, S.; Wang, K.; McCarthy, P.; Jiao, L.
2017-11-01
This paper presents the design of a low-cost data acquisition system for monitoring a photovoltaic system’s electrical quantities, battery temperatures, and state of charge of the battery. The electrical quantities are the voltages and currents of the solar panels, the battery, and the system loads. The system uses an Atmega328p microcontroller to acquire data from the photovoltaic system’s charge controller. It also records individual load information using current sensing resistors along with a voltage amplification circuit and an analog to digital converter. The system is used in conjunction with a wall power data acquisition system for the recording of regional power outages. Both data acquisition systems record data in micro SD cards. The data has been successfully acquired from both systems and has been used to monitor the status of the PV system and the local power grid. As more data is gathered it can be used for the maintenance and improvement of the photovoltaic system through analysis of the photovoltaic system’s parameters and usage statistics.
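The current-sensing chain described above (sense resistor, amplification circuit, ADC) reduces to a short conversion. The component values below are plausible assumptions for a 10-bit Atmega328p setup, not the paper's actual design:

```python
# Converting a raw ADC reading to a load current through a sense resistor.
V_REF = 5.0      # ADC reference voltage (V), assumed
ADC_MAX = 1023   # 10-bit full scale on the Atmega328p
R_SENSE = 0.1    # shunt resistance (ohm), assumed
GAIN = 20.0      # amplifier gain ahead of the ADC, assumed

def adc_to_current(raw):
    v_adc = raw * V_REF / ADC_MAX    # voltage seen at the ADC pin
    return v_adc / (GAIN * R_SENSE)  # Ohm's law through the shunt

current = adc_to_current(512)  # amperes for a mid-scale reading
```

The same arithmetic, with per-load calibration constants, is what turns logged ADC samples into the usage statistics the paper analyzes.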
DETERMINANT EROSION FACTORS FOR PENSION ROMANIAN SYSTEM - A STATISTICAL APPROACH
Directory of Open Access Journals (Sweden)
Ana-Gabriela BABUCEA
2010-09-01
The demographic evolution of the last 20 years and the changes in the Romanian economy and society have particularly affected one category: retired people. As inactive population, retired people represent an important category, their number being rather large in comparison with the employed population, who are in practice the contributors to the pension fund. In 2010, the state had serious problems paying pensions. The evolution of this category of people is the subject of this paper. We try to identify the factors that have had a negative influence on the Romanian pension system. The reference years considered are 1990-2009.
Statistical mechanical analysis of (1 + ∞) dimensional disordered systems
International Nuclear Information System (INIS)
Skantzos, Nikolaos Stavrou
2001-01-01
Valuable insight into the theory of disordered systems and spin-glasses has been offered by two classes of exactly solvable models: one-dimensional models and mean-field (infinite-range) ones, which each carry their own specific techniques and restrictions. Both classes of models are now considered 'exactly solvable' in the sense that in the thermodynamic limit the partition sum can be carried out analytically and the average over the disorder can be performed using methods which are well understood. In this thesis I study equilibrium properties of spin systems with a combination of one-dimensional short- and infinite-range interactions. I find that such systems, under either synchronous or asynchronous spin dynamics, and even in the absence of disorder, lead to phase diagrams with first-order transitions and regions with a multiple number of locally stable states. I then proceed to the study of recurrent neural network models with (1+∞)-dimensional interactions, and find that the competing short- and long-range forces lead to highly complex phase diagrams and that, unlike infinite-range (Hopfield-type) models, these phase diagrams depend crucially on the number of patterns stored, even away from saturation. To solve the statics of such models for the case of synchronous dynamics I first make a detour to solve the synchronous counterpart of the one-dimensional random-field Ising model, where I prove rigorously that the physics of the two random-field models (synchronous vs. sequential) becomes asymptotically the same, leading to an extensive ground state entropy and an infinite hierarchy of discontinuous transitions close to zero temperature. Finally, I propose and solve the statics of a spin model for the prediction of secondary structure in random hetero-polymers (which are considered as the natural first step to the study of real proteins). The model lies in the class of (1+∞)-dimensional disordered systems as a consequence of having steric- and hydrogen
Directory of Open Access Journals (Sweden)
Günter Moser
2009-04-01
The quality of statistical data covering the economic and social development of the People's Republic of China has been questioned by international and national data users for years. The reasons for this doubt lie mainly in the structure of the Chinese system of statistics. Two parallel systems exist which operate largely autonomously: the national system of statistics and the sectoral system of statistics. Within the national statistical system, the National Bureau of Statistics (NBS) has the authority to order and collect statistics; in the sectoral system, this competence lies with the ministries and authorities below the ministerial level. This article describes and analyses these structures, the resulting problems, and the reform measures taken to date. It also aims to provide a better understanding of the statistical data about the People's Republic of China and to enable an assessment of them within a changing structural context. In conclusion, approaches to further reforms are provided, based on the author's long-standing experience in cooperation projects with the official Chinese statistics agencies.
Statistical Analysis of the Grid Connected Photovoltaic System Performance Ratio
Directory of Open Access Journals (Sweden)
Javier Vilariño-García
2017-05-01
This paper presents a methodology based on analysis of variance and Tukey's method, applied to a data set of solar radiation in the plane of the photovoltaic modules and the corresponding values of power delivered to the grid, recorded at 10-minute intervals from sunrise to sunset during the 52 weeks of 2013. These data were obtained through a monitoring system located at a photovoltaic plant of 10 MW rated power in Cordoba, consisting of 16 transformers and 98 inverters. An analysis of variance is used to compare the mean performance indices of the transformer centers and to detect, at a 5% significance level, whether at least one differs significantly from the rest; Tukey's test then identifies which center or centers fall below average due to a fault, so that the fault can be detected and corrected.
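The first step of such a methodology, a one-way analysis of variance on performance-index values grouped by center, can be sketched in pure Python. The weekly values below are invented; a pairwise Tukey test would follow only when F is significant:

```python
def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA across k groups.
    A large F suggests at least one group mean differs from the rest."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical weekly performance ratios for three centers.
centers = [
    [0.80, 0.82, 0.81, 0.79],
    [0.81, 0.83, 0.82, 0.80],
    [0.70, 0.72, 0.71, 0.69],  # the under-performing center stands out
]
F = one_way_anova_F(centers)
```

The F statistic would be compared against an F(k-1, n-k) critical value at the 5% level; Tukey's honestly-significant-difference test then isolates which pairs of centers differ.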
Coordination of the National Statistical System in the Information Security Context
Directory of Open Access Journals (Sweden)
O. H.
2017-12-01
The need for building the national statistical system (NSS) as the framework for coordination of statistical works is substantiated. The NSS is defined on the basis of a system approach. It is emphasized that the essential conditions underlying the NSS are strategic planning, reliance on internationally adopted methods, and due consideration of the country-specific environment. The role of state coordination policy in organizing statistical activities within the NSS framework is highlighted, and key objectives of an integrated national policy on coordination of statistical activities are given. Threats arising from the absence of an NSS in a country are shown: an "irregular" pattern of statistical activities, resulting from the absence of common legal, methodological, and organizational grounds; high costs of the finished information product in parallel with its low quality; and the impossibility of administering statistical information security in a coherent manner, i.e. complying with the rules on confidentiality of data, preventing intentional distortion of information, and complying with the rules for treatment of data constituting a state secret. An extensive review of NSS functional objectives is made: to ensure the systematic development of official statistics; to ensure confidentiality and protection of individual data; to establish interdepartmental mechanisms for control and protection of secret statistical information; and to broaden and regulate access to statistical data and their effective use. The need for creating a National Statistical Commission is grounded.
Visual statistical learning is related to natural language ability in adults: An ERP study.
Daltrozzo, Jerome; Emerson, Samantha N; Deocampo, Joanne; Singh, Sonia; Freggens, Marjorie; Branum-Martin, Lee; Conway, Christopher M
2017-03-01
Statistical learning (SL) is believed to enable language acquisition by allowing individuals to learn regularities within linguistic input. However, neural evidence supporting a direct relationship between SL and language ability is scarce. We investigated whether there are associations between event-related potential (ERP) correlates of SL and language abilities while controlling for the general level of selective attention. Seventeen adults completed tests of visual SL, receptive vocabulary, grammatical ability, and sentence completion. Response times and ERPs showed that SL is related to receptive vocabulary and grammatical ability. ERPs indicated that the relationship between SL and grammatical ability was independent of attention while the association between SL and receptive vocabulary depended on attention. The implications of these dissociative relationships in terms of underlying mechanisms of SL and language are discussed. These results further elucidate the cognitive nature of the links between SL mechanisms and language abilities. Copyright © 2017 Elsevier Inc. All rights reserved.
Becchi, Carlo Maria
2016-01-01
This is the third edition of a well-received textbook on modern physics theory. This book provides an elementary but rigorous and self-contained presentation of the simplest theoretical framework that will meet the needs of undergraduate students. In addition, a number of examples of relevant applications and an appropriate list of solved problems are provided.Apart from a substantial extension of the proposed problems, the new edition provides more detailed discussion on Lorentz transformations and their group properties, a deeper treatment of quantum mechanics in a central potential, and a closer comparison of statistical mechanics in classical and in quantum physics. The first part of the book is devoted to special relativity, with a particular focus on space-time relativity and relativistic kinematics. The second part deals with Schrödinger's formulation of quantum mechanics. The presentation concerns mainly one-dimensional problems, but some three-dimensional examples are discussed in detail. The third...
Tenenbaum, Joel
This thesis applies statistical physics concepts and methods to quantitatively analyze complex systems. It is separated into four parts: (i) characteristics of earthquake systems, (ii) memory and volatility in data time series, (iii) the application of part (ii) to world financial markets, and (iv) statistical observations on the evolution of word usage. In Part I, we observe statistical patterns in the occurrence of earthquakes. We select a 14-year earthquake catalog covering the archipelago of Japan. We find that regions traditionally thought of as being too distant from one another for causal contact display remarkably high correlations, and the networks that result have a tendency to link highly connected areas with other highly connected areas. In Part II, we introduce and apply the concept of "volatility asymmetry", the primary use of which is in financial data. We explain the relation between memory and "volatility asymmetry" in terms of an asymmetry parameter lambda, define a litmus test for determining whether lambda is statistically significant, and propose a stochastic model based on this parameter, using the model to further explain empirical data. In Part III, we expand on volatility asymmetry. Importing the concepts of time dependence and universality from physics, we explore aspects of the emerging (or "transition") economies of Eastern Europe as they relate to asymmetry. We find that these emerging markets in some instances behave like developed markets and in other instances do not, and that the distinction is a matter both of country and of time period, with crisis periods showing different asymmetry characteristics than "healthy" periods. In Part IV, we take note of a series of findings in econophysics showing statistical growth similarities between a variety of different areas that all have in common the fact of taking place in settings that are both (i) competing and (ii) dynamic. We show that this same growth distribution can be
2010-07-01
... relating to a hurricane, earthquake, or other natural occurrence? 250.192 Section 250.192 Mineral Resources... statistics must I submit relating to a hurricane, earthquake, or other natural occurrence? (a) You must... tropical storm, or an earthquake. Statistics include facilities and rigs evacuated and the amount of...
Gröbner bases statistics and software systems
2013-01-01
The idea of the Gröbner basis first appeared in a 1927 paper by F. S. Macaulay, who succeeded in creating a combinatorial characterization of the Hilbert functions of homogeneous ideals of the polynomial ring. Later, the modern definition of the Gröbner basis was independently introduced by Heisuke Hironaka in 1964 and Bruno Buchberger in 1965. However, after the discovery of the notion of the Gröbner basis by Hironaka and Buchberger, it was not actively pursued for 20 years. A breakthrough was made in the mid-1980s by David Bayer and Michael Stillman, who created the Macaulay computer algebra system with the help of the Gröbner basis. Since then, rapid development on the Gröbner basis has been achieved by many researchers, including Bernd Sturmfels. This book serves as a standard bible of the Gröbner basis, for which the harmony of theory, application, and computation are indispensable. It provides all the fundamentals for graduate students to learn the ABC’s of the Gröbner basis, requiring no speci...
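As a minimal illustration of the concept (using SymPy rather than the Macaulay system mentioned above; the ideal is a toy example chosen for brevity), a Gröbner basis in lexicographic order can be computed as follows:

```python
# Toy Groebner basis computation with SymPy. In lex order with x > y, the
# reduced basis of the ideal <x**2 + y, x*y - 1> contains a generator free
# of x, illustrating the elimination property the theory provides.
from sympy import groebner, symbols

x, y = symbols('x y')
G = groebner([x**2 + y, x*y - 1], x, y, order='lex')
print(list(G))   # basis generators: x + y**2 and y**3 + 1
```

The second generator involves only y, so the roots in y can be found first and back-substituted, which is the elimination-theoretic use of lex Gröbner bases.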
PREFACE: Advanced many-body and statistical methods in mesoscopic systems
Anghel, Dragos Victor; Sabin Delion, Doru; Sorin Paraoanu, Gheorghe
2012-02-01
It has increasingly been realized in recent times that the borders separating various subfields of physics are largely artificial. This is the case for nanoscale physics, physics of lower-dimensional systems and nuclear physics, where the advanced techniques of many-body theory developed in recent times could provide a unifying framework for these disciplines under the general name of mesoscopic physics. Other fields, such as quantum optics and quantum information, are increasingly using related methods. The 6-day conference 'Advanced many-body and statistical methods in mesoscopic systems' that took place in Constanta, Romania, between 27 June and 2 July 2011 was, we believe, a successful attempt at bridging an impressive list of topical research areas: foundations of quantum physics, equilibrium and non-equilibrium quantum statistics/fractional statistics, quantum transport, phases and phase transitions in mesoscopic systems/superfluidity and superconductivity, quantum electromechanical systems, quantum dissipation, dephasing, noise and decoherence, quantum information, spin systems and their dynamics, fundamental symmetries in mesoscopic systems, phase transitions, exactly solvable methods for mesoscopic systems, various extension of the random phase approximation, open quantum systems, clustering, decay and fission modes and systematic versus random behaviour of nuclear spectra. This event brought together participants from seventeen countries and five continents. Each of the participants brought considerable expertise in his/her field of research and, at the same time, was exposed to the newest results and methods coming from the other, seemingly remote, disciplines. The talks touched on subjects that are at the forefront of topical research areas and we hope that the resulting cross-fertilization of ideas will lead to new, interesting results from which everybody will benefit. We are grateful for the financial and organizational support from IFIN-HH, Ovidius
Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems
He, Yuning; Davies, Misty Dawn
2014-01-01
The analysis of a safety-critical system often requires detailed knowledge of safe regions and their high-dimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only a few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.
The derivation and application of a risk related value for saving a statistical life
International Nuclear Information System (INIS)
Jackson, D.; Stone, D.; Butler, G.G.; Mcglynn, G.
2004-01-01
A risk-related value of spend for saving a statistical life (VSSSL) is proposed for cost-benefit studies across the power generation sector, and the nuclear industry in particular. An upper bound on the VSSSL is set based on the UK government standard of around £1M or, in particular circumstances, £2M, and the observation that excessive spend (probably of the order of more than £5M per statistical life) will actually cost lives. Above a risk of 10^-3 a^-1 it is assumed that the VSSSL approaches a maximum sustainable value of around £2M, whereas below a risk of 10^-9 a^-1 the value of further risk reduction approaches zero. At risks around 10^-6 a^-1 it is proposed that an appropriate VSSSL lies in the range £0.25M to £1M. With respect to radiological protection, it is suggested that where collective doses are dominated by average individual doses of no more than a few μSv, the detriment arising from a man-Sv can be valued at about £15k to £60k. It is further suggested that for individual dose contributions below 0.01 μSv (representing a risk equivalent of less than 10^-9) a low residual VSSSL should be applied in cost-benefit analyses based on collective dose exposures. (author)
Obozov, A. A.; Serpik, I. N.; Mihalchenko, G. S.; Fedyaeva, G. A.
2017-01-01
In the article, the problem of applying pattern recognition (a relatively young area of engineering cybernetics) to the analysis of complicated technical systems is examined. It is shown that a statistical approach can be the most effective for hard-to-distinguish situations. The various recognition algorithms are based on the Bayes approach, which estimates the posterior probability of a certain event together with an assumed error. The statistical approach to pattern recognition is applicable to the technical diagnosis of complicated systems, in particular high-powered marine diesel engines.
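A minimal numerical sketch of the Bayes approach described here, for a two-state engine-diagnosis setting; the prior and likelihood values are invented for illustration, not taken from the article.

```python
# Bayes-decision sketch: posterior probability of a "fault" state given an
# observed symptom, and the decision that maximises the posterior (which
# minimises the probability of error). All numbers are hypothetical.
prior = {"healthy": 0.95, "fault": 0.05}
# Assumed likelihood of observing "high vibration" under each state
likelihood = {"healthy": 0.10, "fault": 0.80}

evidence = sum(prior[s] * likelihood[s] for s in prior)
posterior = {s: prior[s] * likelihood[s] / evidence for s in prior}
decision = max(posterior, key=posterior.get)

print(posterior)  # posterior probabilities of the two states
print(decision)   # the minimum-error-probability decision
```

With these numbers the symptom raises the fault probability from 5% to roughly 30%, but the Bayes decision still favours "healthy"; a second, more specific symptom would be needed to tip the decision.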
A new formalism for nonextensive physical systems: Tsallis thermostatistics
International Nuclear Information System (INIS)
Tirnakli, U.; Bueyuekkilic, F.; Demirhan, D.
1999-01-01
Although Boltzmann-Gibbs (BG) statistics provides a suitable tool that enables us to handle a large number of physical systems satisfactorily, it has some basic restrictions. Recently a nonextensive thermostatistics was proposed by C. Tsallis to handle nonextensive physical systems and, up to now, besides generalizing some of the conventional concepts, the formalism has proved fruitful in a number of physical applications. In this study, our aim is to introduce Tsallis thermostatistics in some detail and to emphasize its achievements on physical systems, noting the recent developments along this line.
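For concreteness, the Tsallis entropy that generalizes the BG form is S_q = (1 - Σ_i p_i^q)/(q - 1) (with k_B = 1), and it recovers the BG (Shannon) entropy -Σ_i p_i ln p_i in the limit q → 1. A short numerical check of this limit:

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i**q) / (q - 1), with k_B = 1."""
    p = np.asarray(p, dtype=float)
    if q == 1.0:                       # Boltzmann-Gibbs (Shannon) limit
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p**q)) / (q - 1.0)

p = [0.5, 0.3, 0.2]
print(tsallis_entropy(p, 2.0))        # nonextensive case, q = 2
print(tsallis_entropy(p, 1.000001))   # approaches the BG value as q -> 1
print(tsallis_entropy(p, 1.0))        # BG entropy itself
```

The q = 2 value equals 1 minus the sum of squared probabilities, while the q → 1 values converge to the BG entropy, which is the sense in which the formalism generalizes conventional statistics.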
Directory of Open Access Journals (Sweden)
M. N. Ivliev
2016-01-01
Full Text Available The work is devoted to methods of analyzing a company's financial condition, including aggregated ratings. It is proposed to use a generalized solvency and liquidity indicator and a composite capital-structure index. Mathematically, the generalized index is a weighted sum of characteristic variables, with weighting factors reflecting the relative importance of the individual characteristics. It is proposed to select the significant features from a set of standard financial ratios calculated from enterprise balance sheets. To obtain the weighting factors, one of the expert-statistical approaches, the analytic hierarchy process, is used. The method is as follows: the most important characteristic is chosen, and the experts determine the degree of preference relative to this main feature on a linguistic scale. A matrix of pairwise comparisons is then compiled from the assigned ranks, characterizing the relative importance of the attributes. The required coefficients are determined as the elements of the priority vector, the first (principal) eigenvector of the pairwise-comparison matrix. The paper proposes a mechanism for determining the ranges used in analyzing the rating numbers. In addition, it proposes a method for the statistical evaluation of the balance sheets of various companies by computing mutual correlation matrices. Based on the considered mathematical methods for determining quantitative characteristics of the financial and economic activities of technical objects, algorithms, information support and software were developed that allow various systems of economic analysis to be implemented.
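The eigenvector step described above can be sketched numerically. The pairwise-comparison matrix below uses invented judgements on Saaty's 1-9 scale for three unnamed financial ratios; it illustrates the analytic hierarchy process in general, not the paper's data.

```python
import numpy as np

# AHP sketch: a reciprocal pairwise-comparison matrix (hypothetical
# judgements). The weight vector is the principal eigenvector, normalised
# to sum to one.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # index of the principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()
print(np.round(w, 3))                  # weights of the three ratios

# Consistency index; Saaty's random index RI for n = 3 is 0.58, and a
# consistency ratio CI/RI below 0.1 is conventionally acceptable.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
print(round(ci / 0.58, 3))
```

The consistency check matters in practice: if the experts' pairwise judgements are too contradictory (CR above 0.1), the priority vector is not meaningful and the comparisons should be revisited.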
Program system for inclusion, settlement of account and statistical evaluation of on-line recherches
International Nuclear Information System (INIS)
Helmreich, F.; Nevyjel, A.
1981-03-01
The described program system is used to automate the administration of an information retrieval department. Data on the users and on every online session are stored in two files and can be evaluated in different statistics. Data acquisition is done interactively; the statistics programs run both in dialog and in batch mode. (author)
Statistics of the relative velocity of particles in bidisperse turbulent suspensions
Bhatnagar, Akshay; Gustavsson, Kristian; Mehlig, Bernhard; Mitra, Dhrubaditya
2017-11-01
We calculate the joint probability distribution function (JPDF) of relative distances (R) and velocities (V, with longitudinal component VR) of a pair of bidisperse heavy inertial particles in homogeneous and isotropic turbulent flows using direct numerical simulations (DNS). A recent paper (J. Meibohm et al. 2017), using statistical-model simulations and mathematical analysis of a one-dimensional white-noise model, has shown that the JPDF, P(R, VR), for two particles with Stokes numbers St1 and St2 can be interpreted in terms of StM, the harmonic mean of St1 and St2, and θ ≡ |St1 - St2|/(St1 + St2). For small θ there emerges a small-scale cutoff Rc and a small-velocity cutoff Vc such that for VR ... Foundation, Dnr. KAW 2014.0048.
Statistical distributions of earthquakes and related non-linear features in seismic waves
International Nuclear Information System (INIS)
Apostol, B.-F.
2006-01-01
A few basic facts in the science of earthquakes are briefly reviewed. An accumulation, or growth, model is put forward for the focal mechanisms and the critical focal zone of earthquakes, which relates the average earthquake recurrence time to the released seismic energy. The temporal statistical distribution of the average recurrence time is introduced for earthquakes and, on this basis, the Omori-type distribution in energy is derived, as well as the distribution in magnitude, by making use of the semi-empirical Gutenberg-Richter law relating seismic energy to earthquake magnitude. On geometric grounds, the accumulation model suggests the value r = 1/3 for the Omori parameter in the power-law energy distribution, which leads to β = 1.17 for the coefficient in the Gutenberg-Richter recurrence law, in fair agreement with the statistical analysis of the empirical data. Making use of this value, the empirical Båth law is discussed for the average magnitude of the aftershocks (which is 1.2 less than the magnitude of the main seismic shock), by assuming that the aftershocks are relaxation events of the seismic zone. The time distribution of earthquakes with a fixed average recurrence time is also derived, earthquake occurrence prediction is discussed by means of the average recurrence time and the seismicity rate, and the application of this discussion to the seismic region of Vrancea, Romania, is outlined. Finally, a special effect of the non-linear behaviour of seismic waves is discussed, by describing an exact solution derived recently for the elastic wave equation with cubic anharmonicities, its relevance, and its connection to the approximate quasi-plane-wave picture. The properties of the seismic activity accompanying a main seismic shock, both as foreshocks and as aftershocks, are deferred to forthcoming publications. (author)
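As a hedged aside (this is the standard Aki maximum-likelihood estimator, not the paper's accumulation model), the Gutenberg-Richter b-value can be recovered from a synthetic catalogue whose magnitudes above a completeness threshold Mc follow the exponential distribution implied by the G-R recurrence law:

```python
import numpy as np

# Illustration: Aki's ML estimate b = log10(e) / (mean(M) - Mc) applied to
# a synthetic magnitude catalogue with a known b-value. Magnitudes above the
# completeness threshold Mc are exponentially distributed with rate
# beta = b * ln(10), the continuous form of the Gutenberg-Richter law.
rng = np.random.default_rng(42)
b_true, Mc = 1.0, 3.0
beta = b_true * np.log(10.0)
mags = Mc + rng.exponential(1.0 / beta, size=20000)

b_hat = np.log10(np.e) / (mags.mean() - Mc)
print(round(b_hat, 3))   # close to b_true = 1.0
```

With 20,000 synthetic events the estimate lands within a few hundredths of the true value, which is why catalogue size and the completeness threshold dominate the uncertainty of empirical b-values.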
Pension System Related Public Politics
Directory of Open Access Journals (Sweden)
LIVIU RADU
2015-05-01
Full Text Available This paper aims to find some answers regarding the long-term sustainability of the pension system. Romania's pension system originates from the invalidity insurance and pension system designed by the German chancellor Otto Eduard Leopold von Bismarck in 1889. From a European perspective, Romania has to fill an obvious gap regarding the reform of its national public pension system. International experience, particularly of the last 130 years, indicates that multiple pension systems have been put into operation in most of the world's countries, differentiated by certain elements (organization and management of the system, definition of pension rights, method of forming the resources, the pension level relative to the average income, etc.) and by their degree of efficacy, which depends on internal influences, the social, economic and demographic environment, and last but not least the political factor.
Safety-related control air systems
International Nuclear Information System (INIS)
Anon.
1977-01-01
This Standard applies to those portions of the control air system that furnish air required to support, control, or operate systems or portions of systems that are safety related in nuclear power plants. This Standard relates only to the air supply system(s) for safety-related air operated devices and does not apply to the safety-related air operated device or to air operated actuators for such devices. The objectives of this Standard are to provide (1) minimum system design requirements for equipment, piping, instruments, controls, and wiring that constitute the air supply system; and (2) the system and component testing and maintenance requirements
Influence of Signal and Noise on Statistical Fluctuation of Single-Mode Laser System
International Nuclear Information System (INIS)
Xu Dahai; Cheng Qinghua; Cao Li; Wu Dajin
2006-01-01
On the basis of calculating the steady-state mean normalized intensity fluctuation of a single-mode laser system driven both by colored pump noise with signal modulation and by quantum noise with cross-correlation between its real and imaginary parts, we analyze the influence of the modulation signal, the noise, and its correlation form on the statistical fluctuation of the laser system. We find that as the amplitude of the modulation signal weakens and its frequency quickens, the statistical fluctuation is rapidly reduced. The statistical fluctuation of the laser system can be restrained by reducing the intensities of the pump noise and the quantum noise. Moreover, as the colored cross-correlation time is prolonged, the statistical fluctuation of the laser system undergoes a repeated changing process, from decreasing to augmenting, then to decreasing, and finally to augmenting again. As the value of the cross-correlation coefficient decreases, the statistical fluctuation decreases too. When the cross-correlation between the real and imaginary parts of the quantum noise is zero, the statistical fluctuation of the laser system has a minimum. Compared with the intensity of the pump noise, the intensity of the quantum noise has a smaller influence on the statistical fluctuation.
Infant Statistical-Learning Ability Is Related to Real-Time Language Processing
Lany, Jill; Shoaib, Amber; Thompson, Abbie; Estes, Katharine Graf
2018-01-01
Infants are adept at learning statistical regularities in artificial language materials, suggesting that the ability to learn statistical structure may support language development. Indeed, infants who perform better on statistical learning tasks tend to be more advanced in parental reports of infants' language skills. Work with adults suggests…
Zimmerman, Whitney Alicia; Johnson, Glenn
2017-01-01
Data were collected from 353 online undergraduate introductory statistics students at the beginning of a semester using the Goals and Outcomes Associated with Learning Statistics (GOALS) instrument and an abbreviated form of the Statistics Anxiety Rating Scale (STARS). Data included a survey of expected grade, expected time commitment, and the…
Nonequilibrium statistical mechanics of shear flow: invariant quantities and current relations
International Nuclear Information System (INIS)
Baule, A; Evans, R M L
2010-01-01
In modeling nonequilibrium systems one usually starts with a definition of the microscopic dynamics, e.g., in terms of transition rates, and then derives the resulting macroscopic behavior. We address the inverse question for a class of steady state systems, namely complex fluids under continuous shear flow: how does an externally imposed shear current affect the microscopic dynamics of the fluid? The answer can be formulated in the form of invariant quantities, exact relations for the transition rates in the nonequilibrium steady state, as discussed in a recent letter (Baule and Evans, 2008 Phys. Rev. Lett. 101 240601). Here, we present a more pedagogical account of the invariant quantities and the theory underlying them, known as the nonequilibrium counterpart to detailed balance (NCDB). Furthermore, we investigate the relationship between the transition rates and the shear current in the steady state. We show that a fluctuation relation of the Gallavotti–Cohen type holds for systems satisfying NCDB
The Role of Discrete Global Grid Systems in the Global Statistical Geospatial Framework
Purss, M. B. J.; Peterson, P.; Minchin, S. A.; Bermudez, L. E.
2016-12-01
The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) has proposed the development of a Global Statistical Geospatial Framework (GSGF) as a mechanism for the establishment of common analytical systems that enable the integration of statistical and geospatial information. Conventional coordinate reference systems address the globe with a continuous field of points suitable for repeatable navigation and analytical geometry. While this continuous field is represented on a computer in a digitized and discrete fashion by tuples of fixed-precision floating point values, it is a non-trivial exercise to relate point observations spatially referenced in this way to areal coverages on the surface of the Earth. The GSGF states the need to move to gridded data delivery and the importance of using common geographies and geocoding. The challenges associated with meeting these goals are not new, and there has been a significant effort within the geospatial community over many years to develop nested gridding standards to tackle these issues. These efforts have recently culminated in the development of a Discrete Global Grid Systems (DGGS) standard, developed under the auspices of the Open Geospatial Consortium (OGC). DGGS provide a fixed areal-based geospatial reference frame for the persistent location of measured Earth observations, feature interpretations, and modelled predictions. DGGS address the entire planet by partitioning it into a discrete hierarchical tessellation of progressively finer resolution cells, which are referenced by a unique index that facilitates rapid computation, query and analysis. The geometry and location of the cell is the principal aspect of a DGGS. Data integration, decomposition, and aggregation are optimised in the DGGS hierarchical structure and can be exploited for efficient multi-source data processing, storage, discovery, transmission, visualization, computation, analysis, and modelling. During
Vasilaki, V; Volcke, E I P; Nandi, A K; van Loosdrecht, M C M; Katsou, E
2018-04-26
Multivariate statistical analysis was applied to investigate the dependencies and underlying patterns between N2O emissions and online operational variables (dissolved oxygen and nitrogen component concentrations, temperature and influent flow-rate) during biological nitrogen removal from wastewater. The system under study was a full-scale reactor, for which hourly sensor data were available. The 15-month long monitoring campaign was divided into 10 sub-periods based on the profile of N2O emissions, using Binary Segmentation. The dependencies between operating variables and N2O emissions fluctuated according to Spearman's rank correlation. The correlation between N2O emissions and nitrite concentrations ranged between 0.51 and 0.78. Correlation >0.7 between N2O emissions and nitrate concentrations was observed at sub-periods with average temperature lower than 12 °C. Hierarchical k-means clustering and principal component analysis linked N2O emission peaks with precipitation events and ammonium concentrations higher than 2 mg/L, especially in sub-periods characterized by low N2O fluxes. Additionally, the highest ranges of measured N2O fluxes belonged to clusters corresponding with NO3-N concentration less than 1 mg/L in the upstream plug-flow reactor (middle of oxic zone), indicating slow nitrification rates. The results showed that the range of N2O emissions partially depends on the prior behavior of the system. The principal component analysis validated the findings from the clustering analysis and showed that ammonium, nitrate, nitrite and temperature explained a considerable percentage of the variance in the system for the majority of the sub-periods. The applied statistical methods linked the different ranges of emissions with the system variables, provided insights on the effect of operating conditions on N2O emissions in each sub-period, and can be integrated into N2O emissions data processing at wastewater treatment plants.
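The correlation step reported here can be sketched with SciPy on synthetic data; the series below are invented stand-ins for the plant's nitrite and N2O measurements, not the study's sensor data.

```python
import numpy as np
from scipy.stats import spearmanr

# Spearman rank-correlation sketch: a simulated N2O flux series with a
# monotone (but noisy) dependence on a simulated nitrite series.
rng = np.random.default_rng(1)
no2 = rng.gamma(2.0, 0.5, size=500)           # nitrite concentration, mg/L
n2o = 0.8 * no2 + rng.normal(0, 0.5, 500)     # flux partly driven by nitrite

rho, p = spearmanr(no2, n2o)
print(round(rho, 2))   # clearly positive rank correlation
```

Spearman's coefficient is the natural choice in this setting because it captures monotone dependence without assuming linearity or normality, both of which are doubtful for sensor series at a treatment plant.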
E.W. Fobes; R.W. Rowe
1968-01-01
A system for classifying wood-using industries and recording pertinent statistics for automatic data processing is described. Forms and coding instructions for recording data of primary processing plants are included.
Harrou, Fouzi; Sun, Ying; Taghezouit, Bilal; Saidi, Ahmed; Hamlati, Mohamed-Elkarim
2017-01-01
This study reports the development of an innovative fault detection and diagnosis scheme to monitor the direct current (DC) side of photovoltaic (PV) systems. Towards this end, we propose a statistical approach that exploits the advantages of one
Statistical modeling of the mother-baby system in newborn infants with cerebral ischemia
Directory of Open Access Journals (Sweden)
A. V. Filonenko
2014-01-01
Full Text Available The statistical model makes it possible to take into account the influence of specific maternal psychoemotional and personality factors on a newborn with cerebral ischemia and to develop a procedure for preventing the negative consequences of postpartum depression in the mother-baby system.
Higher order capacity statistics of multi-hop transmission systems over Rayleigh fading channels
Yilmaz, Ferkan; Tabassum, Hina; Alouini, Mohamed-Slim
2012-01-01
In this paper, we present an exact analytical expression to evaluate the higher order statistics of the channel capacity for amplify and forward (AF) multihop transmission systems operating over Rayleigh fading channels. Furthermore, we present
Statistical Decision Support Tools for System-Oriented Runway Management, Phase II
National Aeronautics and Space Administration — The feasibility of developing a statistical decision support system for traffic flow management in the terminal area and runway load balancing was demonstrated in...
Statistical methods to monitor the West Valley off-gas system
International Nuclear Information System (INIS)
Eggett, D.L.
1990-01-01
This paper reports on the off-gas system for the ceramic melter operated at the West Valley Demonstration Project in West Valley, NY, which is monitored during melter operation. A one-at-a-time method of monitoring the parameters of the off-gas system is not statistically sound. Therefore, multivariate statistical methods appropriate for monitoring many correlated parameters will be used. Monitoring a large number of parameters increases the probability of a false out-of-control signal. If the parameters being monitored are statistically independent, the control limits can easily be adjusted to obtain the desired probability of a false out-of-control signal. The principal component (PC) scores have desirable statistical properties when the original variables follow a multivariate normal distribution. Two statistics derived from the PC scores and used to form multivariate control charts are outlined and their distributional properties reviewed.
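A minimal sketch of multivariate monitoring via principal components, on simulated data rather than the West Valley measurements: a Hotelling T^2 statistic formed from PC scores flags observations that break the correlation structure even when each variable is individually in range.

```python
import numpy as np

# Two correlated "off-gas parameters" simulated under in-control conditions.
rng = np.random.default_rng(7)
cov = np.array([[1.0, 0.8], [0.8, 1.0]])
X = rng.multivariate_normal([0.0, 0.0], cov, size=500)

mean, C = X.mean(axis=0), np.cov(X, rowvar=False)
eigval, eigvec = np.linalg.eigh(C)            # PCA of the sample covariance
scores = (X - mean) @ eigvec                  # (approximately) uncorrelated PC scores
t2 = np.sum(scores**2 / eigval, axis=1)       # Hotelling T^2 per observation

# A point that violates the correlation structure: each coordinate is only
# two standard deviations out, but jointly it is far from the data cloud.
odd = np.array([2.0, -2.0]) - mean
t2_odd = float(((odd @ eigvec) ** 2 / eigval).sum())
print(round(float(t2.mean()), 2), round(t2_odd, 1))
```

This is the essence of the paper's argument against one-at-a-time charts: univariate limits at plus or minus three standard deviations would pass the odd point on both variables, while the multivariate statistic rejects it decisively.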
Directory of Open Access Journals (Sweden)
А. Haniukova
2015-04-01
Full Text Available The article deals with the structure and content of the statistical analysis of secondary education. Particular attention is paid to the principles of organizing and constructing a system of statistical indicators of the status and trends of the phenomenon. A system of indicators is presented, containing both existing indicators and those proposed by the author, the analysis of which will increase the efficiency of public administration in this area.
Statistical trend analysis methodology for rare failures in changing technical systems
International Nuclear Information System (INIS)
Ott, K.O.; Hoffmann, H.J.
1983-07-01
A methodology for a statistical trend analysis (STA) of failure rates is presented. It applies primarily to relatively rare events in changing technologies or components. The formulation is more general and the assumptions are less restrictive than in a previously published version. Relations between the statistical trend analysis and probabilistic risk assessment (PRA) are discussed in terms of categorizing decisions for action following particular failure events. The significance of tentatively identified trends is explored. In addition to statistical tests for trend significance, a combination of STA and PRA results quantifying the trend complement is proposed. The STA approach is compared with other concepts for trend characterization. (orig.)
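One standard instance of a statistical trend test for failure events is the Laplace test, offered here as an illustration; the paper's STA formulation may differ. For event times t_i on (0, T], the statistic is approximately standard normal under a constant failure rate, and large positive values indicate a deteriorating (increasing-rate) trend:

```python
import math

# Laplace trend test: u = (mean(t_i) - T/2) / (T * sqrt(1 / (12 n))).
# Under a homogeneous Poisson process u is approximately N(0, 1); events
# clustering late in the window push u positive (deterioration).
def laplace_u(times, T):
    n = len(times)
    return (sum(times) / n - T / 2.0) / (T * math.sqrt(1.0 / (12.0 * n)))

# Hypothetical failure times clustering toward the end of a 45-unit window
times = [5, 14, 22, 28, 33, 37, 40, 42, 43, 44]
u = laplace_u(times, 45.0)
print(round(u, 2))   # exceeds 1.645, significant increasing trend at the 5% level
```

For rare events, as the abstract stresses, n is small and the normal approximation is rough, which is one motivation for combining such tests with PRA-style judgement rather than relying on significance alone.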
Intersection Types and Related Systems
Directory of Open Access Journals (Sweden)
Paweł Parys
2017-02-01
Full Text Available We present a new approach to the following meta-problem: given a quantitative property of trees, design a type system such that the desired property for the tree generated by an infinitary ground lambda-term corresponds to some property of a derivation of a type for this lambda-term, in this type system. Our approach is presented in the particular case of the language finiteness problem for nondeterministic higher-order recursion schemes (HORSes: given a nondeterministic HORS, decide whether the set of all finite trees generated by this HORS is finite. We give a type system such that the HORS can generate a tree of an arbitrarily large finite size if and only if in the type system we can obtain derivations that are arbitrarily large, in an appropriate sense; the latter condition can be easily decided.
Statistics and Analysis of the Relations between Rainstorm Floods and Earthquakes
Directory of Open Access Journals (Sweden)
Baodeng Hou
2016-01-01
Full Text Available The frequent occurrence of geophysical disasters under climate change has drawn Chinese scholars' attention to the relations between disasters. If the occurrence sequence of disasters could be identified, long-term disaster forecasting could be realized. Based on the Earth Degassing Effect (EDE), which is taken to be valid, this paper took the magnitude, epicenter, and occurrence time of earthquakes, as well as the epicenter and occurrence time of rainstorm floods, as the basic factors of an integrated model for studying the correlation between rainstorm floods and earthquakes. The 2461 severe earthquakes that occurred in China or within 3000 km of China and the 169 heavy rainstorm floods that occurred in China over the past 200+ years served as the input data of the model. The computational results showed that although most of the rainstorm floods have nothing to do with severe earthquakes from a statistical perspective, some floods might be related to earthquakes. This is especially true when the earthquakes happen in the vapor transmission zone, where rainstorms lead to abundant water vapor. In this regard, earthquakes are more likely to cause big rainstorm floods. However, many cases of rainstorm floods can be found after severe earthquakes, albeit with a large degree of uncertainty.
Study of film data processing systems by means of a statistical simulation
International Nuclear Information System (INIS)
Deart, A.F.; Gromov, A.I.; Kapustinskaya, V.I.; Okorochenko, G.E.; Sychev, A.Yu.; Tatsij, L.I.
1974-01-01
A statistical model of the film information processing system is considered. Time diagrams illustrate the model's operation algorithm, and the program realizing this model of the system is described in detail. The elaborated program model has been tested on a film information processing system consisting of a group of measuring devices operating in line with a BESM computer. The quantitative operating characteristics obtained for the system under test make it possible to estimate the system's operating efficiency.
Goodman, J. W.
This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
Jana, Madhusudan
2015-01-01
Statistical mechanics is presented as a self-sufficient subject, written in a lucid manner with university examination systems in mind. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, together with their ranges of application and limitations. The non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, and transport phenomena - thermal and electrical conductivity, the Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...
Vaessen, B.E.; van den Beemt, A.A.J.; van de Watering, G.A.; van Meeuwen, L.W.; Lemmens, A.M.C.; den Brok, P.J.
2017-01-01
This pilot study measures university students’ perceptions of graded frequent assessments in an obligatory statistics course using a novel questionnaire. Relations between perceptions of frequent assessments, intrinsic motivation and grades were also investigated. A factor analysis of the
Problems of a Statistical Ensemble Theory for Systems Far from Equilibrium
Ebeling, Werner
The development of a general statistical physics of nonequilibrium systems was one of the main unfinished tasks of statistical physics in the 20th century. The aim of this work is the study of a special class of nonequilibrium systems for which the formulation of an ensemble theory of some generality is possible. These are the so-called canonical-dissipative systems, where the driving terms are determined by invariants of motion. We construct canonical-dissipative systems which are ergodic on certain surfaces in phase space. These systems may be described by a nonequilibrium microcanonical ensemble, corresponding to an equal distribution on the target surface. Next we construct and solve Fokker-Planck equations; this leads to a kind of canonical-dissipative ensemble. In the last part we discuss the theoretical problem of how to define bifurcations in the framework of nonequilibrium statistics, and several possible applications.
Igneous-related geothermal systems
Energy Technology Data Exchange (ETDEWEB)
Smith, R L; Shaw, H R
1976-01-01
A preliminary survey of the geothermal resource base associated with igneous-derived thermal anomalies in the upper 10 km of the crust is presented. The approach to numerical estimates of igneous-related heat contents rests on estimates of the probable volumes of high-level magma chambers and determinations of the radiometric ages of the youngest volcanism from those chambers combined with simple thermal calculations based on these values. (MHR)
International Nuclear Information System (INIS)
Gross, D.H.E.
2006-01-01
Heat can flow from cold to hot at any phase separation, even in macroscopic systems. Therefore Lynden-Bell's famous gravo-thermal catastrophe must also be reconsidered. In contrast to traditional canonical Boltzmann-Gibbs statistics, this is correctly described only by microcanonical statistics. Systems studied in chemical thermodynamics (ChTh) using canonical statistics consist of several homogeneous macroscopic phases. Evidently, macroscopic statistics as in chemistry cannot and should not be applied to non-extensive or inhomogeneous systems like nuclei or galaxies. Nuclei are small and inhomogeneous; multifragmented nuclei are even more inhomogeneous, and the fragments even smaller. Phase transitions of first order, and especially phase separations, therefore cannot be described by a (homogeneous) canonical ensemble. Taking this seriously, fascinating perspectives open up for statistical nuclear fragmentation as a test ground for the basic principles of statistical mechanics, especially of phase transitions, without the use of the thermodynamic limit. Moreover, there is also a lot of similarity between the accessible phase space of fragmenting nuclei and inhomogeneous multistellar systems. This underlines the fundamental significance for statistical physics in general. (orig.)
Baijal, Shruti; Nakatani, Chie; van Leeuwen, Cees; Srinivasan, Narayanan
2013-06-07
Human observers show remarkable efficiency in statistical estimation; they are able, for instance, to estimate the mean size of visual objects, even if their number exceeds the capacity limits of focused attention. This ability has been understood as the result of a distinct mode of attention, i.e. distributed attention. Compared to the focused attention mode, working memory representations under distributed attention are proposed to be more compressed, leading to reduced working memory loads. An alternate proposal is that distributed attention uses less structured, feature-level representations. These would fill up working memory (WM) more, even when target set size is low. Using event-related potentials, we compared WM loading in a typical distributed attention task (mean size estimation) to that in a corresponding focused attention task (object recognition), using a measure called contralateral delay activity (CDA). Participants performed both tasks on 2, 4, or 8 different-sized target disks. In the recognition task, CDA amplitude increased with set size; notably, however, in the mean estimation task the CDA amplitude was high regardless of set size. In particular for set-size 2, the amplitude was higher in the mean estimation task than in the recognition task. The result showed that the task involves full WM loading even with a low target set size. This suggests that in the distributed attention mode, representations are not compressed, but rather less structured than under focused attention conditions. Copyright © 2012 Elsevier Ltd. All rights reserved.
On nonequilibrium many-body systems. 1: The nonequilibrium statistical operator method
International Nuclear Information System (INIS)
Algarte, A.C.S.; Vasconcellos, A.R.; Luzzi, R.; Sampaio, A.J.C.
1985-01-01
The theoretical aspects involved in the treatment of many-body systems strongly departing from equilibrium are discussed. The nonequilibrium statistical operator (NSO) method is considered in detail. Using Jaynes' maximum entropy formalism, complemented with an ad hoc hypothesis, a nonequilibrium statistical operator is obtained. This approach introduces irreversibility from the outset, and we recover statistical operators like those of Green-Mori and Zubarev as particular cases. The connection with Generalized Thermodynamics and the construction of nonlinear transport equations are briefly described. (Author) [pt
DYNAMIC STABILITY OF THE SOLAR SYSTEM: STATISTICALLY INCONCLUSIVE RESULTS FROM ENSEMBLE INTEGRATIONS
Energy Technology Data Exchange (ETDEWEB)
Zeebe, Richard E., E-mail: zeebe@soest.hawaii.edu [School of Ocean and Earth Science and Technology, University of Hawaii at Manoa, 1000 Pope Road, MSB 629, Honolulu, HI 96822 (United States)
2015-01-01
Due to the chaotic nature of the solar system, the question of its long-term stability can only be answered in a statistical sense, for instance, based on numerical ensemble integrations of nearby orbits. Destabilization of the inner planets, leading to close encounters and/or collisions, can be initiated through a large increase in Mercury's eccentricity, with a currently assumed likelihood of ∼1%. However, little is known at present about the robustness of this number. Here I report ensemble integrations of the full equations of motion of the eight planets and Pluto over 5 Gyr, including contributions from general relativity. The results show that different numerical algorithms lead to statistically different results for the evolution of Mercury's eccentricity (e_M). For instance, starting at present initial conditions (e_M ≃ 0.21), Mercury's maximum eccentricity achieved over 5 Gyr is, on average, significantly higher in symplectic ensemble integrations using heliocentric rather than Jacobi coordinates and stricter error control. In contrast, starting at a possible future configuration (e_M ≃ 0.53), Mercury's maximum eccentricity achieved over the subsequent 500 Myr is, on average, significantly lower using heliocentric rather than Jacobi coordinates. For example, the probability for e_M to increase beyond 0.53 over 500 Myr is >90% (Jacobi) versus only 40%-55% (heliocentric). This poses a dilemma because the physical evolution of the real system—and its probabilistic behavior—cannot depend on the coordinate system or the numerical algorithm chosen to describe it. Some tests of the numerical algorithms suggest that symplectic integrators using heliocentric coordinates underestimate the odds for destabilization of Mercury's orbit at high initial e_M.
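The coordinate dependence reported above concerns symplectic integrators, whose defining virtue is bounded long-term energy error. As an illustration only (not the paper's method), a minimal kick-drift-kick leapfrog on a unit harmonic oscillator exhibits this bounded drift; the step size and duration are arbitrary choices:

```python
def leapfrog(x, v, dt, steps):
    # Kick-drift-kick (symplectic) integration of a unit-mass,
    # unit-frequency harmonic oscillator, where acceleration a = -x.
    for _ in range(steps):
        v += -x * dt / 2.0
        x += v * dt
        v += -x * dt / 2.0
    return x, v

def energy(x, v):
    # Total energy of the oscillator; exactly 0.5 for x=1, v=0.
    return 0.5 * (x * x + v * v)

# Integrate over many periods; a symplectic scheme keeps the energy
# error bounded instead of letting it drift secularly.
x, v = leapfrog(1.0, 0.0, dt=0.05, steps=10_000)
drift = abs(energy(x, v) - 0.5)
```

A non-symplectic scheme such as forward Euler would show the energy growing steadily over the same interval, which is why integrator choice matters for Gyr-scale orbit statistics.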
A statistical-based approach for fault detection and diagnosis in a photovoltaic system
Garoudja, Elyes; Harrou, Fouzi; Sun, Ying; Kara, Kamel; Chouder, Aissa; Silvestre, Santiago
2017-01-01
This paper reports the development of a statistical approach for fault detection and diagnosis in a PV system. Specifically, the overarching goal of this work is the early detection and identification of faults on the DC side of a PV system (e.g., short
Heimann, Dennis; Nieschulze, Jens; König-Ries, Birgitta
2010-04-20
Data management in the life sciences has evolved from simple storage of data to complex information systems providing additional functionality such as analysis and visualization capabilities, demanding the integration of statistical tools. In many cases the statistical tools used are hard-coded within the system, which makes integrating, substituting, or extending tools expensive because all changes have to be made in program code. Other systems use generic solutions for tool integration, but adapting these to another system mostly requires rather extensive work. This paper shows a way to provide statistical functionality through a statistics web service, which can be easily integrated into any information system and set up using XML configuration files. The statistical functionality is extendable by simply adding the description of a new application to a configuration file. The service architecture, the data exchange process between client and service, and the addition of analysis applications to the underlying service provider are described. Furthermore, a practical example demonstrates the functionality of the service.
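The idea of describing statistical applications in configuration rather than program code can be sketched as follows. This is a hypothetical miniature, not the paper's service: the registry keys and the `method` config field are invented stand-ins for the XML application descriptions, and Python's `statistics` module stands in for the integrated tools.

```python
import statistics

# Registry mapping configuration names to statistical routines; in the
# described service this role is played by XML application entries
# (the key names here are invented for illustration).
REGISTRY = {
    "mean": statistics.mean,
    "median": statistics.median,
    "stdev": statistics.stdev,
}

def run_analysis(config, data):
    # Dispatch purely from configuration: adding a new analysis means
    # adding a registry entry, not changing this code path.
    return REGISTRY[config["method"]](data)

result = run_analysis({"method": "median"}, [3, 1, 4, 1, 5])
```

The design choice is the same as in the paper: extension happens in data (the registry/config), so clients need no recompilation when a new analysis is offered.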
Miller, John
1994-01-01
Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)
Development of nuclear power plant online monitoring system using statistical quality control
International Nuclear Information System (INIS)
An, Sang Ha
2006-02-01
Statistical quality control techniques have been applied to many aspects of industrial engineering. An application to nuclear power plant maintenance and control that can greatly improve plant safety is also presented. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps (RCP) and the fouling resistance of a heat exchanger. This research uses Shewhart X-bar and R charts, cumulative sum (CUSUM) charts, and the Sequential Probability Ratio Test (SPRT) to analyze the process for the state of statistical control. A Control Chart Analyzer (CCA) was built to support these analyses and to decide whether the process is in error. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with enough time to respond to possible emergency situations and thus improve plant safety and reliability.
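Of the charts named above, the CUSUM chart is the simplest to sketch. The following is an illustrative one-sided upper CUSUM, not the CCA implementation; the allowance `k` and decision threshold `h` are conventional tuning parameters chosen here arbitrarily.

```python
def cusum(samples, target, k=0.5, h=5.0):
    """One-sided upper CUSUM: accumulate deviations above target,
    less an allowance k, and alarm when the sum exceeds h.
    Returns (list of cumulative sums, index of first alarm or None)."""
    s_hi = 0.0
    sums = []
    alarm = None
    for i, x in enumerate(samples):
        s_hi = max(0.0, s_hi + (x - target) - k)
        sums.append(s_hi)
        if alarm is None and s_hi > h:
            alarm = i
    return sums, alarm

# In-control readings followed by a sustained upward shift; the chart
# flags the shift a few samples after it begins.
data = [0.1, -0.2, 0.0, 0.3, -0.1] + [2.0] * 6
_, alarm = cusum(data, target=0.0, k=0.5, h=4.0)
```

Because CUSUM accumulates small deviations, it detects sustained drifts (such as slowly growing fouling resistance) earlier than a fixed alarm threshold on individual readings would.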
The System of Indicators for the Statistical Evaluation of Market Conjuncture
Directory of Open Access Journals (Sweden)
Chernenko Daryna I.
2017-04-01
Full Text Available The article is aimed at systematizing and improving the system of statistical indicators for the market of laboratory health services (LHS) and at developing methods for their calculation. In forming the system of statistical indicators for the LHS market, nine blocks are proposed: market size; market proportionality; market demand; market supply; price level and dynamics; variation of the LHS; market dynamics, development trends, and cycles; market structure; and level of competition and monopolization. The proposed system of statistical indicators, together with the methods for their calculation, should make it possible to study the trends and regularities in the formation of the market for laboratory health services in Ukraine.
International Nuclear Information System (INIS)
Rebic, S.; Parkins, A.S.; Tan, S.M.
2002-01-01
We explore the photon statistics of light emitted from a system comprising a single four-level atom strongly coupled to a high-finesse optical cavity mode that is driven by a coherent laser field. In the weak driving regime this system is found to exhibit a photon blockade effect. For intermediate driving strengths we find a sudden change in the photon statistics of the light emitted from the cavity. Photon antibunching switches to photon bunching over a very narrow range of intracavity photon number. It is proven that this sudden change in photon statistics occurs due to the existence of robust quantum interference of transitions between the dressed states of the atom-cavity system. Furthermore, it is shown that the strong photon bunching is a nonclassical effect for certain values of driving field strength, violating classical inequalities for field correlations
Quantum statistics and squeezing for a microwave-driven interacting magnon system.
Haghshenasfard, Zahra; Cottam, Michael G
2017-02-01
Theoretical studies are reported for the statistical properties of a microwave-driven interacting magnon system. Both the magnetic dipole-dipole and the exchange interactions are included and the theory is developed for the case of parallel pumping allowing for the inclusion of the nonlinear processes due to the four-magnon interactions. The method of second quantization is used to transform the total Hamiltonian from spin operators to boson creation and annihilation operators. By using the coherent magnon state representation we have studied the magnon occupation number and the statistical behavior of the system. In particular, it is shown that the nonlinearities introduced by the parallel pumping field and the four-magnon interactions lead to non-classical quantum statistical properties of the system, such as magnon squeezing. Also control of the collapse-and-revival phenomena for the time evolution of the average magnon number is demonstrated by varying the parallel pumping amplitude and the four-magnon coupling.
Sapsis, Themistoklis P; Majda, Andrew J
2013-08-20
A framework for low-order predictive statistical modeling and uncertainty quantification in turbulent dynamical systems is developed here. These reduced-order, modified quasilinear Gaussian (ROMQG) algorithms apply to turbulent dynamical systems in which there is significant linear instability or linear nonnormal dynamics in the unperturbed system and energy-conserving nonlinear interactions that transfer energy from the unstable modes to the stable modes where dissipation occurs, resulting in a statistical steady state; such turbulent dynamical systems are ubiquitous in geophysical and engineering turbulence. The ROMQG method involves constructing a low-order, nonlinear, dynamical system for the mean and covariance statistics in the reduced subspace that has the unperturbed statistics as a stable fixed point and optimally incorporates the indirect effect of non-Gaussian third-order statistics for the unperturbed system in a systematic calibration stage. This calibration procedure is achieved through information involving only the mean and covariance statistics for the unperturbed equilibrium. The performance of the ROMQG algorithm is assessed on two stringent test cases: the 40-mode Lorenz 96 model mimicking midlatitude atmospheric turbulence and two-layer baroclinic models for high-latitude ocean turbulence with over 125,000 degrees of freedom. In the Lorenz 96 model, the ROMQG algorithm with just a single mode captures the transient response to random or deterministic forcing. For the baroclinic ocean turbulence models, the inexpensive ROMQG algorithm with 252 modes, less than 0.2% of the total, captures the nonlinear response of the energy, the heat flux, and even the one-dimensional energy and heat flux spectra.
International Nuclear Information System (INIS)
Anghel, Dragoş-Victor
2012-01-01
I show that if the total energy of a system of interacting particles may be written as a sum of quasiparticle energies, then the system of quasiparticles can be viewed, in general, as an ideal gas with fractional exclusion statistics (FES). The general method for calculating the FES parameters is also provided. The interacting particle system cannot be described as an ideal gas of Bose and Fermi quasiparticles except in trivial situations.
Intelligent tutorial system for teaching of probability and statistics at high school in Mexico
Directory of Open Access Journals (Sweden)
Fernando Gudino Penaloza, Miguel Gonzalez Mendoza, Neil Hernandez Gress, Jaime Mora Vargas
2009-12-01
Full Text Available This paper describes the implementation of an intelligent tutoring system dedicated to teaching probability and statistics at the preparatory school (or high school) level in Mexico. The system was first deployed as a desktop solution and then adapted to a mobile environment for the implementation of mobile learning, or m-learning. The system complies with the idea of being adaptable to the needs of each student and is able to adapt to three different teaching models that meet the criteria of three student profiles.
Higher order capacity statistics of multi-hop transmission systems over Rayleigh fading channels
Yilmaz, Ferkan
2012-03-01
In this paper, we present an exact analytical expression to evaluate the higher order statistics of the channel capacity for amplify-and-forward (AF) multihop transmission systems operating over Rayleigh fading channels. Furthermore, we present a simple and efficient closed-form expression for the higher order moments of the channel capacity of a dual-hop transmission system with Rayleigh fading channels. In order to analyze the behavior of the higher order capacity statistics and investigate the usefulness of the mathematical analysis, some selected numerical and simulation results are presented. Our results are found to be in perfect agreement. © 2012 IEEE.
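The quantity being characterized, the n-th moment of the channel capacity, can be approximated by Monte Carlo simulation, which is presumably how such closed-form results are cross-checked. The sketch below assumes i.i.d. Rayleigh fading on each hop (so per-hop SNR is exponentially distributed) and uses the common AF end-to-end SNR bound g1*g2/(g1+g2+1); it is not the paper's analytical expression, and all parameters are illustrative.

```python
import math
import random

def capacity_moment(n, mean_snr_db=10.0, trials=20000, seed=1):
    # Monte Carlo estimate of E[C^n] with C = log2(1 + end-to-end SNR).
    # Rayleigh fading on each hop gives exponentially distributed
    # per-hop SNRs g1, g2 with mean g_bar.
    rng = random.Random(seed)
    g_bar = 10.0 ** (mean_snr_db / 10.0)
    total = 0.0
    for _ in range(trials):
        g1 = rng.expovariate(1.0 / g_bar)
        g2 = rng.expovariate(1.0 / g_bar)
        snr = g1 * g2 / (g1 + g2 + 1.0)  # standard AF end-to-end bound
        total += math.log2(1.0 + snr) ** n
    return total / trials

m1 = capacity_moment(1)  # mean capacity (bits/s/Hz)
m2 = capacity_moment(2)  # second moment; m2 - m1**2 is the variance
```

From the first two moments one recovers the capacity variance, and higher n give skewness and kurtosis, which is the practical use of higher order capacity statistics.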
System of National Accounts as an Information Base for Tax Statistics
Directory of Open Access Journals (Sweden)
A. E. Lyapin
2017-01-01
Full Text Available The article is devoted to those aspects of the system of national accounts (SNA) that together serve as the information base of tax statistics. Today, the tax system is one of the main subjects of discussion about the methods and directions of its reform. Taxes are one of the main factors in the regulation of the economy and act as an incentive for its development. Analysis of tax revenues to budgets at different levels makes it possible to assess tax collection and the tax burden for various industries, and the volume of tax revenue is an indicator of the scale of reproduction processes in the country. It should be noted that taxes have a special place in the SNA. As mentioned earlier, in the SNA taxes on products are treated as income. At the same time, most economists prefer to treat them as consumption taxes, and taxes on various financial transactions (for example, taxes on the purchase or sale of securities) are treated as taxes on production, including in cases where no services are involved. It would be rational to revise and amend the SNA's treatment of all taxes and subsidies to ensure better understanding and compliance with user needs. Taxes are an integral part of any state and an indispensable element of the economic relations of any society. In turn, taxes and the budget are inextricably linked, as these relations have a clearly expressed, objective, bilateral character. Taxes are the main group of budget revenues, which makes it possible to finance all government agencies and expenditure items, as well as to subsidize the institutional units that make up the SNA sector “non-financial corporations”. The other side is that taxes are a part of the money taken from producers and households. The total mass of taxes depends on the composition of taxes, tax rates, the tax base, and the scope of benefits. The bulk of tax revenues also depends on possible changes in
Baez-Cazull, S. E.; McGuire, J.T.; Cozzarelli, I.M.; Voytek, M.A.
2008-01-01
Determining the processes governing aqueous biogeochemistry in a wetland hydrologically linked to an underlying contaminated aquifer is challenging due to the complex exchange between the systems and their distinct responses to changes in precipitation, recharge, and biological activities. To evaluate temporal and spatial processes in the wetland-aquifer system, water samples were collected using cm-scale multichambered passive diffusion samplers (peepers) to span the wetland-aquifer interface over a period of 3 yr. Samples were analyzed for major cations and anions, methane, and a suite of organic acids resulting in a large dataset of over 8000 points, which was evaluated using multivariate statistics. Principal component analysis (PCA) was chosen with the purpose of exploring the sources of variation in the dataset to expose related variables and provide insight into the biogeochemical processes that control the water chemistry of the system. Factor scores computed from PCA were mapped by date and depth. Patterns observed suggest that (i) fermentation is the process controlling the greatest variability in the dataset and it peaks in May; (ii) iron and sulfate reduction were the dominant terminal electron-accepting processes in the system and were associated with fermentation but had more complex seasonal variability than fermentation; (iii) methanogenesis was also important and associated with bacterial utilization of minerals as a source of electron acceptors (e.g., barite BaSO4); and (iv) seasonal hydrological patterns (wet and dry periods) control the availability of electron acceptors through the reoxidation of reduced iron-sulfur species enhancing iron and sulfate reduction. Copyright © 2008 by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America. All rights reserved.
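PCA as used above reduces a correlated multivariate dataset to orthogonal components ordered by explained variance. A minimal two-variable sketch, using the closed-form eigendecomposition of the 2x2 sample covariance matrix on invented data, illustrates how a single dominant component can emerge:

```python
import math

def pca_2d(xs, ys):
    # Principal-axis variances of a two-variable dataset via the
    # closed-form eigenvalues of the 2x2 sample covariance matrix
    # [[sxx, sxy], [sxy, syy]].
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    tr = sxx + syy
    det = sxx * syy - sxy * sxy
    d = math.sqrt(tr * tr / 4.0 - det)
    return tr / 2.0 + d, tr / 2.0 - d  # variance along PC1, PC2

# Two strongly correlated variables: PC1 carries almost all variance.
l1, l2 = pca_2d([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8])
```

In the study, the same principle applied to dozens of chemical variables lets a handful of components (fermentation, iron/sulfate reduction, methanogenesis) summarize over 8000 measurements.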
A statistical study of weather-related disasters. Past, present and future
Energy Technology Data Exchange (ETDEWEB)
Visser, H.; Bouwman, A.; Petersen, A.; Ligtvoet, W.
2012-07-15
Disasters such as floods, storms, heatwaves and droughts may have serious implications for human health and the economic development of countries. One of the main findings of this report is that disaster burdens are dominated by economic and demographic developments rather than climate change. Furthermore, the disaster burden appears to be spread unequally over rich and poor countries. Chapter 2 describes the background of the three regions used throughout this report: OECD, BRIICS (Brazil, Russia, India, Indonesia, China and South Africa) and the remaining countries. Furthermore, an overview of the disaster databases is given, along with definitions of disaster terminology, and the statistical treatment of trends in disaster data is briefly exemplified. Chapter 3 gives an overview of the results for the disaster burden, and trends therein, on a global scale, broken down by disaster type. Chapters 4 and 5 perform the same analysis split up by the three regions: Chapter 4 quantifies disaster burdens, while Chapter 5 analyzes trends in them; here the analyses are confined to weather-related disaster events only. Chapter 6 explains, as far as possible, the trend patterns found in Chapter 5, treating changes in wealth, changes in population, the role of climate change and changes due to adaptation in separate sections. Chapter 7 briefly deals with communication aspects of disasters: the attribution of individual disasters to climate change, and results in the literature that contradict the results presented here. Chapters 3 through 7 deal with historical data on the disaster burden; the subsequent Chapters 8 and 9 deal with its future. Chapter 8 gives a short overview of the future of disasters as presented in the literature. Chapter 9 presents a PBL case study of flooding on a global scale, with predictions for people at risk and economic losses at
Directory of Open Access Journals (Sweden)
Knyazheva Yu. V.
2014-06-01
Full Text Available The market economy creates a need for economic analysis first of all at the micro level, that is, at the level of individual enterprises, as enterprises are the basis of a market economy. Improving the queuing system of a trading enterprise is therefore an important economic problem. Analytical solutions to queuing problems described in the theory do not correspond to the real operating conditions of queuing systems. In this article, therefore, the customer service process and the settlement-and-cash service system of a trading enterprise are optimized by means of numerical statistical simulation of the enterprise's queuing system. The article describes an integrated statistical numerical simulation model of the queuing systems of a trading enterprise working under nonstationary conditions, with reference to different distribution laws of the incoming customer stream. The model takes account of various behaviors of the outgoing customer stream and includes a checkout service model that accounts for the cashier's rate of work, as well as a staff motivation model and profit-earning and profit-optimization models that take into account possible revenue and costs. The statistical numerical simulation model of the queuing systems of a trading enterprise, when realized in a suitable software environment, allows the most important parameters of the system to be optimized. With a convenient user interface, this model can be a component of a decision-support system for rationalizing the organizational structure and optimizing the management of a trading enterprise.
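The core of such a simulation model can be illustrated with a far simpler single-server FIFO queue with Poisson arrivals and exponential service times. This is a toy stand-in for the article's nonstationary multi-checkout model; all rates and the customer count are illustrative.

```python
import random

def mean_wait(arrival_rate, service_rate, n_customers=5000, seed=7):
    # Single-server FIFO queue: each customer arrives after an
    # exponential interarrival gap and waits until the server frees up.
    rng = random.Random(seed)
    t_arrival = 0.0
    server_free = 0.0
    total_wait = 0.0
    for _ in range(n_customers):
        t_arrival += rng.expovariate(arrival_rate)
        start = max(t_arrival, server_free)
        total_wait += start - t_arrival
        server_free = start + rng.expovariate(service_rate)
    return total_wait / n_customers

# Raising the load toward saturation lengthens the queue sharply,
# which is the trade-off checkout staffing optimization works against.
w_light = mean_wait(arrival_rate=0.5, service_rate=1.0)
w_heavy = mean_wait(arrival_rate=0.9, service_rate=1.0)
```

Simulation is preferred over closed-form queueing formulas precisely because, as the abstract notes, real checkout systems are nonstationary and the analytical assumptions rarely hold.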
Chung, Chi-Jung; Kuo, Yu-Chen; Hsieh, Yun-Yu; Li, Tsai-Chung; Lin, Cheng-Chieh; Liang, Wen-Miin; Liao, Li-Na; Li, Chia-Ing; Lin, Hsueh-Chun
2017-11-01
This study applied open source technology to establish a subject-enabled analytics model that can enhance measurement statistics of case studies with public health data in cloud computing. The infrastructure of the proposed model comprises three domains: 1) the health measurement data warehouse (HMDW) for the case study repository, 2) the self-developed modules of online health risk information statistics (HRIStat) for cloud computing, and 3) the prototype of a Web-based process automation system in statistics (PASIS) for the health risk assessment of case studies with subject-enabled evaluation. The system design employed freeware including Java applications, MySQL, and R packages to drive a health risk expert system (HRES). In the design, the HRIStat modules enforce the typical analytics methods for biomedical statistics, and the PASIS interfaces enable process automation of the HRES for cloud computing. The Web-based model supports both modes, step-by-step analysis and an auto-computing process, for preliminary evaluation and real-time computation, respectively. The proposed model was evaluated by recomputing prior research on the epidemiological measurement of diseases caused either by heavy metal exposure in the environment or by clinical complications in hospital. The simulation validity was confirmed with commercial statistics software. The model was installed on a stand-alone computer and on a cloud-server workstation to verify computing performance for a data amount of more than 230K sets; both setups reached an efficiency of about 10^5 sets per second. The Web-based PASIS interface can be used for cloud computing, and the HRIStat module can be flexibly expanded with advanced subjects for measurement statistics. The analytics procedure of the HRES prototype is capable of providing assessment criteria prior to estimating the potential risk to public health. Copyright © 2017 Elsevier B.V. All rights reserved.
A DoS/DDoS Attack Detection System Using Chi-Square Statistic Approach
Directory of Open Access Journals (Sweden)
Fang-Yie Leu
2010-04-01
Full Text Available Nowadays, users can easily access and download network attack tools, which often provide friendly interfaces and easily operated features, from the Internet. Therefore, even a naive hacker can launch a large-scale DoS or DDoS attack to prevent a system, i.e., the victim, from providing Internet services. In this paper, we propose an agent-based intrusion detection architecture, which is a distributed detection system, to detect DoS/DDoS attacks by invoking a statistical approach that compares source IP addresses' normal and current packet statistics to discriminate whether there is a DoS/DDoS attack. The system first collects all source IPs' packet statistics so as to create their normal packet distribution. When some IPs' current packet distribution suddenly changes, very often it is an attack. Experimental results show that this approach can effectively detect DoS/DDoS attacks.
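The discrimination step rests on Pearson's chi-square statistic, which grows as the current packet distribution departs from the learned baseline. Below is a minimal sketch with invented per-source packet counts; the real system builds the baseline from collected traffic, and the alarm threshold would be calibrated, not hard-coded.

```python
def chi_square(current, baseline):
    # Pearson's chi-square statistic between the current per-source
    # packet counts and the learned baseline counts.
    return sum((c - b) ** 2 / b for c, b in zip(current, baseline))

baseline = [100, 100, 100, 100]  # normal packets per source bucket
normal = [98, 103, 99, 100]      # ordinary fluctuation
attack = [400, 90, 100, 110]     # one source suddenly floods the victim

score_normal = chi_square(normal, baseline)
score_attack = chi_square(attack, baseline)
```

The flood dominates the statistic through its squared deviation, so a single threshold on the score separates ordinary fluctuation from attack traffic.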
Addressing the statistical mechanics of planet orbits in the solar system
Mogavero, Federico
2017-10-01
The chaotic nature of planet dynamics in the solar system suggests the relevance of a statistical approach to planetary orbits. In such a statistical description, the time-dependent position and velocity of the planets are replaced by the probability density function (PDF) of their orbital elements. It is natural to set up this kind of approach in the framework of statistical mechanics. In the present paper, I focus on the collisionless excitation of eccentricities and inclinations via gravitational interactions in a planetary system. The future planet trajectories in the solar system constitute the prototype of this kind of dynamics. I thus address the statistical mechanics of the solar system planet orbits and try to reproduce the PDFs numerically constructed by Laskar (2008, Icarus, 196, 1). I show that the microcanonical ensemble of the Laplace-Lagrange theory accurately reproduces the statistics of the giant planet orbits. To model the inner planets I then investigate the ansatz of equiprobability in the phase space constrained by the secular integrals of motion. The eccentricity and inclination PDFs of Earth and Venus are reproduced with no free parameters. Within the limitations of a stationary model, the predictions also show a reasonable agreement with the Mars PDFs and that of the Mercury inclination. The eccentricity of Mercury, in contrast, demands a deeper analysis. I finally revisit the random walk approach of Laskar to the time dependence of the inner planet PDFs. Such a statistical theory could be combined with direct numerical simulations of planet trajectories in the context of planet formation, which is likely to be a chaotic process.
Watts, A.L.; Lilienfeld, S.O.; Edens, J.F.; Douglas, K.S.; Skeem, J.L.; Verschuere, B.; LoPilato, A.C.
2016-01-01
Given that psychopathy is associated with narcissism, lack of insight, and pathological lying, the assumption that the validity of self-report psychopathy measures is compromised by response distortion has been widespread. We examined the statistical effects (moderation, suppression) of response
Relation between the Surface Friction of Plates and their Statistical Microgeometry
1980-01-01
hot-film calibration. The free stream velocity is measured using the Ott current meter which gives a value of velocity integrated over the area...sampling interval of 190 μm for each of the recorded profiles. A summary of the statistical analysis for the surface is given in Table 7-15. A small
GREY STATISTICS METHOD OF TECHNOLOGY SELECTION FOR ADVANCED PUBLIC TRANSPORTATION SYSTEMS
Directory of Open Access Journals (Sweden)
Chien Hung WEI
2003-01-01
Full Text Available Taiwan is involved in intelligent transportation systems planning, and is now selecting its priority focus areas for investment and development. The high social and economic impact of the choice of intelligent transportation systems technology explains the efforts of various electronics and transportation corporations to develop such technology and expand their business opportunities. However, no detailed research has been conducted with regard to selecting technology for advanced public transportation systems in Taiwan. Thus, the present paper demonstrates a grey statistics method integrated with a scenario method for solving the problem of selecting advanced public transportation systems technology for Taiwan. A comprehensive questionnaire survey was conducted to demonstrate the effectiveness of the grey statistics method. The proposed approach indicated that contactless smart card technology is the appropriate technology for Taiwan to develop in the near future. The significance of our research results implies that the grey statistics method is an effective method for selecting advanced public transportation systems technologies. We feel our information will be beneficial to the private sector for developing an appropriate intelligent transportation systems technology strategy.
International Nuclear Information System (INIS)
Tsallis, C.; Valle, J.W.F.
1979-01-01
The use of the Variational Method to discuss Quantum Statistical Mechanics of anharmonic systems requires, in order to be able to obtain the correct classical limit, the allowance for renormalization of every operator whose definition depends on the harmonic coefficients. The point is exhibited for a single anharmonic oscillator. In this particular case there is no need for mass renormalization. (Author) [pt
Araya, Takao; Kubo, Takuya; von Wirén, Nicolaus; Takahashi, Hideki
2016-03-01
Plant root development is strongly affected by nutrient availability. Despite the importance of the structure and function of roots in nutrient acquisition, statistical modeling approaches to evaluate dynamic and temporal modulations of root system architecture in response to nutrient availability have remained widely open and exploratory areas in root biology. In this study, we developed a statistical modeling approach to investigate modulations of root system architecture in response to nitrogen availability. Mathematical models were designed for quantitative assessment of root growth and root branching phenotypes and their dynamic relationships, based on the hierarchical configuration of primary and lateral roots that formulates the fishbone-shaped root system architecture in Arabidopsis thaliana. Time-series datasets reporting dynamic changes in root developmental traits on different nitrate or ammonium concentrations were generated for statistical analyses. Regression analyses unraveled key parameters associated with: (i) inhibition of primary root growth under nitrogen limitation or on ammonium; (ii) rapid progression of lateral root emergence in response to ammonium; and (iii) inhibition of lateral root elongation in the presence of excess nitrate or ammonium. This study provides a statistical framework for interpreting dynamic modulation of root system architecture, supported by meta-analysis of datasets displaying morphological responses of roots to diverse nitrogen supplies. © 2015 Institute of Botany, Chinese Academy of Sciences.
Statistics of multi-tube detecting systems; Estadistica de sistemas de deteccion multitubo
Energy Technology Data Exchange (ETDEWEB)
Grau Carles, P.; Grau Malonda, A.
1994-07-01
In this paper three new statistical theorems are demonstrated and applied. These theorems greatly simplify the derivation of the formulae for computing the counting efficiency when the detection system is formed by several photomultipliers combined in coincidence and sum. The theorems are applied to several photomultiplier arrangements in order to show their potential and how they are used. (Author) 6 refs.
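The kind of counting-efficiency formulae involved can be illustrated with a simple binomial model of independent tubes; the per-event detection probability p and the arrangements below are assumptions for illustration, not the paper's theorems:

```python
from math import comb

def coincidence_efficiency(n_tubes, m_required, p):
    """Probability that at least m_required of n_tubes fire for an event,
    each tube firing independently with probability p (binomial sum)."""
    return sum(comb(n_tubes, k) * p**k * (1 - p)**(n_tubes - k)
               for k in range(m_required, n_tubes + 1))

# Common arrangements, with an assumed per-tube efficiency of 0.9:
double   = coincidence_efficiency(2, 2, 0.9)  # 2-tube coincidence
triple   = coincidence_efficiency(3, 3, 0.9)  # 3-tube coincidence
sum_2of3 = coincidence_efficiency(3, 2, 0.9)  # any 2 of 3 (sum-coincidence)
```

Requiring more tubes in coincidence lowers the efficiency (0.81 for double, 0.729 for triple) but suppresses background; the 2-of-3 arrangement recovers efficiency (0.972) while keeping a coincidence condition.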
DEFF Research Database (Denmark)
Perrotta, Serena; D'Odorico, Valentina; Prochaska, J. Xavier
2016-01-01
We statistically study the physical properties of a sample of narrow absorption line (NAL) systems, looking for empirical evidence to distinguish between intrinsic and intervening NALs without taking into account any a priori definition or velocity cut-off. We analyze the spectra of 100 quasars...
A multivariate statistical study on a diversified data gathering system for nuclear power plants
International Nuclear Information System (INIS)
Samanta, P.K.; Teichmann, T.; Levine, M.M.; Kato, W.Y.
1989-02-01
In this report, multivariate statistical methods are presented and applied to demonstrate their use in analyzing nuclear power plant operational data. For analyses of nuclear power plant events, approaches are presented for detecting malfunctions and degradations within the course of the event. At the system level, approaches are investigated as a means of diagnosis of system level performance. This involves the detection of deviations from normal performance of the system. The input data analyzed are the measurable physical parameters, such as steam generator level, pressurizer water level, auxiliary feedwater flow, etc. The study provides the methodology and illustrative examples based on data gathered from simulation of nuclear power plant transients and computer simulation of a plant system performance (due to lack of easily accessible operational data). Such an approach, once fully developed, can be used to explore statistically the detection of failure trends and patterns and prevention of conditions with serious safety implications. 33 refs., 18 figs., 9 tabs
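The detection of deviations from normal system performance can be sketched with a Mahalanobis-distance check against a baseline of normal operation; the two-parameter setup, baseline mean and covariance below are invented for illustration and are not from the report:

```python
def mahalanobis_2d(x, mean, cov):
    """Squared Mahalanobis distance of a 2-D observation from the baseline
    (the 2x2 covariance is inverted by hand to stay dependency-free)."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    dx = [x[0] - mean[0], x[1] - mean[1]]
    return (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
            + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))

# Hypothetical normal operation: steam generator level (%) and feedwater flow.
mean = [50.0, 100.0]
cov  = [[4.0, 1.0], [1.0, 9.0]]   # assumed covariance estimated from normal data

normal_obs  = [51.0, 101.0]       # close to baseline
anomaly_obs = [58.0, 85.0]        # joint deviation across both parameters

assert mahalanobis_2d(normal_obs, mean, cov) < mahalanobis_2d(anomaly_obs, mean, cov)
```

Unlike per-parameter thresholds, this multivariate distance flags combinations of values that are individually plausible but jointly abnormal.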
Shukla, Pragya
2004-01-01
We find that the statistics of levels undergoing a metal-insulator transition in systems with multi-parametric Gaussian disorders and non-interacting electrons behaves in a way similar to that of the single-parametric Brownian ensembles. The latter appear during a Poisson → Wigner-Dyson transition, driven by a random perturbation. The analogy provides analytical evidence for the single-parameter scaling of the level correlations in disordered systems as well as a tool to obtai...
Assaraf, Roland
2014-12-01
We show that the recently proposed correlated sampling without reweighting procedure extends the locality (asymptotic independence of the system size) of a physical property to the statistical fluctuations of its estimator. This makes the approach potentially vastly more efficient for computing space-localized properties in large systems compared with standard correlated methods. A proof is given for a large collection of noninteracting fragments. Calculations on hydrogen chains suggest that this behavior holds not only for systems displaying short-range correlations, but also for systems with long-range correlations.
International Nuclear Information System (INIS)
Guikema, Seth D.
2009-01-01
Probabilistic risk analysis has historically been developed for situations in which measured data about the overall reliability of a system are limited and expert knowledge is the best source of information available. There continue to be a number of important problem areas characterized by a lack of hard data. However, in other important problem areas the emergence of information technology has transformed the situation from one characterized by little data to one characterized by data overabundance. Natural disaster risk assessments for events impacting large-scale, critical infrastructure systems such as electric power distribution systems, transportation systems, water supply systems, and natural gas supply systems are important examples of problems characterized by data overabundance. There are often substantial amounts of information collected and archived about the behavior of these systems over time. Yet it can be difficult to effectively utilize these large data sets for risk assessment. Using this information for estimating the probability or consequences of system failure requires a different approach and analysis paradigm than risk analysis for data-poor systems does. Statistical learning theory, a diverse set of methods designed to draw inferences from large, complex data sets, can provide a basis for risk analysis for data-rich systems. This paper provides an overview of statistical learning theory methods and discusses their potential for greater use in risk analysis
A fuzzy expert system based on relations
International Nuclear Information System (INIS)
Hall, L.O.; Kandel, A.
1986-01-01
The Fuzzy Expert System (FESS) is an expert system which makes use of the theory of fuzzy relations to perform inference. Relations are very general and can be used for any application; only different types of relations need be implemented and used. The incorporation of fuzzy reasoning techniques enables the expert system to deal with imprecision in a well-founded manner. The knowledge is represented in relational frames. FESS may operate in either a forward chaining or backward chaining manner. It uses primarily implication and factual relations. A unique methodology for combination of evidence has been developed. It makes use of a blackboard for communication between the various knowledge sources, which may operate in parallel. The expert system has been designed in such a manner that it may be used for diverse applications.
Tornadoes and related damage costs: statistical modeling with a semi-Markov approach
Corini, Chiara; D'Amico, Guglielmo; Petroni, Filippo; Prattico, Flavio; Manca, Raimondo
2015-01-01
We propose a statistical approach to tornadoes modeling for predicting and simulating occurrences of tornadoes and accumulated cost distributions over a time interval. This is achieved by modeling the tornadoes intensity, measured with the Fujita scale, as a stochastic process. Since the Fujita scale divides tornadoes intensity into six states, it is possible to model the tornadoes intensity by using Markov and semi-Markov models. We demonstrate that the semi-Markov approach is able to reprod...
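A minimal sketch of the simulation idea, with a toy transition matrix, assumed per-state costs, and exponential holding times standing in for the general semi-Markov sojourn distributions of the paper:

```python
import random

random.seed(1)

STATES = range(6)  # Fujita scale F0..F5
# Toy transition matrix (every row identical here) and assumed mean cost (M$)
# per intensity state; a fitted model would estimate both from data.
P = [[0.5, 0.3, 0.1, 0.05, 0.03, 0.02]] * 6
COST = [0.1, 0.5, 2.0, 10.0, 40.0, 150.0]

def simulate(n_events, state=0):
    """Simulate a chain of tornado intensities and the accumulated cost.
    In a full semi-Markov model the holding-time law may depend on the
    states; here it is simply exponential with unit rate."""
    total_cost, t = 0.0, 0.0
    for _ in range(n_events):
        state = random.choices(STATES, weights=P[state])[0]
        t += random.expovariate(1.0)      # inter-event (holding) time
        total_cost += COST[state]
    return t, total_cost

horizon, cost = simulate(1000)
```

Repeating `simulate` many times yields an empirical accumulated-cost distribution over a time interval, which is the quantity the authors model.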
Study of energy fluctuation effect on the statistical mechanics of equilibrium systems
International Nuclear Information System (INIS)
Lysogorskiy, Yu V; Wang, Q A; Tayurskii, D A
2012-01-01
This work is devoted to modeling the effect of energy fluctuations on the behavior of small classical thermodynamic systems. It is known that as an equilibrium system gets smaller and smaller, one of the major quantities that becomes more and more uncertain is its internal energy. These increasing fluctuations can considerably modify the original statistics. The present model considers the effect of such energy fluctuations and is based on an overlap between the Boltzmann-Gibbs statistics and the statistics of the fluctuation. Within this 'overlap statistics', we studied the effects of several types of energy fluctuations on the probability distribution, internal energy and heat capacity. It was shown that the fluctuations can considerably change the temperature dependence of the internal energy and heat capacity in the low energy range and at low temperatures. In particular, it was found that, due to the lower energy limit of the systems, the fluctuations reduce the probability of the low energy states close to the lowest energy and increase the total average energy. This energy increase is larger at lower temperatures, making a negative heat capacity possible in this case.
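The qualitative effect can be sketched numerically by averaging Boltzmann factors over a Gaussian fluctuation kernel truncated at the system's lowest energy; the four-level spectrum and all parameters below are a toy choice, not the authors' model:

```python
import math

# Toy discrete energy levels with the lowest level at E = 0.
levels = [0.0, 1.0, 2.0, 3.0]

def overlap_weights(beta, sigma, n_grid=2001):
    """Occupation probabilities when each level's energy fluctuates with a
    Gaussian of width sigma, truncated below the lower bound E >= 0."""
    weights = []
    for E in levels:
        num, norm = 0.0, 0.0
        for i in range(n_grid):
            e = E - 5 * sigma + 10 * sigma * i / (n_grid - 1)
            if e < 0.0:               # lower energy limit of the system
                continue
            g = math.exp(-0.5 * ((e - E) / sigma) ** 2)
            num += g * math.exp(-beta * e)
            norm += g
        weights.append(num / norm)
    Z = sum(weights)
    return [w / Z for w in weights]

plain = overlap_weights(beta=1.0, sigma=1e-6)  # ~ pure Boltzmann-Gibbs
fluct = overlap_weights(beta=1.0, sigma=1.0)   # strong energy fluctuations
```

Because the fluctuation kernel of the lowest level is cut off at E = 0, its relative weight drops (`fluct[0] < plain[0]`), mirroring the abstract's finding that the lower energy limit suppresses the states near the lowest energy and raises the average energy.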
International Nuclear Information System (INIS)
Beck, W.
1984-01-01
The complexity of computer programs for the solution of scientific and technical problems raises many questions. Typical questions concern the strengths and weaknesses of computer programs, the propagation of uncertainties among the input data, the sensitivity of output data to input data, and the substitution of complex models by simpler ones which provide equivalent results in certain ranges. These questions have a general practical meaning; principal answers may be found by statistical methods based on the Monte Carlo method. In this report suitable statistical methods are chosen, described and evaluated. They are implemented in the modular program system STAR, which is a component of the program system RSYST. The design of STAR accommodates users with different knowledge of data processing and statistics, offers a variety of statistical methods and generating and evaluating procedures, handles large data sets in complex structures, couples to other components of RSYST and to programs outside RSYST, and allows the system to be easily modified and enlarged. Four examples are given which demonstrate the application of STAR. (orig.) [de]
Gòmez, Miguel-Ángel; Lorenzo, Alberto; Ortega, Enrique; Sampaio, Jaime; Ibàñez, Sergio-José
2009-01-01
The aim of the present study was to identify the game-related statistics that allow discriminating between starters and nonstarter players in women’s basketball when related to winning or losing games and best or worst teams. The sample comprised all 216 regular season games from the 2005 Women’s National Basketball Association League (WNBA). The game-related statistics included were 2- and 3-point field-goals (both successful and unsuccessful), free-throws (both successful and unsuccessful), defensive and offensive rebounds, assists, blocks, fouls, steals, turnovers and minutes played. Results from multivariate analysis showed that when best teams won, the discriminant game-related statistics were successful 2-point field-goals (SC = 0.47), successful free-throws (SC = 0.44), fouls (SC = -0.41), assists (SC = 0.37), and defensive rebounds (SC = 0.37). When the worst teams won, the discriminant game-related statistics were successful 2-point field-goals (SC = 0.37), successful free-throws (SC = 0.45), assists (SC = 0.58), and steals (SC = 0.35). The results showed that the successful 2-point field-goals, successful free-throws and the assists were the most powerful variables discriminating between starters and nonstarters. These specific characteristics helped to point out the importance of starters’ shooting and passing ability during competitions. Key points The players’ game-related statistical profile varied according to team status, game outcome and team quality in women’s basketball. The results of this work help to point out the differences in players’ performance in women’s basketball compared with men’s basketball. The results obtained enhance the importance of starters’ and nonstarters’ contribution to team performance in different game contexts. Results showed the power of successful 2-point field-goals, successful free-throws and assists discriminating between starters and nonstarters in all the analyses. PMID:24149538
Directory of Open Access Journals (Sweden)
Masoud Ghodrati
2016-12-01
Full Text Available Humans are fast and accurate in categorizing complex natural images. It is, however, unclear what features of visual information are exploited by the brain to perceive the images with such speed and accuracy. It has been shown that low-level contrast statistics of natural scenes can explain the variance of the amplitude of event-related potentials (ERPs) in response to rapidly presented images. In this study, we investigated the effect of these statistics on the frequency content of ERPs. We recorded ERPs from human subjects while they viewed natural images, each presented for 70 ms. Our results showed that Weibull contrast statistics, as a biologically plausible model, explained the variance of the ERPs best, compared to the other image statistics that we assessed. Our time-frequency analysis revealed a significant correlation between these statistics and the ERPs’ power within the theta frequency band (~3-7 Hz). This is interesting, as the theta band is believed to be involved in context updating and semantic encoding. This correlation became significant at ~110 ms after stimulus onset, and peaked at 138 ms. Our results show that not only the amplitude but also the frequency of neural responses can be modulated by low-level contrast statistics of natural images, and highlight their potential role in scene perception.
Statistics associated with an elemental analysis system of particles induced by X-ray emission
International Nuclear Information System (INIS)
Romo K, C.M.
1987-01-01
In quantitative elemental analysis by X-ray techniques one works with spectra that exhibit fluctuations of a statistical nature both in the energy and in the number of accumulated counts. To process these data into a quantitative result, detailed knowledge of the associated statistical distributions is needed. In this work, 1) the photon-counting statistics of the system and 2) the distribution of the results as a function of the energy are analyzed. The former is important for defining expected values and uncertainties and for spectrum simulation (Mukoyama, 1975). The latter is fundamental for determining the contribution of each spectral line. (M.R.) [es]
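The photon-counting statistics referred to here are Poissonian; a quick numerical sketch (using Knuth's sampling algorithm) shows the defining property that the variance equals the mean, so an accumulated count N carries an uncertainty of about sqrt(N):

```python
import math
import random

random.seed(0)

def poisson(lam):
    """Knuth's algorithm: sample a Poisson-distributed count with mean lam."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Simulate 2000 counting measurements with an expected count of 100.
counts = [poisson(100.0) for _ in range(2000)]
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
# mean ~ 100 and var ~ 100: relative uncertainty of one count ~ 1/sqrt(100) = 10%
```

This is why accumulating more counts per channel directly tightens the uncertainty of each spectral-line area.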
Elementary methods for statistical systems, mean field, large-n, and duality
International Nuclear Information System (INIS)
Itzykson, C.
1983-01-01
Renormalizable field theories are singled out by such precise constraints that regularization schemes must be used to break these invariances. Statistical methods can be adapted to these problems where asymptotically free models fail. This lecture surveys approximation schemes developed in the context of statistical mechanics. The confluence point of statistical mechanics and field theory is the use of discretized path integrals, where continuous space-time has been replaced by a regular lattice. Dynamic variables, a Boltzmann weight factor, and boundary conditions are the ingredients. Mean field approximations (field equations, the random field transform, and gauge invariant systems) are surveyed. In the large-N limit, vector models are found to simplify tremendously. The reasons why matrix models drawn from SU(n) gauge theories do not simplify are discussed. In the epilogue, random curves versus random surfaces are offered as an example where global and local symmetries are not alike.
MAI statistics estimation and analysis in a DS-CDMA system
Alami Hassani, A.; Zouak, M.; Mrabti, M.; Abdi, F.
2018-05-01
A primary limitation of direct-sequence code division multiple access (DS-CDMA) link performance and system capacity is multiple access interference (MAI). To examine the performance of CDMA systems in the presence of MAI, i.e., in a multiuser environment, several works assumed that the interference can be approximated by a Gaussian random variable. In this paper, we first develop a new and simple approach to characterize the MAI in a multiuser system. In addition to statistically quantifying the MAI power, the paper also proposes a statistical model for both the variance and the mean of the MAI for synchronous and asynchronous CDMA transmission. We show that the MAI probability density function (PDF) is Gaussian for the equal-received-energy case and validate this by computer simulations.
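The Gaussian approximation of the MAI can be checked with a small Monte Carlo; the spreading length N, the number of users K, and the equal-energy synchronous setup below are assumptions for illustration, not the paper's parameters:

```python
import random
import statistics

random.seed(42)

N = 31   # spreading-sequence length in chips (assumed)
K = 10   # active users, so K - 1 interferers (assumed)

def mai_sample():
    """One matched-filter MAI sample: each interferer contributes the
    correlation of two independent random +/-1 chip sequences."""
    total = 0
    for _ in range(K - 1):
        total += sum(random.choice((-1, 1)) * random.choice((-1, 1))
                     for _ in range(N))
    return total

samples = [mai_sample() for _ in range(5000)]
mai_mean = statistics.fmean(samples)
mai_var = statistics.pvariance(samples)
# Equal-energy synchronous case: MAI mean ~ 0 and variance ~ (K - 1) * N,
# and by the central limit theorem the PDF is close to Gaussian.
```

Each MAI sample is a sum of (K-1)*N independent ±1 terms, which is why the Gaussian approximation becomes accurate as K and N grow.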
Won, Chang-Hee; Michel, Anthony N
2008-01-01
This volume - dedicated to Michael K. Sain on the occasion of his seventieth birthday - is a collection of chapters covering recent advances in stochastic optimal control theory and algebraic systems theory. Written by experts in their respective fields, the chapters are thematically organized into four parts: Part I focuses on statistical control theory, where the cost function is viewed as a random variable and performance is shaped through cost cumulants. In this respect, statistical control generalizes linear-quadratic-Gaussian and H-infinity control. Part II addresses algebraic systems th
Directory of Open Access Journals (Sweden)
Takahiro eKawabe
2013-09-01
Full Text Available Humans can acquire the statistical features of the external world and employ them to control behaviors. Some external events occur in harmony with an agent’s action, and thus humans should also be able to acquire the statistical features between an action and its external outcome. We report that the acquired action-outcome statistical features alter the visual appearance of the action outcome. Pressing either of two assigned keys triggered visual motion whose direction was statistically biased either upward or downward, and observers judged the stimulus motion direction. Points of subjective equality (PSE) for judging motion direction were shifted repulsively from the mean of the distribution associated with each key. Our Bayesian model accounted for the PSE shifts, indicating the optimal acquisition of the action-outcome statistical relation. The PSE shifts were moderately attenuated when the action-outcome contingency was reduced. The Bayesian model again accounted for the attenuated PSE shifts. On the other hand, when the action-outcome contiguity was greatly reduced, the PSE shifts were greatly attenuated; however, the Bayesian model could not account for the shifts. The results indicate that visual appearance can be modified by prediction based on the optimal acquisition of the action-outcome causal relation.
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to conduct a study that assesses process improvement, quality management and analytical techniques taught to students in U.S. college and university undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs during their academic training that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, quantitative management techniques, and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis which identifies the gaps between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization
Finding the Root Causes of Statistical Inconsistency in Community Earth System Model Output
Milroy, D.; Hammerling, D.; Baker, A. H.
2017-12-01
Baker et al (2015) developed the Community Earth System Model Ensemble Consistency Test (CESM-ECT) to provide a metric for software quality assurance by determining statistical consistency between an ensemble of CESM outputs and new test runs. The test has proved useful for detecting statistical difference caused by compiler bugs and errors in physical modules. However, detection is only the necessary first step in finding the causes of statistical difference. The CESM is a vastly complex model comprised of millions of lines of code which is developed and maintained by a large community of software engineers and scientists. Any root cause analysis is correspondingly challenging. We propose a new capability for CESM-ECT: identifying the sections of code that cause statistical distinguishability. The first step is to discover CESM variables that cause CESM-ECT to classify new runs as statistically distinct, which we achieve via Randomized Logistic Regression. Next we use a tool developed to identify CESM components that define or compute the variables found in the first step. Finally, we employ the application Kernel GENerator (KGEN) created in Kim et al (2016) to detect fine-grained floating point differences. We demonstrate an example of the procedure and advance a plan to automate this process in our future work.
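The first step, discovering which variables drive the consistent/distinct classification, can be sketched with stability selection on subsamples; here a simple univariate mean-difference screen stands in for the L1-penalized (randomized) logistic regression the authors use, and the data are synthetic:

```python
import random

random.seed(7)

# Synthetic data: 200 runs, 6 variables; only variable 0 actually
# separates "consistent" (y = 0) from "distinct" (y = 1) runs.
n, p = 200, 6
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
y = [1 if X[i][0] + random.gauss(0, 0.3) > 0 else 0 for i in range(n)]

def screen(rows):
    """Pick the variable with the largest |mean difference| between classes
    (a stand-in for the per-subsample L1-logistic selection step)."""
    best, best_score = None, -1.0
    for j in range(p):
        g0 = [X[i][j] for i in rows if y[i] == 0]
        g1 = [X[i][j] for i in rows if y[i] == 1]
        score = abs(sum(g1) / len(g1) - sum(g0) / len(g0))
        if score > best_score:
            best, best_score = j, score
    return best

# Stability selection: repeat on random half-samples, count selections.
freq = [0] * p
for _ in range(100):
    rows = random.sample(range(n), n // 2)
    freq[screen(rows)] += 1
```

Variables selected in almost every subsample (here variable 0) are the stable candidates to hand to the component-tracing and KGEN steps.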
Exercise alleviates depression related systemic inflammation in ...
African Journals Online (AJOL)
Exercise alleviates depression related systemic inflammation in chronic obstructive pulmonary disease patients. ... African Health Sciences ... Currently, physical activity is an important lifestyle factor that has the potential to modify inflammatory ...
A model of seismic focus and related statistical distributions of earthquakes
International Nuclear Information System (INIS)
Apostol, Bogdan-Felix
2006-01-01
A growth model for accumulating seismic energy in a localized seismic focus is described, which introduces a fractional parameter r on geometrical grounds. The model is employed for deriving a power-type law for the statistical distribution in energy, where the parameter r contributes to the exponent, as well as corresponding time and magnitude distributions for earthquakes. The accompanying seismic activity of foreshocks and aftershocks is discussed in connection with this approach, as based on Omori distributions, and the rate of released energy is derived
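The power-type magnitude distribution and the Omori-law seismicity rate mentioned here can be illustrated with the standard Gutenberg-Richter and Omori forms; note the paper derives its own exponent involving the fractional parameter r, which this sketch does not model:

```python
import math
import random

random.seed(3)

# Gutenberg-Richter-type sampling: P(M > m) = 10**(-b * (m - m_min))
b = 1.0
def sample_magnitude(m_min=2.0):
    u = random.random()
    return m_min - math.log10(u) / b   # inverse-CDF sampling

# Omori law for the aftershock rate: n(t) = K / (c + t)**p
def omori_rate(t, K=100.0, c=0.1, p=1.1):
    return K / (c + t) ** p

mags = [sample_magnitude() for _ in range(20000)]
frac_above_3 = sum(m > 3.0 for m in mags) / len(mags)
# With b = 1, each extra magnitude unit means a tenfold drop in frequency,
# so about 10% of events above m_min = 2 should exceed magnitude 3.
```

The K, c, p and b values are illustrative defaults; in practice they are fitted to a catalog of fore- and aftershocks.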
Towards a best practice of modeling unit of measure and related statistical metadata
Grossmann, Wilfried
2011-01-01
Data and metadata exchange between organizations requires a common language for describing the structure and content of statistical data and metadata. The SDMX consortium develops content oriented guidelines (COG) recommending harmonized cross-domain concepts and terminology to increase the efficiency of (meta-) data exchange. A recent challenge is a recommended code list for the unit of measure. Based on examples from SDMX sponsor organizations this paper analyses the diversity of "unit of measure" as used in practice, including potential breakdowns and interdependencies of the respective meta-
Statistical methods for including two-body forces in large system calculations
International Nuclear Information System (INIS)
Grimes, S.M.
1980-07-01
Large systems of interacting particles are often treated by assuming that the effect on any one particle of the remaining N-1 may be approximated by an average potential. This approach reduces the problem to that of finding the bound-state solutions for a particle in a potential; statistical mechanics is then used to obtain the properties of the many-body system. In some physical systems this approach may not be acceptable, because the two-body force component cannot be treated in this one-body limit. A technique for incorporating two-body forces in such calculations in a more realistic fashion is described. 1 figure
A Statistical Dynamic Approach to Structural Evolution of Complex Capital Market Systems
Shao, Xiao; Chai, Li H.
As an important part of modern financial systems, capital market has played a crucial role on diverse social resource allocations and economical exchanges. Beyond traditional models and/or theories based on neoclassical economics, considering capital markets as typical complex open systems, this paper attempts to develop a new approach to overcome some shortcomings of the available researches. By defining the generalized entropy of capital market systems, a theoretical model and nonlinear dynamic equation on the operations of capital market are proposed from statistical dynamic perspectives. The US security market from 1995 to 2001 is then simulated and analyzed as a typical case. Some instructive results are discussed and summarized.
Energy Technology Data Exchange (ETDEWEB)
de Supinski, B R; Miller, B P; Liblit, B
2011-09-13
Petascale platforms with O(10{sup 5}) and O(10{sup 6}) processing cores are driving advancements in a wide range of scientific disciplines. These large systems create unprecedented application development challenges. Scalable correctness tools are critical to shorten the time-to-solution on these systems. Currently, many DOE application developers use primitive manual debugging based on printf or traditional debuggers such as TotalView or DDT. This paradigm breaks down beyond a few thousand cores, yet bugs often arise above that scale. Programmers must reproduce problems in smaller runs to analyze them with traditional tools, or else perform repeated runs at scale using only primitive techniques. Even when traditional tools run at scale, the approach wastes substantial effort and computation cycles. Continued scientific progress demands new paradigms for debugging large-scale applications. The Correctness on Petascale Systems (CoPS) project is developing a revolutionary debugging scheme that will reduce the debugging problem to a scale that human developers can comprehend. The scheme can provide precise diagnoses of the root causes of failure, including suggestions of the location and the type of errors down to the level of code regions or even a single execution point. Our fundamentally new strategy combines and expands three relatively new complementary debugging approaches. The Stack Trace Analysis Tool (STAT), a 2011 R&D 100 Award Winner, identifies behavior equivalence classes in MPI jobs and highlights behavior when elements of the class demonstrate divergent behavior, often the first indicator of an error. The Cooperative Bug Isolation (CBI) project has developed statistical techniques for isolating programming errors in widely deployed code that we will adapt to large-scale parallel applications. Finally, we are developing a new approach to parallelizing expensive correctness analyses, such as analysis of memory usage in the Memgrind tool. In the first two
International Nuclear Information System (INIS)
Xiu Yan; Shi Hongcheng; Liu Wenguan; Chen Xuefen; Gu Yushen; Chen Shuguang; Yu Haojun; Yu Yiping
2010-01-01
Objective: To investigate the cerebral blood flow (CBF) perfusion patterns and related factors in hyperthyroidism patients. Methods: Twenty-five patients with hyperthyroidism and twenty-two healthy controls matched for age, sex and education were enrolled. 99mTc-ethylene cysteinate dimer (ECD) SPECT CBF perfusion imaging was performed at rest. Statistical parametric mapping 5.0 software (SPM5) was used with a statistical threshold of P<0.05. Regional CBF (rCBF) was correlated with serum thyroid hormones (FT3, FT4) and with thyroid autoimmune antibodies: sensitive thyroid stimulating hormone (sTSH), thyroid peroxidase antibody (TPOAb) and TSH receptor antibody (TRAb) by Pearson analysis, and with disease duration by Spearman analysis. Results: rCBF was decreased significantly in the limbic system and frontal lobe, including parahippocampal gyrus, uncus (posterior entorhinal cortex, posterior parolfactory cortex, parahippocampal cortex, anterior cingulate, right inferior temporal gyrus), left hypothalamus and caudate nucleus (P<0.05). rCBF was negatively correlated with FT3 in some regions (r=-0.468, -0.417, both P<0.05) and with FT4 in others (r=-0.4M, -0.418, -0.415, -0.459, all P<0.05), and positively correlated with FT4 elsewhere (r=0.419, 0.412, both P<0.05). rCBF in left insula was negatively correlated with concentration of sTSH, and right auditory associated cortex was positively correlated with concentration of sTSH (r=-0.504, 0.429, both P<0.05). rCBF in left middle temporal gyrus, left angular gyrus was positively correlated with concentration of TRAb while that in right thalamus, right hypothalamus, left anterior nucleus, left ventralis nucleus was negatively correlated with concentration of TRAb (r=0.750, 0.862, -0.691, -0.835, -0.713, -0.759, all P<0.05). rCBF in right anterior cingulate, right cuneus, right rectus gyrus, right superior marginal gyrus was positively correlated with concentration of TPOAb (r=0.696, 0.581, 0.779, 0.683, all P<0.05). rCBF in postcentral gyrus, temporal gyrus, left superior marginal gyrus and auditory associated cortex was positively correlated with disease duration (r=0.502, 0.457, 0.524, 0.440, all P<0.05). Conclusion: Hypoperfusions in
Tornadoes and related damage costs: statistical modelling with a semi-Markov approach
Directory of Open Access Journals (Sweden)
Guglielmo D’Amico
2016-09-01
Full Text Available We propose a statistical approach for predicting and simulating tornado occurrences and accumulated cost distributions over a time interval. This is achieved by modelling the tornado intensity, measured on the Fujita scale, as a stochastic process. Since the Fujita scale divides tornado intensity into six states, the intensity can be modelled using Markov and semi-Markov models. We demonstrate that the semi-Markov approach is able to reproduce the duration effect that is detected in tornado occurrence. The superiority of the semi-Markov model over the Markov chain model is also affirmed by means of a statistical hypothesis test. As an application, we compute the expected value and the variance of the costs generated by tornadoes over a given time interval in a given area. The paper contributes to the literature by demonstrating that semi-Markov models represent an effective tool both for the physical analysis of tornadoes and for the estimation of the economic damage they cause.
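The semi-Markov costing idea above can be sketched numerically. The transition matrix, mean sojourn times and per-event costs below are invented placeholders, not the quantities estimated in the paper, and exponential waiting times are used for brevity (a Weibull or other sojourn law gives the genuinely semi-Markov case):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical 6-state (F0..F5) embedded transition matrix -- illustrative
# only, not the matrix estimated in the study. Rows sum to 1.
P = np.array([
    [0.00, 0.60, 0.25, 0.10, 0.04, 0.01],
    [0.55, 0.00, 0.30, 0.10, 0.04, 0.01],
    [0.40, 0.35, 0.00, 0.18, 0.05, 0.02],
    [0.30, 0.30, 0.25, 0.00, 0.10, 0.05],
    [0.25, 0.25, 0.25, 0.15, 0.00, 0.10],
    [0.20, 0.20, 0.20, 0.20, 0.20, 0.00],
])
mean_wait = np.array([5.0, 8.0, 15.0, 30.0, 60.0, 120.0])  # mean sojourn (days), assumed
mean_cost = np.array([0.01, 0.1, 1.0, 10.0, 50.0, 200.0])  # cost per event ($M), assumed

def simulate_costs(horizon, state=0):
    """Accumulate event costs over `horizon` days. Exponential waits make
    this a Markov jump process; swapping in another sojourn distribution
    yields the semi-Markov case studied in the paper."""
    t, total = 0.0, 0.0
    while True:
        t += rng.exponential(mean_wait[state])
        if t > horizon:
            return total
        total += mean_cost[state]
        state = rng.choice(6, p=P[state])

costs = np.array([simulate_costs(365.0) for _ in range(2000)])
print(f"expected yearly cost ~ {costs.mean():.1f} $M, variance {costs.var():.1f}")
```

Replacing `rng.exponential` with a state-dependent Weibull draw reproduces the duration effect the authors test for.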
Willard, Melissa A Bodnar; McGuffin, Victoria L; Smith, Ruth Waddell
2012-01-01
Salvia divinorum is a hallucinogenic herb that is internationally regulated. In this study, salvinorin A, the active compound in S. divinorum, was extracted from S. divinorum plant leaves using a 5-min extraction with dichloromethane. Four additional Salvia species (Salvia officinalis, Salvia guaranitica, Salvia splendens, and Salvia nemorosa) were extracted using this procedure, and all extracts were analyzed by gas chromatography-mass spectrometry. Differentiation of S. divinorum from other Salvia species was successful based on visual assessment of the resulting chromatograms. To provide a more objective comparison, the total ion chromatograms (TICs) were subjected to principal components analysis (PCA). Prior to PCA, the TICs were subjected to a series of data pretreatment procedures to minimize non-chemical sources of variance in the data set. Successful discrimination of S. divinorum from the other four Salvia species was possible based on visual assessment of the PCA scores plot. To provide a numerical assessment of the discrimination, a series of statistical procedures such as Euclidean distance measurement, hierarchical cluster analysis, Student's t tests, Wilcoxon rank-sum tests, and Pearson product moment correlation were also applied to the PCA scores. The statistical procedures were then compared to determine the advantages and disadvantages for forensic applications.
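The pretreatment-plus-PCA pipeline described above can be sketched as follows; the mock chromatograms are synthetic stand-ins for real GC-MS total ion chromatograms, and the peak positions and class labels are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Mock total-ion chromatograms: 5 "species", 4 replicate extracts each,
# 300 scans. Peak positions/heights are invented stand-ins for GC-MS data.
scans = np.arange(300)
species_peaks = [(50, 1.0), (90, 0.8), (140, 1.2), (200, 0.9), (250, 1.1)]
tics = []
for centre, height in species_peaks:
    for _ in range(4):
        tic = height * np.exp(-0.5 * ((scans - centre) / 6.0) ** 2)
        tics.append(tic + rng.normal(0, 0.02, scans.size))
X = np.array(tics)                      # 20 samples x 300 variables

# Pretreatment: total-area normalisation then mean-centring, to suppress
# non-chemical variance before PCA.
X = X / X.sum(axis=1, keepdims=True)
X = X - X.mean(axis=0)

# PCA via SVD; scores = U * S.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * S

# Euclidean distance between class centroids on the first two PCs,
# a numerical analogue of the scores-plot assessment.
c0 = scores[:4, :2].mean(axis=0)        # "S. divinorum" replicates
c1 = scores[4:8, :2].mean(axis=0)       # another species
print("centroid separation:", np.linalg.norm(c0 - c1))
```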
Energy Technology Data Exchange (ETDEWEB)
Wallace, Jack, E-mail: jack.wallace@ce.queensu.ca [Department of Civil Engineering, Queen’s University, Ellis Hall, 58 University Avenue, Kingston, Ontario K7L 3N6 (Canada); Champagne, Pascale, E-mail: champagne@civil.queensu.ca [Department of Civil Engineering, Queen’s University, Ellis Hall, 58 University Avenue, Kingston, Ontario K7L 3N6 (Canada); Monnier, Anne-Charlotte, E-mail: anne-charlotte.monnier@insa-lyon.fr [National Institute for Applied Sciences – Lyon, 20 Avenue Albert Einstein, 69621 Villeurbanne Cedex (France)
2015-01-15
Highlights: • Performance of a hybrid passive landfill leachate treatment system was evaluated. • 33 water chemistry parameters were sampled for 21 months and statistically analyzed. • Parameters were strongly linked and explained most (>40%) of the variation in the data. • Alkalinity, ammonia, COD, heavy metals, and iron were criteria for performance. • Eight other parameters were key in modeling system dynamics and criteria. - Abstract: A pilot-scale hybrid-passive treatment system operated at the Merrick Landfill in North Bay, Ontario, Canada, treats municipal landfill leachate and provides for subsequent natural attenuation. Collected leachate is directed to a hybrid-passive treatment system, followed by controlled release to a natural attenuation zone before entering the nearby Little Sturgeon River. The study presents a comprehensive evaluation of the performance of the system using multivariate statistical techniques to determine the interactions between parameters, major pollutants in the leachate, and the biological and chemical processes occurring in the system. Five parameters (ammonia, alkalinity, chemical oxygen demand (COD), “heavy” metals of interest (with atomic weights above calcium), and iron) were set as criteria for the evaluation of system performance, based on their toxicity to aquatic ecosystems and their importance in treatment with respect to discharge regulations. System data for a full range of water quality parameters over a 21-month period were analyzed using principal components analysis (PCA), as well as principal components (PC) and partial least squares (PLS) regressions. PCA indicated a high degree of association for most parameters with the first PC, which explained a high percentage (>40%) of the variation in the data, suggesting strong statistical relationships among most of the parameters in the system. Regression analyses identified 8 parameters (set as independent variables) that were most frequently retained for modeling
A Relational Database System for Student Use.
Fertuck, Len
1982-01-01
Describes an APL implementation of a relational database system suitable for use in a teaching environment in which database development and database administration are studied, and discusses the functions of the user and the database administrator. An appendix illustrating system operation and an eight-item reference list are attached. (Author/JL)
Principle of maximum Fisher information from Hardy's axioms applied to statistical systems.
Frieden, B Roy; Gatenby, Robert A
2013-10-01
Consider a finite-sized, multidimensional system in parameter state a. The system is either at statistical equilibrium or general nonequilibrium, and may obey either classical or quantum physics. L. Hardy's mathematical axioms provide a basis for the physics obeyed by any such system. One axiom is that the number N of distinguishable states a in the system obeys N=max. This assumes that N is known as deterministic prior knowledge. However, most observed systems suffer statistical fluctuations, for which N is therefore only known approximately. Then what happens if the scope of the axiom N=max is extended to include such observed systems? It is found that the state a of the system must obey a principle of maximum Fisher information, I=I(max). This is important because many physical laws have been derived, assuming as a working hypothesis that I=I(max). These derivations include uses of the principle of extreme physical information (EPI). Examples of such derivations were of the De Broglie wave hypothesis, quantum wave equations, Maxwell's equations, new laws of biology (e.g., of Coulomb force-directed cell development and of in situ cancer growth), and new laws of economic fluctuation and investment. That the principle I=I(max) itself derives from suitably extended Hardy axioms thereby eliminates its need to be assumed in these derivations. Thus, uses of I=I(max) and EPI express physics at its most fundamental level, its axiomatic basis in math.
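As a concrete anchor for the Fisher information I that the principle maximizes, here is a minimal numerical check for a normal location parameter, where I = 1/σ² exactly; the parameter values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)

# Fisher information of a normal location parameter, I(theta) = 1/sigma^2,
# estimated as the expected squared score E[(d/dtheta ln p(x|theta))^2].
theta, sigma = 2.0, 0.5
x = rng.normal(theta, sigma, 200_000)
score = (x - theta) / sigma**2          # d/dtheta of ln p for N(theta, sigma^2)
I_hat = np.mean(score**2)

print(f"estimated I = {I_hat:.3f}, exact 1/sigma^2 = {1 / sigma**2:.3f}")
```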
Nonlinear Fluctuation Behavior of Financial Time Series Model by Statistical Physics System
Directory of Open Access Journals (Sweden)
Wuyang Cheng
2014-01-01
Full Text Available We develop a random financial time series model of the stock market based on a statistical physics system, the stochastic contact interacting system. The contact process is a continuous-time Markov process; one interpretation of this model is as a model for the spread of an infection, where the epidemic spreading mimics the interplay of local infections and recovery of individuals. From this financial model, we study the statistical behaviors of return time series, and the corresponding behaviors of returns for the Shanghai Stock Exchange Composite Index (SSECI) and the Hang Seng Index (HSI) are also comparatively studied. Further, we investigate the Zipf distribution and multifractal phenomenon of returns and price changes. Zipf analysis and MF-DFA analysis are applied to investigate the nature of the fluctuations of the stock market.
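A Zipf rank-size analysis of returns, as mentioned above, can be sketched as follows; the Student-t series is a synthetic heavy-tailed stand-in for the SSECI/HSI returns, not market data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Heavy-tailed mock "returns" (Student-t, 3 degrees of freedom).
returns = rng.standard_t(df=3, size=5000)

# Zipf plot: sort absolute returns in descending order and regress
# log-size on log-rank over the tail, where a power law would appear
# as a straight line.
sizes = np.sort(np.abs(returns))[::-1]
ranks = np.arange(1, sizes.size + 1)
k = sizes.size // 10                    # fit only the top 10% (the tail)
slope, intercept = np.polyfit(np.log(ranks[:k]), np.log(sizes[:k]), 1)
print(f"Zipf tail exponent estimate: {slope:.2f}")
```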
Path integral molecular dynamics for exact quantum statistics of multi-electronic-state systems.
Liu, Xinzijian; Liu, Jian
2018-03-14
An exact approach to compute physical properties for general multi-electronic-state (MES) systems in thermal equilibrium is presented. The approach is extended from our recent progress on path integral molecular dynamics (PIMD), Liu et al. [J. Chem. Phys. 145, 024103 (2016)] and Zhang et al. [J. Chem. Phys. 147, 034109 (2017)], for quantum statistical mechanics when a single potential energy surface is involved. We first define an effective potential function that is numerically favorable for MES-PIMD and then derive corresponding estimators in MES-PIMD for evaluating various physical properties. Its application to several representative one-dimensional and multi-dimensional models demonstrates that MES-PIMD in principle offers a practical tool in either of the diabatic and adiabatic representations for studying exact quantum statistics of complex/large MES systems when the Born-Oppenheimer approximation, Condon approximation, and harmonic bath approximation are broken.
Bell Correlations in a Many-Body System with Finite Statistics
Wagner, Sebastian; Schmied, Roman; Fadel, Matteo; Treutlein, Philipp; Sangouard, Nicolas; Bancal, Jean-Daniel
2017-10-01
A recent experiment reported the first violation of a Bell correlation witness in a many-body system [Science 352, 441 (2016)]. Following discussions in this Letter, we address here the question of the statistics required to witness Bell correlated states, i.e., states violating a Bell inequality, in such experiments. We start by deriving multipartite Bell inequalities involving an arbitrary number of measurement settings, two outcomes per party and one- and two-body correlators only. Based on these inequalities, we then build up improved witnesses able to detect Bell correlated states in many-body systems using two collective measurements only. These witnesses can potentially detect Bell correlations in states with an arbitrarily low amount of spin squeezing. We then establish an upper bound on the statistics needed to convincingly conclude that a measured state is Bell correlated.
An approach to build knowledge base for reactor accident diagnostic system using statistical method
International Nuclear Information System (INIS)
Kohsaka, Atsuo; Yokobayashi, Masao; Matsumoto, Kiyoshi; Fujii, Minoru
1988-01-01
In the development of a rule-based expert system, one of the key issues is how to build a knowledge base (KB). A systematic approach has been attempted for building an objective KB efficiently. The approach is based on the concept that a prototype KB should first be generated in a systematic way and then be modified and/or improved by experts for practical use. The statistical method of factor analysis was applied to build a prototype KB for the JAERI expert system DISKET, using source information obtained from a PWR simulator. The prototype KB was obtained and inference with this KB was performed against several types of transients. In each diagnosis, the transient type was well identified. From this study, it is concluded that the statistical method used is useful for building a prototype knowledge base. (author)
Morphology of Laplacian growth processes and statistics of equivalent many-body systems
International Nuclear Information System (INIS)
Blumenfeld, R.
1994-01-01
The author proposes a theory for the nonlinear evolution of two-dimensional interfaces in Laplacian fields. The growing region is conformally mapped onto the unit disk, generating an equivalent many-body system whose dynamics and statistics are studied. The process is shown to be Hamiltonian, with the Hamiltonian being the imaginary part of the complex electrostatic potential. Surface effects are introduced through the Hamiltonian as an external field. An extension to a continuous density of particles is presented. The results are used to study the morphology of the interface using statistical mechanics for the many-body system. The distribution of the curvature and the moments of the growth probability along the interface are calculated exactly from the distribution of the particles. In the dilute limit, the distribution of the curvature is shown to develop algebraic tails, which may, for the first time, explain the origin of fractality in diffusion controlled processes
Yakunin, A. G.; Hussein, H. M.
2018-01-01
The article shows how known statistical methods, widely used for financial problems and in a number of other fields of science and technology, can be applied effectively, after minor modification, in climate and environmental monitoring systems to problems such as detecting anomalies in the form of abrupt changes in signal level, the occurrence of positive and negative outliers, and violations of cycle shape in periodic processes.
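One standard statistical method for detecting abrupt changes in signal level of the kind discussed above is the CUSUM test; a minimal sketch on synthetic monitoring data follows (the threshold and slack values are illustrative, and the change point at index 300 is built into the mock signal):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic monitoring signal with an abrupt level shift at index 300.
x = np.concatenate([rng.normal(10.0, 1.0, 300), rng.normal(13.0, 1.0, 200)])

def cusum(signal, target, k=0.5, h=8.0):
    """One-sided CUSUM: flag the first index where the cumulative positive
    deviation from `target` (minus slack k) exceeds threshold h."""
    s = 0.0
    for i, v in enumerate(signal):
        s = max(0.0, s + (v - target - k))
        if s > h:
            return i
    return None

alarm = cusum(x, target=10.0)
print("alarm raised at index:", alarm)
```

A two-sided variant (a second accumulator for negative deviations) covers downward shifts and negative outliers as well.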
Hamiltonian formulation and statistics of an attracting system of nonlinear oscillators
International Nuclear Information System (INIS)
Tasso, H.
1987-10-01
An attracting system of r nonlinear oscillators of an extended van der Pol type was investigated with respect to Hamiltonian formulation. The case of r=2 is rather simple, though nontrivial. For r>2 the tests with Jacobi's identity and Frechet derivatives are negative if Hamiltonians in the natural variables are looked for. Independently, a Liouville theorem is proved and equilibrium statistics is made possible, which leads to a Gaussian distribution in the natural variables. (orig.)
Statistical approach to bistable behaviour of a nonlinear system in a stationary field
International Nuclear Information System (INIS)
Luks, A.; Perina, J.; Perinova, V.; Bertolotti, M.; Sibilia, C.
1984-01-01
The quantum statistical properties of an elastic scattering process are investigated comprising crossed light beams which are in interaction with a particle (electron) beam treated as ''two-step'' system. Using the master equation and the generalized Fokker-Planck equation techniques, the integrated intensities are characterized by their probability distributions and it is demonstrated that single modes exhibit two-peak bistable behaviour. (author)
Directory of Open Access Journals (Sweden)
Kępniak M.
2016-12-01
Full Text Available This paper addresses the tensile and flexural strength of HPC (high performance concrete). The aim of the paper is to analyse the efficiency of the models proposed in different codes. In particular, three design procedures are considered: the ACI 318 [1], Eurocode 2 [2] and the Model Code 2010 [3]. The associations between the design tensile strength of concrete obtained from these three codes and compressive strength are compared with experimental results of tensile strength and flexural strength by statistical tools. Experimental results of tensile strength were obtained in the splitting test. Based on this comparison, conclusions are drawn on the fit between the design methods and the test data. The comparison shows that the tensile and flexural strength of HPC depend on more factors than compressive strength alone.
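The code-based design tensile strengths being compared can be sketched with the commonly quoted SI forms of the Eurocode 2 and ACI 318 expressions; the coefficients below are as usually cited and should be verified against the current code editions before any design use:

```python
import math

def fctm_ec2(fck):
    """Eurocode 2 mean axial tensile strength (MPa), fck in MPa.
    Commonly quoted form: 0.30*fck^(2/3) up to C50/60, logarithmic above.
    Verify against EN 1992-1-1 Table 3.1 before use."""
    if fck <= 50:
        return 0.30 * fck ** (2 / 3)
    fcm = fck + 8  # mean compressive strength, EC2 convention
    return 2.12 * math.log(1 + fcm / 10)

def fr_aci(fc):
    """ACI 318 modulus of rupture (MPa), fr = 0.62*sqrt(f'c) -- the
    commonly quoted SI form of the flexural cracking strength."""
    return 0.62 * math.sqrt(fc)

for fck in (30, 50, 80):
    print(f"fck={fck:3d} MPa  EC2 fctm={fctm_ec2(fck):5.2f}  ACI fr={fr_aci(fck):5.2f}")
```

Regressing such predictions against the splitting-test results would reproduce the kind of fit comparison the paper performs.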
Comment on star–star relations in statistical mechanics and elliptic gamma-function identities
International Nuclear Information System (INIS)
Bazhanov, Vladimir V; Kels, Andrew P; Sergeev, Sergey M
2013-01-01
We prove a recently conjectured star–star relation, which plays the role of an integrability condition for a class of 2D Ising-type models with multicomponent continuous spin variables. Namely, we reduce this relation to an identity for elliptic gamma functions, previously obtained by Rains. (fast track communication)
Godleski, Stephanie A.; Ostrov, Jamie M.
2010-01-01
The present study used both categorical and dimensional approaches to test the association between relational and physical aggression and hostile intent attributions for both relational and instrumental provocation situations using the National Institute of Child Health and Human Development longitudinal Study of Early Child Care and Youth…
Kuntze, Sebastian; Aizikovitsh-Udi, Einav; Clarke, David
2017-01-01
Stimulating thinking related to mathematical content is the focus of many tasks in the mathematics classroom. Beyond such content-related thinking, promoting forms of higher order thinking is among the goals of mathematics instruction as well. So-called hybrid tasks focus on combining both goals: they aim at fostering mathematical thinking and…
High order statistical signatures from source-driven measurements of subcritical fissile systems
International Nuclear Information System (INIS)
Mattingly, J.K.
1998-01-01
This research focuses on the development and application of high order statistical analyses applied to measurements performed with subcritical fissile systems driven by an introduced neutron source. The signatures presented are derived from counting statistics of the introduced source and radiation detectors that observe the response of the fissile system. It is demonstrated that successively higher order counting statistics possess progressively higher sensitivity to reactivity. Consequently, these signatures are more sensitive to changes in the composition, fissile mass, and configuration of the fissile assembly. Furthermore, it is shown that these techniques are capable of distinguishing the response of the fissile system to the introduced source from its response to any internal or inherent sources. This ability combined with the enhanced sensitivity of higher order signatures indicates that these techniques will be of significant utility in a variety of applications. Potential applications include enhanced radiation signature identification of weapons components for nuclear disarmament and safeguards applications and augmented nondestructive analysis of spent nuclear fuel. In general, these techniques expand present capabilities in the analysis of subcritical measurements
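A simple example of a counting statistic with the reactivity sensitivity described above is the Feynman-Y (variance-to-mean) statistic, the second-order member of the family; the clustered series below is a crude statistical stand-in for correlated fission-chain counts, not a physics model:

```python
import numpy as np

rng = np.random.default_rng(4)

def feynman_y(counts):
    """Feynman-Y (variance-to-mean minus one): 0 for pure Poisson counting,
    positive when correlated chains add excess variance."""
    return counts.var(ddof=1) / counts.mean() - 1.0

# Uncorrelated (inherent-source-like) counting: Y ~ 0.
poisson = rng.poisson(20.0, 50_000)

# Crude stand-in for chain multiplication: each trigger contributes a
# random burst of correlated counts, inflating the variance.
bursts = rng.poisson(5.0, 50_000)
clustered = np.array([rng.poisson(3.0, b).sum() for b in bursts])

print(f"Y(poisson)   = {feynman_y(poisson):+.3f}")
print(f"Y(clustered) = {feynman_y(clustered):+.3f}")
```

Third- and higher-order cumulant ratios, built the same way from the count distribution, are the "successively higher order" signatures the abstract refers to.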
Directory of Open Access Journals (Sweden)
N. A. Bazhayev
2017-01-01
Full Text Available We propose a method of information security monitoring for wireless network segments of low-power devices ("smart house", "Internet of Things"). We analyse the characteristics of systems based on wireless technologies, obtained from passive surveillance and active polling of the devices that make up the network infrastructure, and consider a number of external signs of unauthorized access to a wireless network by a potential attacker. The model for analysing the information security state is based on identity, quantity, frequency and time characteristics. Owing to the nature of the devices providing the network infrastructure, the estimation of the information security state is directed at the analysis of normal system operation rather than at the search for signatures and anomalies during various kinds of information attacks. An experiment is described that obtains statistical information on remote wireless devices, in which the data for decision-making are accumulated by comparing the statistics of service messages from end nodes in passive and active modes. We present the results of an information influence on a typical system. The proposed approach to the analysis of network infrastructure statistical data, based on a naive Bayesian classifier, can be used to determine the state of information security.
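The naive Bayesian classification step can be sketched as follows; the two feature distributions for "normal" and "under attack" nodes, and the features themselves, are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

# Mock per-node service-message features: [message rate, mean reply time].
# The "normal" vs "attack" distributions are invented placeholders.
normal = rng.normal([10.0, 0.5], [1.0, 0.05], size=(500, 2))
attack = rng.normal([14.0, 0.9], [1.5, 0.10], size=(500, 2))
X = np.vstack([normal, attack])
y = np.array([0] * 500 + [1] * 500)

# Gaussian naive Bayes: independent per-feature normals for each class.
def fit(X, y):
    params = {}
    for c in (0, 1):
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0), Xc.var(axis=0), len(Xc) / len(X))
    return params

def predict(params, x):
    def log_post(c):
        mu, var, prior = params[c]
        ll = -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
        return ll + np.log(prior)
    return max((0, 1), key=log_post)

params = fit(X, y)
acc = np.mean([predict(params, x) == c for x, c in zip(X, y)])
print(f"training accuracy: {acc:.3f}")
```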
Extended statistical entropy analysis as a quantitative management tool for water resource systems
Sobantka, Alicja; Rechberger, Helmut
2010-05-01
The use of entropy in hydrology and water resources has found various applications. As water resource systems are inherently spatial and complex, a stochastic description of these systems is needed, and entropy theory enables such a description by providing the least-biased probability distributions under limited knowledge and data. Entropy can also serve as a basis for risk and reliability analysis. The relative entropy has been variously interpreted as a measure of freedom of choice, uncertainty and disorder, information content, missing information, or information gain or loss. In the analysis of empirical data, entropy is another measure of dispersion, an alternative to the variance. As an evaluation tool, statistical entropy analysis (SEA) was developed by previous workers to quantify the power of a process to concentrate chemical elements. Within this research programme the SEA is to be extended to chemical compounds and tested for its deficits and potentials in systems where water resources play an important role. The extended SEA (eSEA) will be developed first for the nitrogen balance in waste water treatment plants (WWTP). Later applications to the emission of substances to water bodies such as groundwater (e.g. leachate from landfills) will also be possible. By applying eSEA to the nitrogen balance in a WWTP, all nitrogen compounds that may occur during the water treatment process are taken into account and quantified in terms of their impact on the environment and human health. It has been shown that entropy-reducing processes are part of modern waste management. Generally, materials management should be performed in a way that avoids a significant rise in entropy. The entropy metric might also be used to perform benchmarking on WWTPs. The result of this management tool would be the determination of the efficiency of WWTPs. By improving and optimizing the efficiency
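A stripped-down stand-in for the statistical entropy metric can illustrate the idea, applied here to a substance's split over output flows; the nitrogen shares are made up, and the full SEA/eSEA weighting (which accounts for concentrations and flow masses) is more involved than this sketch:

```python
import numpy as np

def substance_entropy(fractions):
    """Shannon entropy (bits) of a substance's distribution over output
    flows: 0 when fully concentrated in one flow, log2(n) when spread
    uniformly over n flows. A simplified stand-in for full SEA/eSEA."""
    f = np.asarray(fractions, dtype=float)
    f = f / f.sum()
    f = f[f > 0]
    return float(-(f * np.log2(f)).sum())

# Nitrogen split over WWTP outputs (effluent, off-gas, sludge) -- made-up
# shares for two hypothetical plants.
concentrating = [0.95, 0.03, 0.02]   # plant concentrates N into one path
dispersing = [0.34, 0.33, 0.33]      # plant spreads N across all paths

print(f"H(concentrating) = {substance_entropy(concentrating):.3f} bits")
print(f"H(dispersing)    = {substance_entropy(dispersing):.3f} bits")
```

In this reading, an "entropy-reducing" plant is one whose outputs score lower than its inputs.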
Jansma, J Martijn; de Zwart, Jacco A; van Gelderen, Peter; Duyn, Jeff H; Drevets, Wayne C; Furey, Maura L
2013-05-15
Technical developments in MRI have improved signal to noise, allowing the use of analysis methods such as finite impulse response (FIR) analysis of rapid event-related functional MRI (er-fMRI). FIR is one of the most informative analysis methods, as it determines the onset and full shape of the hemodynamic response function (HRF) without any a priori assumptions. FIR is, however, vulnerable to multicollinearity, which is directly related to the distribution of stimuli over time. Efficiency can be optimized by simplifying a design and restricting the stimulus distribution to specific sequences, while more design flexibility necessarily reduces efficiency. However, the actual effect of efficiency on fMRI results has never been tested in vivo. Thus, it is currently difficult to make an informed choice between protocol flexibility and statistical efficiency. The main goal of this study was to assign concrete fMRI signal to noise values to the abstract scale of FIR statistical efficiency. Ten subjects repeated a perception task with five random and m-sequence based protocols, with varying but, according to the literature, acceptable levels of multicollinearity. Results indicated substantial differences in signal standard deviation, whose level was a function of multicollinearity. Experiment protocols varied by up to 55.4% in standard deviation. The results confirm that the quality of fMRI in an FIR analysis can vary significantly and substantially with statistical efficiency. Our in vivo measurements can be used to aid in making an informed decision between freedom in protocol design and statistical efficiency. Published by Elsevier B.V.
Uncertainty analysis of reactor safety systems with statistically correlated failure data
International Nuclear Information System (INIS)
Dezfuli, H.; Modarres, M.
1985-01-01
The probability of occurrence of the top event of a fault tree is estimated from the failure probabilities of the components that constitute the fault tree. Component failure probabilities are subject to statistical uncertainties. In addition, there are cases where the failure data are statistically correlated. Most fault tree evaluations have so far been based on uncorrelated component failure data. The subject of this paper is a method of assessing probability intervals for the top event failure probability of fault trees when component failure data are statistically correlated. To estimate the mean and variance of the top event, a second-order system moment method based on Taylor series expansion is presented, which provides an alternative to the normally used Monte Carlo method. For cases where component failure probabilities are statistically correlated, the Taylor expansion terms are treated properly. A moment matching technique is used to obtain the probability distribution function of the top event by fitting a Johnson S_B distribution. A computer program (CORRELATE) was developed to perform the calculations necessary for the implementation of the method. The CORRELATE code is very efficient and consumes minimal computer time, primarily because it does not employ the time-consuming Monte Carlo method. (author)
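The moment-method idea can be illustrated on the simplest correlated case: an AND gate whose top-event probability is the product of two correlated component probabilities. The means, standard deviations and correlation below are assumed values, not data from the paper, and the Monte Carlo run plays the role of the cross-check the method is meant to avoid:

```python
import numpy as np

rng = np.random.default_rng(6)

# AND gate: top = p1 * p2, with uncertain, correlated component
# failure probabilities (assumed moments).
mu = np.array([1e-3, 2e-3])
sd = np.array([3e-4, 5e-4])
rho = 0.6
cov = rho * sd[0] * sd[1]

# Moment method: for a product, E[XY] = mu1*mu2 + cov exactly; the
# first-order variance expansion carries the correlation term
# (small second-order terms are neglected here).
mean_top = mu[0] * mu[1] + cov
var_top = (mu[1] * sd[0]) ** 2 + (mu[0] * sd[1]) ** 2 + 2 * mu[0] * mu[1] * cov

# Monte Carlo cross-check with correlated normal inputs (kept simple).
C = np.array([[sd[0] ** 2, cov], [cov, sd[1] ** 2]])
samples = rng.multivariate_normal(mu, C, 200_000)
top = samples[:, 0] * samples[:, 1]
print(f"moment mean {mean_top:.3e}  MC mean {top.mean():.3e}")
print(f"moment var  {var_top:.3e}  MC var  {top.var():.3e}")
```

Fitting a Johnson S_B distribution to the resulting mean and variance would then give the top-event probability intervals, as in the paper.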
Second-Order Statistics for Wave Propagation through Complex Optical Systems
DEFF Research Database (Denmark)
Yura, H.T.; Hanson, Steen Grüner
1989-01-01
Closed-form expressions are derived for various statistical functions that arise in optical propagation through arbitrary optical systems that can be characterized by a complex ABCD matrix in the presence of distributed random inhomogeneities along the optical path. Specifically, within the second-order Rytov approximation, explicit general expressions are presented for the mutual coherence function, the log-amplitude and phase correlation functions, and the mean-square irradiance that are obtained in propagation through an arbitrary paraxial ABCD optical system containing Gaussian-shaped limiting
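The complex ABCD formalism builds on ordinary paraxial ray matrices; a minimal sketch of composing them follows, here for a real 2f-2f imaging system, where B = 0 signals the imaging condition and A is the transverse magnification (the focal length value is arbitrary):

```python
import numpy as np

# Paraxial ABCD matrices: free space of length d, and a thin lens of
# focal length f. Rays are (height, angle) column vectors.
def space(d):
    return np.array([[1.0, d], [0.0, 1.0]])

def lens(f):
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# 2f-2f system: object plane -> 2f of space -> lens(f) -> 2f -> image plane.
# Matrices compose right-to-left along the propagation direction.
f = 0.1
M = space(2 * f) @ lens(f) @ space(2 * f)
print(M)
# B = M[0,1] = 0 signals imaging; A = M[0,0] = -1 is unit inverting
# magnification. det(M) = 1 for any lossless paraxial system.
```

The complex-ABCD results in the paper extend exactly this algebra to Gaussian apertures and random media by letting the matrix entries become complex.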
High-throughput automated system for statistical biosensing employing microcantilevers arrays
DEFF Research Database (Denmark)
Bosco, Filippo; Chen, Ching H.; Hwu, En T.
2011-01-01
In this paper we present a completely new and fully automated system for parallel microcantilever-based biosensing. Our platform is able to monitor simultaneously the change of resonance frequency (dynamic mode), of deflection (static mode), and of surface roughness of hundreds of cantilevers in a very short time over multiple biochemical reactions. We have proven that our system is capable of measuring 900 independent microsensors in less than a second. Here, we report statistical biosensing results performed over a haptens-antibody assay, where complete characterization of the biochemical
Ibáñez, Sergio J.; García, Javier; Feu, Sebastian; Lorenzo, Alberto; Sampaio, Jaime
2009-01-01
The aim of the present study was to identify the game-related statistics that discriminated between basketball winning and losing teams in each of the three consecutive games played in a condensed tournament format. The data were obtained from the Spanish Basketball Federation and included game-related statistics from the Under-20 league (2005-2006 and 2006-2007 seasons). A total of 223 games were analyzed with the following game-related statistics: two- and three-point field goals (made and missed), free-throws (made and missed), offensive and defensive rebounds, assists, steals, turnovers, blocks (made and received), fouls committed, ball possessions and offensive rating. Results showed that winning teams in this competition had better values in all game-related statistics, with the exception of three-point field goals made, free-throws missed and turnovers (p ≥ 0.05). A main effect of game number was only identified in turnovers, with a statistically significant decrease between the second and third game. No interaction was found in the analysed variables. A discriminant analysis identified the two-point field goals made, the defensive rebounds and the assists as discriminators between winning and losing teams in all three games. In addition to these, only the three-point field goals made contributed to discriminating between teams in game three, suggesting a moderate effect of fatigue. Coaches may benefit from being aware of this variation in game-determinant statistics and from using offensive and defensive strategies in the third game that exploit or mask three-point field-goal performance. Key points: Overall team performances along the three consecutive games were very similar, not confirming an accumulated fatigue effect. The results for the three-point field goals in the third game suggested that winning teams were able to shoot better from longer distances, and this could be the result of exhibiting higher conditioning status and
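A two-class Fisher discriminant of the kind used to separate winning and losing teams can be sketched as follows; the game statistics are simulated, with class means that loosely echo the discriminators named above (two-point field goals made, defensive rebounds, assists) but are otherwise invented:

```python
import numpy as np

rng = np.random.default_rng(8)

# Mock game-related statistics [2-pt made, defensive rebounds, assists]
# for 100 winning and 100 losing team performances (invented numbers).
winners = rng.normal([28, 24, 14], [4, 4, 3], size=(100, 3))
losers = rng.normal([22, 19, 10], [4, 4, 3], size=(100, 3))

# Two-class Fisher discriminant: w = Sw^-1 (mu_w - mu_l).
mu_w, mu_l = winners.mean(axis=0), losers.mean(axis=0)
Sw = np.cov(winners.T) + np.cov(losers.T)   # within-class scatter
w = np.linalg.solve(Sw, mu_w - mu_l)

# Classify by thresholding the projection at the midpoint of class means.
thresh = w @ (mu_w + mu_l) / 2
correct = (winners @ w > thresh).sum() + (losers @ w <= thresh).sum()
acc = correct / 200
print(f"resubstitution accuracy: {acc:.2f}")
```

The relative magnitudes of the components of `w` indicate which statistics carry the discrimination, which is how such analyses single out individual game indicators.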
A statistical estimator for the boiler power and its related parameters
International Nuclear Information System (INIS)
Tang, H.
2001-01-01
To determine the boiler power accurately is important both for controlling the plant and for maximizing plant productivity. There are two computed boiler powers for each boiler: the steam based boiler power and the feedwater based boiler power. The steam based boiler power is computed from the difference between the boiler steam enthalpy and the feedwater enthalpy. The feedwater based boiler power is computed as the enthalpy absorbed by the feedwater. The steam based boiler power is computed in the RRS program and used in calibrating the measured reactor power, while the feedwater based boiler power is computed in the CSTAT program and used for indication. Since the steam based boiler power is used as feedback in the reactor control, it is chosen as the one estimated in this work. Because the boiler power calculation employs steam flow, feedwater flow and feedwater temperature measurements, and because any measurement contains constant or drifting noise and bias, reconciliation and rectification procedures are needed to determine the boiler power more accurately. A statistical estimator is developed to perform the functions of data reconciliation, gross error detection and instrument performance monitoring
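The enthalpy-difference power computation and a minimal variance-weighted reconciliation of the two boiler-power estimates can be sketched as follows; every number below (flows, enthalpies, error variances) is an assumed illustration, not plant data:

```python
# Illustrative boiler-power calculation and a minimal two-measurement
# reconciliation; all values are assumed, not plant data.

def boiler_power(m_dot, h_out, h_in):
    """Thermal power (MW) from mass flow (kg/s) and enthalpies (kJ/kg)."""
    return m_dot * (h_out - h_in) / 1000.0

# Steam-based estimate: feedwater in, steam out (assumed enthalpies).
p_steam = boiler_power(m_dot=240.0, h_out=2790.0, h_in=780.0)
# Feedwater-based estimate from independent instruments (assumed).
p_feed = 492.0   # MW

# Variance-weighted reconciliation: combine the two estimates with
# weights inversely proportional to their (assumed) error variances.
var_steam, var_feed = 9.0, 25.0
w = var_feed / (var_steam + var_feed)
p_best = w * p_steam + (1 - w) * p_feed
var_best = var_steam * var_feed / (var_steam + var_feed)
print(f"steam {p_steam:.1f} MW, feed {p_feed:.1f} MW -> reconciled {p_best:.1f} MW")
```

The reconciled variance is always smaller than either input variance, which is the statistical payoff of combining the redundant measurements; a persistent gap between the two estimates is the hook for gross-error detection.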
International Nuclear Information System (INIS)
Tariq, Saadia R.; Shah, Munir H.; Shaheen, Nazia
2009-01-01
Two tanning units of Pakistan, namely, Kasur and Mian Channun were investigated with respect to the tanning processes (chrome and vegetable, respectively) and the effects of the tanning agents on the quality of soil in vicinity of tanneries were evaluated. The effluent and soil samples from 16 tanneries each of Kasur and Mian Channun were collected. The levels of selected metals (Na, K, Ca, Mg, Fe, Cr, Mn, Co, Cd, Ni, Pb and Zn) were determined by using flame atomic absorption spectrophotometer under optimum analytical conditions. The data thus obtained were subjected to univariate and multivariate statistical analyses. Most of the metals exhibited considerably higher concentrations in the effluents and soils of Kasur compared with those of Mian Channun. It was observed that the soil of Kasur was highly contaminated by Na, K, Ca and Mg emanating from various processes of leather manufacture. Furthermore, the levels of Cr were also present at much enhanced levels than its background concentration due to the adoption of chrome tanning. The levels of Cr determined in soil samples collected from the vicinity of Mian Channun tanneries were almost comparable to the background levels. The soil of this city was found to have contaminated only by the metals originating from pre-tanning processes. The apportionment of selected metals in the effluent and soil samples was determined by a multivariate cluster analysis, which revealed significant differences in chrome and vegetable tanning processes.
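One common way to quantify "much enhanced levels than background", though not necessarily the index used in this study, is a contamination factor (measured concentration over background). The Cr values and classification cutoffs below are illustrative assumptions:

```python
def contamination_factor(measured, background):
    """Contamination factor: ratio of measured to background concentration."""
    return measured / background

def classify(cf):
    """Qualitative bands often used with contamination factors (assumed here)."""
    if cf < 1:
        return "low"
    if cf < 3:
        return "moderate"
    if cf < 6:
        return "considerable"
    return "very high"

# hypothetical Cr concentrations (mg/kg): chrome-tanning site soil vs. background
cr_site, cr_background = 850.0, 50.0
cf = contamination_factor(cr_site, cr_background)
print(cf)  # 17.0
print(classify(cf))
```

A factor well above 1 indicates anthropogenic enrichment, consistent with the chrome-tanning signature described for Kasur, while a factor near 1 matches the near-background Cr reported for Mian Channun.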
Labushev, Mikhail M.; Khokhlov, Alexander N.
2012-01-01
An index of proportionality of the atomic weights of chemical elements is proposed for determining the relative age of minerals and rocks. The results of their chemical analysis serve as the initial data for the calculations. For rocks of different composition the index also serves as a classification value. Changes in the crystal lattice energy of minerals and their associations can be measured by the change in the index value, thus contributing to the solution of important practical problems. There was determined...
International Nuclear Information System (INIS)
Hnat, B.; O’Connell, D.; Nakariakov, V. M.; Sundberg, T.
2016-01-01
We obtain dispersion relations of magnetic field fluctuations for two crossings of the terrestrial foreshock by Cluster spacecraft. These crossings cover plasma conditions that differ significantly in their plasma β and in the density of the reflected ion beam, but not in the properties of the encountered ion population, both showing shell-like distribution functions. Dispersion relations are reconstructed using two-point instantaneous wave number estimations from pairs of Cluster spacecraft. The accessible range of wave vectors, limited by the available spacecraft separations, extends to ≈2 × 10⁴ km. Results show multiple branches of dispersion relations, associated with different powers of magnetic field fluctuations. We find that sunward propagating fast magnetosonic waves and beam resonant modes are dominant for the high plasma β interval with a dense beam, while the dispersions of the interval with low beam density include Alfvén and fast magnetosonic modes propagating sunward and anti-sunward.
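The two-point wave number estimation can be sketched in its simplest form: for a single plane-wave component, the phase of the zero-lag complex cross-correlation between the two spacecraft signals, divided by their separation, recovers k. This toy version assumes noiseless monochromatic complex signals, unlike the real multi-branch analysis:

```python
import cmath

def wavenumber_two_point(sig1, sig2, separation):
    """Estimate wave number from the phase difference between two spatially
    separated (complex analytic) signals: k = phase(<s2 * conj(s1)>) / d."""
    cross = sum(b * a.conjugate() for a, b in zip(sig1, sig2))
    return cmath.phase(cross) / separation

# synthetic plane wave sampled at two 'spacecraft' positions (arbitrary units)
k_true, omega = 0.5, 2.0
x1, x2 = 0.0, 1.0
times = [0.1 * i for i in range(200)]
s1 = [cmath.exp(1j * (k_true * x1 - omega * t)) for t in times]
s2 = [cmath.exp(1j * (k_true * x2 - omega * t)) for t in times]
k_est = wavenumber_two_point(s1, s2, x2 - x1)
print(k_est)
```

The phase-difference method is ambiguous beyond |k·d| = π, which is why the available spacecraft separations bound the accessible wave-vector range in the study.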
Simulation of statistical systems with not necessarily real and positive probabilities
International Nuclear Information System (INIS)
Kalkreuter, T.
1991-01-01
A new method to determine expectation values of observables in statistical systems with not necessarily real and positive probabilities is proposed. It is tested in a numerical study of the two-dimensional O(3)-symmetric nonlinear σ-model with Symanzik's one-loop improved lattice action. This model is simulated as a polymer system with field-dependent activities which can be made positive definite or indefinite by adjusting additive constants of the action. For a system with indefinite activities the new proposal is found to work. It is also verified that local observables are not affected by far-away polymers with indefinite activities when the system has no long-range order. (orig.)
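The standard reweighting trick for indefinite weights, which is the baseline such proposals compete with, can be stated in a few lines: fold the sign of each configuration's weight into the observable and normalize by the average sign. The toy observables and weights below are invented for illustration:

```python
def reweighted_expectation(observables, weights):
    """Expectation value in a system with not-necessarily-positive weights:
    <O> = sum(O_i * w_i) / sum(w_i), i.e. the sign is carried by the weight
    and the result is normalized by the (possibly small) average sign."""
    num = sum(o * w for o, w in zip(observables, weights))
    den = sum(weights)
    return num / den

# toy configurations with indefinite activities (note the negative weights)
obs = [1.0, 2.0, 3.0, 4.0]
weights = [0.5, -0.1, 0.4, -0.2]
print(reweighted_expectation(obs, weights))
```

When the average sign in the denominator approaches zero the statistical error explodes; this is the sign problem that makes alternative simulation methods for indefinite activities interesting.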
Statistical evaluation of failures and repairs of the V-1 measuring and control system
International Nuclear Information System (INIS)
Laurinec, R.; Korec, J.; Mitosinka, J.; Zarnovican, V.
1984-01-01
A failure record card system was introduced for evaluating the reliability of the measurement and control equipment of the V-1 nuclear power plant. The SPU-800 microcomputer system is used for recording data on magnetic tape and their transmission to the central data processing department. The data are used for evaluating the reliability of components and circuits and a selection is made of the most failure-prone components, and the causes of failures are evaluated as are failure identification, repair and causes of outages. The system provides monthly, annual and total assessment data since the system was commissioned. The results of the statistical evaluation of failures are used for planning preventive maintenance and for determining optimal repair intervals. (E.S.)
Directory of Open Access Journals (Sweden)
E. A. Tatokchin
2017-01-01
Full Text Available The development of modern educational technologies, driven by the broad introduction of computer testing and the growth of distance education, makes a revision of the methods for examining pupils necessary. The work shows the need for a transition to mathematical examination criteria that are free of subjectivity. The article reviews the problems arising in this task and offers approaches for their solution. The greatest attention is paid to the problem of objectively transforming the expert's rated estimates onto the student's score scale. The discussion concludes that the solution to this problem lies in the creation of specialized intelligent systems. The basis for constructing the intelligent system is a mathematical model of a self-organizing non-equilibrium dissipative system, represented by a group of students. The article assumes that the dissipative character of the system is provided by the constant influx of new test items from the expert, and the non-equilibrium character by the individual psychological characteristics of the students in the group. As a result, the system must self-organize into stable patterns. These patterns make it possible, relying on large amounts of data, to obtain a statistically significant assessment of a student. To justify the proposed approach, the work presents a statistical analysis of the results of testing a large sample of students (>90). The conclusions from this statistical analysis allowed the development of an intelligent system for statistically significant examination of student performance. It is based on a data clustering algorithm (k-means) for three key parameters. It is shown that this approach allows the creation of a dynamic and objective expert evaluation.
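The k-means clustering step the article mentions can be sketched for a single parameter (a test-score axis); the scores, the choice of k = 3, and the initial centers are hypothetical:

```python
import statistics

def kmeans_1d(data, centers, iters=20):
    """Plain 1-D k-means: assign each point to the nearest center,
    then recompute each center as the mean of its cluster."""
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for x in data:
            idx = min(range(len(centers)), key=lambda i: abs(x - centers[i]))
            clusters[idx].append(x)
        centers = [statistics.mean(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# hypothetical test scores for a group of students
scores = [35, 38, 40, 62, 65, 63, 88, 90, 92]
centers, clusters = kmeans_1d(scores, centers=[30, 60, 90])
print(sorted(round(c) for c in centers))
```

The real system clusters on three parameters at once; the update rule is identical, with Euclidean distance replacing the absolute difference.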
Gray, Alistair; Veale, Jaimie F.; Binson, Diane; Sell, Randell L.
2013-01-01
Objective. Effectively addressing health disparities experienced by sexual minority populations requires high-quality official data on sexual orientation. We developed a conceptual framework of sexual orientation to improve the quality of sexual orientation data in New Zealand's Official Statistics System. Methods. We reviewed conceptual and methodological literature, culminating in a draft framework. To improve the framework, we held focus groups and key-informant interviews with sexual minority stakeholders and producers and consumers of official statistics. An advisory board of experts provided additional guidance. Results. The framework proposes working definitions of the sexual orientation topic and measurement concepts, describes dimensions of the measurement concepts, discusses variables framing the measurement concepts, and outlines conceptual grey areas. Conclusion. The framework proposes standard definitions and concepts for the collection of official sexual orientation data in New Zealand. It presents a model for producers of official statistics in other countries, who wish to improve the quality of health data on their citizens. PMID:23840231
Directory of Open Access Journals (Sweden)
João Henrique Gomes
2017-05-01
Full Text Available Abstract AIMS This study aimed to verify the relationship between anthropometric and physical performance variables and game-related statistics in professional elite basketball players during a competition. METHODS Eleven male basketball players were evaluated during 10 weeks at two distinct moments (regular season and playoffs). Overall, 11 variables of physical fitness and 13 variables of game-related statistics were analysed. RESULTS The following significant Pearson's correlations were found in the regular season: percentage of fat mass with assists (r = -0.62) and steals (r = -0.63); height (r = 0.68), lean mass (r = 0.64), and maximum strength (r = 0.67) with blocks; squat jump with steals (r = 0.63); and time in the T-test with successful two-point field goals (r = -0.65), successful free throws (r = -0.61), and steals (r = -0.62). However, in the playoffs, only stature and lean mass maintained these correlations (p ≤ 0.05). CONCLUSIONS The anthropometric and physical characteristics of the players showed few correlations with the game-related statistics in the regular season, and these correlations were even lower in the playoff games of a professional elite championship; they are therefore not good predictors of technical performance.
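The Pearson correlations reported above can be computed with a plain implementation of r. The body-fat and steals numbers below are synthetic and deliberately near-linear, so this sketch only illustrates the sign of the reported r = -0.63, not its magnitude:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical body-fat percentage vs. steals per game for 11 players
fat = [8.1, 9.5, 10.2, 11.0, 12.3, 13.1, 14.0, 14.8, 15.5, 16.2, 17.0]
steals = [2.1, 2.0, 1.9, 1.8, 1.6, 1.5, 1.3, 1.2, 1.1, 0.9, 0.8]
r = pearson_r(fat, steals)
print(r)
```

With only 11 players, as in the study, the significance of any r should be checked against the t distribution with n - 2 degrees of freedom before drawing conclusions.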
Gómez, Miguel A; Lorenzo, Alberto; Barakat, Rubén; Ortega, Enrique; Palao, José M
2008-02-01
The aim of the present study was to identify game-related statistics that differentiate winning and losing teams according to game location. The sample included 306 games of the 2004-2005 regular season of the Spanish professional men's league (ACB League). The independent variables were game location (home or away) and game result (win or loss). The game-related statistics registered were free throws (successful and unsuccessful), 2- and 3-point field goals (successful and unsuccessful), offensive and defensive rebounds, blocks, assists, fouls, steals, and turnovers. Descriptive and inferential analyses were done (one-way analysis of variance and discriminant analysis). The multivariate analysis showed that winning teams differ from losing teams in defensive rebounds (SC = .42) and in assists (SC = .38). Similarly, winning teams differ from losing teams when they play at home in defensive rebounds (SC = .40) and in assists (SC = .41). On the other hand, winning teams differ from losing teams when they play away in defensive rebounds (SC = .44), assists (SC = .30), successful 2-point field goals (SC = .31), and unsuccessful 3-point field goals (SC = -.35). Defensive rebounds and assists were the only game-related statistics common to all three analyses.
Equilibration, thermalisation, and the emergence of statistical mechanics in closed quantum systems.
Gogolin, Christian; Eisert, Jens
2016-05-01
We review selected advances in the theoretical understanding of complex quantum many-body systems with regard to emergent notions of quantum statistical mechanics. We cover topics such as equilibration and thermalisation in pure state statistical mechanics, the eigenstate thermalisation hypothesis, the equivalence of ensembles, non-equilibration dynamics following global and local quenches as well as ramps. We also address initial state independence, absence of thermalisation, and many-body localisation. We elucidate the role played by key concepts for these phenomena, such as Lieb-Robinson bounds, entanglement growth, typicality arguments, quantum maximum entropy principles and the generalised Gibbs ensembles, and quantum (non-)integrability. We put emphasis on rigorous approaches and present the most important results in a unified language.
Trajectory-probed instability and statistics of desynchronization events in coupled chaotic systems
Energy Technology Data Exchange (ETDEWEB)
Oliveira, Gilson F. de, E-mail: gilson@otica.ufpb.br; Chevrollier, Martine; Oriá, Marcos [Departamento de Física, Universidade Federal da Paraíba, Caixa Postal 5008, 58051-900 João Pessoa-PB (Brazil); Passerat de Silans, Thierry [Departamento de Física, Universidade Federal da Paraíba, Caixa Postal 5008, 58051-900 João Pessoa-PB (Brazil); UAF, Universidade Federal de Campina Grande, 58429-900 Campina Grande, PB (Brazil); Souza Cavalcante, Hugo L. D. de [Departamento de Informática, Centro de Informática, Universidade Federal da Paraíba, Av. dos Escoteiros s/n, Mangabeira VII, 58055-000 João Pessoa, PB (Brazil)
2015-11-15
Complex systems, such as financial markets, earthquakes, and neurological networks, exhibit extreme events whose mechanisms of formation are still not completely understood. These mechanisms may be identified and better studied in simpler systems with dynamical features similar to the ones encountered in the complex system of interest. For instance, sudden and brief departures from the synchronized state observed in coupled chaotic systems were shown to display non-normal statistical distributions similar to events observed in the complex systems cited above. The currently accepted hypothesis is that these desynchronization events are influenced by the presence of unstable object(s) in the phase space of the system. Here, we present further evidence that the occurrence of large events is triggered by the visitation of the system's phase-space trajectory to the vicinity of these unstable objects. In the system studied here, this visitation is controlled by a single parameter, and we exploit this feature to observe the effect of the visitation rate on the overall instability of the synchronized state. We find that the probability of escapes from the synchronized state and the size of those desynchronization events are enhanced in attractors whose shapes permit the chaotic trajectories to approach the region of strong instability. This result shows that the occurrence of large events requires not only a large local instability to amplify noise, or to amplify the effect of parameter mismatch between the coupled subsystems, but also that the trajectories of the system wander close to this local instability.
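The phenomenon described, desynchronization events in coupled chaotic systems, can be illustrated with two diffusively coupled logistic maps rather than the authors' system. The coupling strengths, noise level, and event threshold below are arbitrary choices for the sketch:

```python
import random

def coupled_logistic(n_steps, coupling, noise, r=3.9, seed=1):
    """Two diffusively coupled logistic maps with small additive noise;
    count 'desynchronization events' where |x - y| exceeds a threshold."""
    rng = random.Random(seed)
    f = lambda u: r * u * (1.0 - u)
    x, y = 0.3, 0.31
    events = 0
    for _ in range(n_steps):
        fx, fy = f(x), f(y)
        x = (1 - coupling) * fx + coupling * fy + noise * (rng.random() - 0.5)
        y = (1 - coupling) * fy + coupling * fx + noise * (rng.random() - 0.5)
        x, y = min(max(x, 0.0), 1.0), min(max(y, 0.0), 1.0)  # keep in [0, 1]
        if abs(x - y) > 0.1:
            events += 1
    return events

# weak coupling lets chaos desynchronize the pair; strong coupling holds sync
weak = coupled_logistic(5000, coupling=0.1, noise=1e-3)
strong = coupled_logistic(5000, coupling=0.45, noise=1e-3)
print(weak > strong)
```

The transverse perturbation contracts by a factor |1 - 2c|·|f'| per step, so for c = 0.45 the synchronized state is stable against the injected noise, while for c = 0.1 the chaotic stretching wins and escapes occur.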
International Nuclear Information System (INIS)
Shimizu, S.; Ando, Y.; Morioka, T.
1990-01-01
Plant maintenance is becoming more important with the increase in the number of nuclear power stations and in plant operating time. Various requirements for plant maintenance are proposed, such as countermeasures for equipment degradation and reduction of maintenance costs while maintaining plant reliability and productivity. For this purpose, plant maintenance programs should be improved based on equipment reliability estimated from field data. In order to meet these requirements, it is planned to develop an equipment maintenance management support system for nuclear power plants based on statistical analysis of equipment maintenance history data. The main difference between this proposed method and current similar methods is that it evaluates not only failure data but also maintenance data, which include normal-termination data and data on partial degradation or functional disorder of equipment and parts. It is thus possible to use these field data to improve maintenance schedules and to evaluate actual equipment and part reliability under the current maintenance schedule. In the present paper, the authors show the objectives of this system, an outline of the system and its functions, and the basic technique for collecting and managing maintenance history data for statistical analysis. It is shown, from the results of feasibility tests using simulated maintenance history data, that this system can provide useful information for maintenance and design enhancement
MacLean, Adam L.
2015-12-16
The last decade has seen an explosion in models that describe phenomena in systems medicine. Such models are especially useful for studying signaling pathways, such as the Wnt pathway. In this chapter we use the Wnt pathway to showcase current mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since they remain the most widely used approach in systems biology and medicine and continue to offer great potential. We present methods for the analysis of a single model, comprising applications of standard dynamical systems approaches such as nondimensionalization, steady state, asymptotic and sensitivity analysis, and more recent statistical and algebraic approaches to compare models with data. We present parameter estimation and model comparison techniques, focusing on Bayesian analysis and coplanarity via algebraic geometry. Our intention is that this (non-exhaustive) review may serve as a useful starting point for the analysis of models in systems medicine.
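The ODE modeling workflow described above can be sketched at its simplest: forward-Euler integration of a hypothetical first-order degradation term, checked against the analytic solution. This is a generic illustration of the numerical machinery, not a Wnt-pathway model:

```python
import math

def euler(f, x0, t_end, dt):
    """Forward-Euler integration of dx/dt = f(x): the simplest workhorse for
    exploring ODE models before reaching for an adaptive solver."""
    x, t = x0, 0.0
    while t < t_end - 1e-12:
        x += dt * f(x)
        t += dt
    return x

# hypothetical first-order degradation of a pathway component: dx/dt = -k*x
k = 0.5
x_num = euler(lambda x: -k * x, x0=1.0, t_end=2.0, dt=1e-4)
x_exact = math.exp(-k * 2.0)
print(abs(x_num - x_exact) < 1e-3)
```

Steady-state and sensitivity analyses mentioned in the chapter then reduce to solving f(x) = 0 and to differentiating trajectories like this one with respect to k.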
Directory of Open Access Journals (Sweden)
Hsueh-Hsien Chang
2017-04-01
Full Text Available This paper proposes statistical feature extraction methods combined with artificial intelligence (AI) approaches for fault location in non-intrusive single-line-to-ground fault (SLGF) detection in low voltage distribution systems. The input features of the AI algorithms are extracted using a statistical moment transformation to reduce the dimensions of the power signature inputs measured with non-intrusive fault monitoring (NIFM) techniques. The data required to develop the network are generated by simulating SLGFs using the Electromagnetic Transients Program (EMTP) in a test system. To enhance identification accuracy, these features, after normalization, are given to the AI algorithms presented and evaluated in this paper. Different AI techniques are then compared to determine which identification algorithms are suitable for diagnosing SLGFs from various power signatures in a NIFM system. The simulation results show that the proposed method is effective and can identify fault locations using non-intrusive monitoring techniques in low voltage distribution systems.
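The statistical moment transformation used to compress power signatures can be sketched as the first four moments of a signal. The input below is a toy symmetric sequence, not EMTP data:

```python
import math

def moment_features(signal):
    """First four statistical moments (mean, variance, skewness, kurtosis)
    as a compact feature vector for a sampled signal."""
    n = len(signal)
    mean = sum(signal) / n
    var = sum((x - mean) ** 2 for x in signal) / n
    std = math.sqrt(var)
    skew = sum(((x - mean) / std) ** 3 for x in signal) / n if std else 0.0
    kurt = sum(((x - mean) / std) ** 4 for x in signal) / n if std else 0.0
    return [mean, var, skew, kurt]

# a symmetric toy signal: skewness should come out (numerically) zero
feats = moment_features([1.0, 2.0, 3.0, 4.0, 5.0])
print(feats)
```

Reducing a long waveform to four numbers like this is what shrinks the input dimension before the classifier, at the cost of discarding phase and ordering information.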
Statistical models for the analysis of water distribution system pipe break data
International Nuclear Information System (INIS)
Yamijala, Shridhar; Guikema, Seth D.; Brumbelow, Kelly
2009-01-01
The deterioration of pipes leading to pipe breaks and leaks in urban water distribution systems is of concern to water utilities throughout the world. Pipe breaks and leaks may result in reduction in the water-carrying capacity of the pipes and contamination of water in the distribution systems. Water utilities incur large expenses in the replacement and rehabilitation of water mains, making it critical to evaluate the current and future condition of the system for maintenance decision-making. This paper compares different statistical regression models proposed in the literature for estimating the reliability of pipes in a water distribution system on the basis of short time histories. The goals of these models are to estimate the likelihood of pipe breaks in the future and determine the parameters that most affect the likelihood of pipe breaks. The data set used for the analysis comes from a major US city, and these data include approximately 85,000 pipe segments with nearly 2500 breaks from 2000 through 2005. The results show that the set of statistical models previously proposed for this problem do not provide good estimates with the test data set. However, logistic generalized linear models do provide good estimates of pipe reliability and can be useful for water utilities in planning pipe inspection and maintenance
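A logistic generalized linear model of the kind the authors found to work well can be sketched with single-covariate gradient descent. The pipe-age/break data and learning settings below are invented for illustration, not the city's data set:

```python
import math

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Logistic regression with one covariate, fitted by stochastic gradient
    ascent on the log-likelihood: models P(break) = sigmoid(a + b*x)."""
    a, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(a + b * x)))
            a += lr * (y - p)
            b += lr * (y - p) * x
    return a, b

def predict(a, b, x):
    return 1.0 / (1.0 + math.exp(-(a + b * x)))

# hypothetical data: pipe age (decades) vs. whether a break occurred
age = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]
broke = [0, 0, 0, 0, 1, 1, 1, 1]
a, b = fit_logistic(age, broke)
print(predict(a, b, 4.0) > 0.9, predict(a, b, 0.5) < 0.1)
```

In practice such a model would include many covariates (material, diameter, soil type), and fitted break probabilities would feed directly into inspection prioritization.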
Energy Technology Data Exchange (ETDEWEB)
Kashiwamura, T [NHK Spring Co. Ltd., Yokohama (Japan); Shiratori, M; Yu, Q; Koda, I [Yokohama National University, Yokohama (Japan)
1997-10-01
The authors proposed a new practical optimum design method, called the statistical design support system, which consists of five steps: effectivity analysis, reanalysis, evaluation of dispersion, optimization, and evaluation of structural reliability. In this study, the authors applied the system to the analysis and optimum design of an automobile seat frame subjected to crushing. The study showed that the method can be applied to complex nonlinear problems involving large deformation and material nonlinearity as well as impact. It was shown that the optimum design of the seat frame was solved easily using the present system. 6 refs., 5 figs., 5 tabs.
DEFF Research Database (Denmark)
Paulsen, Rasmus Reinhold; Larsen, Rasmus; Ersbøll, Bjarne Kjær
2002-01-01
…surface models are built by using the anatomical landmarks to warp a template mesh onto all shapes in the training set. Testing the gender-related differences is done by initially reducing the dimensionality using principal component analysis of the vertices of the warped meshes. The number of components to retain is chosen using Horn's parallel analysis. Finally a multivariate analysis of variance is performed on these components.
11.2 YIP Human In the Loop Statistical Relational Learners
2017-10-23
Papers appeared in a variety of top-tier conferences and journals, including the International Conference on AI (AAAI) and the International Conference on Data Mining (ICDM). Conference location: Atlantic City, NJ, USA. Paper title: Transfer Learning via Relational Type Matching. The methods address complex tasks such as natural language processing for extracting adverse drug events from text [5]. First we will describe the advice-based methods that
The statistical model of origin and evolution planets of Solar system and planetary satellities
Krot, A.
There are theories exploring Solar system formation in accordance with the Titius-Bode law: electromagnetic theories (Birkeland (1912), Alfven (1942)), gravitational theories (Schmidt (1944), Woolfson (1964), Safronov (1969), Dole (1970)), and nebular theories (Weizsaecker (1943), Kuiper (1949), Nakano (1970)) [1]-[3]. In spite of the great number of works aimed at exploring the formation of the Solar system, the mentioned theories have not been able to explain all phenomena. In this connection a statistical theory of cosmological body formation (the so-called spheroidal body model) has been proposed in [4]-[11]. Within the framework of this theory, bodies have fuzzy outlines and are represented by means of spheroidal forms. The work [6], a continuation of the papers [4], [5], investigated a slowly evolving process of gravitational compression of a spheroidal body close to an unstable equilibrium state. In the papers [7], [8] the equation of motion of particles inside a weakly gravitating spheroidal body modeled by means of an ideal liquid was obtained. Using Schwarzschild's and Kerr's metrics, the consistency of the proposed statistical model with general relativity was shown in [12]. The proposed theory proceeds from the conception of a spheroidal body forming as a protoplanet from a planetary nebula; it permits the derivation of the distribution functions for an immovable and a rotating spheroidal body [4]-[6], [10]-[13] as well as their mass densities (gravitational potentials and strengths), and also the distribution function of the specific angular momentum of a uniformly rotating spheroidal body [13], [14]. Using the specific angular momentum distribution function, this work considers a gas-dust protoplanetary cloud as a rotating and gravitating spheroidal body. Because the specific angular momenta are averaged during the conglomeration process, the specific angular momenta for the planets of the Solar system are found. As a result a
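The Titius-Bode rule referenced above is easy to state in code: a = 0.4 + 0.3·2^k AU, with Mercury as the degenerate k → -∞ case. This sketch only evaluates the empirical rule; it is not part of the spheroidal body model:

```python
def titius_bode(k):
    """Titius-Bode rule for planetary semi-major axes in AU:
    a = 0.4 + 0.3 * 2**k, with k = None standing for Mercury's
    degenerate k -> -infinity term (a = 0.4 AU)."""
    return 0.4 if k is None else 0.4 + 0.3 * 2 ** k

# Mercury, Venus, Earth, Mars, (asteroid belt), Jupiter, Saturn
predicted = [titius_bode(k) for k in [None, 0, 1, 2, 3, 4, 5]]
print(predicted)
```

The rule reproduces Earth at 1.0 AU and predicts a body near 2.8 AU (the asteroid belt), which is the regularity the statistical distribution of specific angular momentum aims to explain physically.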
International Nuclear Information System (INIS)
Gonchar, N.S.
1986-01-01
This paper presents a mathematical method developed for investigating a class of systems of infinite-dimensional integral equations which have application in statistical mechanics. Necessary and sufficient conditions are obtained for the uniqueness and bifurcation of the solution of this class of systems of equations. Problems of equilibrium statistical mechanics are considered on the basis of this method
On the Relation Between Management and Economics from the Perspective of Institutional Statistics
Directory of Open Access Journals (Sweden)
Dan Yang
2007-10-01
Full Text Available Every country's economic development affects all levels of its society and thus the results of its social science research. To make social science research better serve economic development, many countries have established social science research institutes, among which management research institutes are the most closely related to economic research institutes. Through comparative research on the locations and founding dates of the institutes in different countries, this article analyses the development trends of, and the relationship between, economics and management research, providing relevant experience and background for planning purposes.
Topological systems versus attachment relations | Guido ...
African Journals Online (AJOL)
The manuscript makes a category-theoretic comparison between the concepts of topological system of S. Vickers and attachment relation of C. Guido. With the help of the recent definition of the notions in the framework of an arbitrary variety of algebras, we consider functorial relationships between the categories of such ...
A statistical-based approach for fault detection and diagnosis in a photovoltaic system
Garoudja, Elyes
2017-07-10
This paper reports the development of a statistical approach for fault detection and diagnosis in a PV system. Specifically, the overarching goal of this work is to detect and identify, at an early stage, faults on the DC side of a PV system (e.g., short-circuit faults, open-circuit faults, and partial shading faults). Towards this end, we apply an exponentially-weighted moving average (EWMA) control chart to the residuals obtained from the one-diode model. This choice is motivated by the greater sensitivity of the EWMA chart to incipient faults and by its low computational cost, which makes it easy to implement in real time. Practical data from a 3.2 kWp photovoltaic plant located within an Algerian research center are used to validate the proposed approach. Results clearly show the efficiency of the developed method in monitoring PV system status.
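The EWMA-on-residuals scheme described above can be sketched as follows; the smoothing parameter λ = 0.2, the limit width L = 3, the 50-sample fault-free training window, and the simulated residuals are illustrative assumptions, not values from the paper:

```python
import numpy as np

def ewma_chart(residuals, lam=0.2, L=3.0):
    """EWMA control chart on model residuals.

    lam : smoothing parameter (0 < lam <= 1); small values increase
          sensitivity to incipient (small, persistent) faults.
    L   : control-limit width in standard deviations.
    Returns the EWMA statistic z and a boolean alarm mask.
    """
    r = np.asarray(residuals, dtype=float)
    mu, sigma = r[:50].mean(), r[:50].std(ddof=1)  # fault-free training window
    z = np.empty_like(r)
    z_prev = mu
    alarms = np.zeros(r.size, dtype=bool)
    for t, x in enumerate(r):
        z_prev = lam * x + (1 - lam) * z_prev
        z[t] = z_prev
        # time-varying control limits of the EWMA statistic
        var = sigma**2 * lam / (2 - lam) * (1 - (1 - lam) ** (2 * (t + 1)))
        alarms[t] = abs(z_prev - mu) > L * np.sqrt(var)
    return z, alarms

# Simulated residuals: fault-free, then a small persistent DC-side offset
rng = np.random.default_rng(0)
res = np.concatenate([rng.normal(0, 1, 200), rng.normal(1.5, 1, 100)])
z, alarms = ewma_chart(res)
print("first alarm at sample:", int(np.argmax(alarms)))
```

Because the EWMA statistic accumulates small shifts over time, the offset fault after sample 200 is flagged even though most individual residuals stay inside a Shewhart-style 3σ band.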
Statistical Analysis of Compression Methods for Storing Binary Image for Low-Memory Systems
Directory of Open Access Journals (Sweden)
Roman Slaby
2013-01-01
Full Text Available The paper is focused on the statistical comparison of selected compression methods used for the compression of binary images. The aim is to assess which of the presented compression methods requires the fewest bytes of memory on a low-memory system. Correlation functions are used to assess the success rate of converting the input image to a binary image; the correlation function is one of the methods of the OCR algorithm used for the digitization of printed symbols. The use of compression methods is necessary for systems based on low-power microcontrollers: for such memory-limited systems the size of the stored data stream is very important, as is the time required to decode the compressed data. The success rate of the selected compression algorithms is evaluated using the basic characteristics of exploratory analysis. The examined samples represent the number of bytes needed to compress the test images, which represent alphanumeric characters.
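As an illustration of the kind of byte-count comparison the paper performs, the sketch below compares a simple run-length encoding against the raw packed bitmap for a synthetic binary "character" image; the RLE scheme and the test image are assumptions chosen for illustration, not the methods or images evaluated in the paper:

```python
from itertools import groupby

def rle_size_bytes(bits):
    """Bytes needed to store a flattened binary image with a simple RLE:
    one byte per run, with runs longer than 255 split across bytes."""
    runs = [len(list(g)) for _, g in groupby(bits)]
    return sum((r + 254) // 255 for r in runs)

def packed_size_bytes(bits):
    """Bytes needed to store the raw bitmap, 8 pixels per byte."""
    return (len(bits) + 7) // 8

# 32x32 test image: a thick vertical bar, loosely resembling the glyph 'I'
img = [[1 if 12 <= x <= 19 else 0 for x in range(32)] for _ in range(32)]
flat = [p for row in img for p in row]
print(rle_size_bytes(flat), packed_size_bytes(flat))  # → 65 128
```

For structured glyph-like images the run count, not the pixel count, drives the RLE size, which is why such schemes suit microcontroller storage of printed symbols; for noisy images RLE can exceed the packed size.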
Selecting a Relational Database Management System for Library Automation Systems.
Shekhel, Alex; O'Brien, Mike
1989-01-01
Describes the evaluation of four relational database management systems (RDBMSs) (Informix Turbo, Oracle 6.0 TPS, Unify 2000 and Relational Technology's Ingres 5.0) to determine which is best suited for library automation. The evaluation criteria used to develop a benchmark specifically designed to test RDBMSs for libraries are discussed. (CLB)
International Nuclear Information System (INIS)
Levy, G.C.; Dudewicz, E.J.; Harner, T.J.
1989-01-01
The main research goal has been to evaluate significant factors affecting the in vivo magnetic resonance imaging (MRI) parameters R1, T2, and 1H density. This approach differs significantly from other such projects in that the experimental data analysis is being performed while concurrently developing automated, computer-aided analysis software for such MRI tissue parameters. In the experimental portion of the project, statistical analyses and a heuristic minimum/maximum discriminant analysis algorithm have been explored. Both methods have been used to classify tissue types from 1.5 Tesla transaxial MR images of the human brain. The program under development, written in the logic programming language Prolog, is similar in a number of ways to many existing expert systems now in use for other medical applications; inclusion of the underlying statistical data base and advanced statistical analyses is the main differentiating feature of the current approach. First results indicate promising classification accuracy for various brain tissues such as gray and white matter, as well as differentiation of different types of gray matter and white matter (e.g., caudate nucleus vs. thalamus, both representative of gray matter; and cortical white matter vs. internal capsule as representative of white matter). Taking all four tissue types together, the percentage of correct classifications ranges from 73 to 87%. (author)
Titov, A. G.; Okladnikov, I. G.; Gordov, E. P.
2017-11-01
The use of large geospatial datasets in climate change studies requires the development of a set of Spatial Data Infrastructure (SDI) elements, including geoprocessing and cartographical visualization web services. This paper presents the architecture of a geospatial OGC web service system as an integral part of a virtual research environment (VRE) general architecture for statistical processing and visualization of meteorological and climatic data. The architecture is a set of interconnected standalone SDI nodes with corresponding data storage systems. Each node runs specialized software, such as a geoportal, cartographical web services (WMS/WFS), a metadata catalog, and a MySQL database of technical metadata describing the geospatial datasets available on the node. It also contains geospatial data processing services (WPS) based on a modular computing backend implementing the statistical processing functionality, thus providing analysis of large datasets with results available for visualization and export to files in standard formats (XML, binary, etc.). Some cartographical web services have been developed in a system prototype to provide capabilities to work with raster and vector geospatial data based on OGC web services. The distributed architecture presented allows easy addition of new nodes, computing and data storage systems, and provides a solid computational infrastructure for regional climate change studies based on modern Web and GIS technologies.
Statistical mechanics of nonequilibrium liquids
Evans, Denis J; Craig, D P; McWeeny, R
1990-01-01
Statistical Mechanics of Nonequilibrium Liquids deals with theoretical rheology. The book discusses nonlinear response of systems and outlines the statistical mechanical theory. In discussing the framework of nonequilibrium statistical mechanics, the book explains the derivation of a nonequilibrium analogue of the Gibbsian basis for equilibrium statistical mechanics. The book reviews the linear irreversible thermodynamics, the Liouville equation, and the Irving-Kirkwood procedure. The text then explains the Green-Kubo relations used in linear transport coefficients, the linear response theory,
The evolution of the causation concept and its relation with statistical methods in Epidemiology
Directory of Open Access Journals (Sweden)
Luis Fernando Lisboa
2008-09-01
Full Text Available A historical review places the first records of Epidemiology in ancient Greece, with Hippocrates, who identified environmental causes of diseases. Over the centuries, the evolution of the causation concept came to be related to changes in scientific paradigms. In London, during the 17th century, the quantitative method was introduced into Epidemiology, but it was only by the end of the 19th century that the concept of the environment and a mathematical approach to understanding Public Health issues were well established. This was a very rich period for setting new concepts and systematizations in epidemiologic methodology. The beginning of the 20th century consolidated Epidemiology as a scientific discipline, and the development of computers in the post-war years brought much progress to this field. Nowadays, Epidemiology plays an important role as it integrates scientific knowledge on the health/disease process into professional practice, participating in population healthcare efforts.
International Nuclear Information System (INIS)
Derrida, B.; Flyvbjerg, H.
1987-02-01
The statistical properties of the multivalley structure of disordered systems and of randomly broken objects have many features in common. For all these problems, if W_s denotes the weight of the s-th piece, we show that the probability distributions P_1(W_1) of the largest piece W_1, P_2(W_2) of the second largest piece, and Π(Y) of Y = Σ_s W_s^2 always have singularities at W_1 = 1/n, W_2 = 1/n and Y = 1/n, for n = 1, 2, 3, ... (orig.)
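A Monte Carlo sketch of a randomly-broken-object ensemble makes the support of these statistics concrete; the choice of uniformly random cut points and of five pieces is an illustrative assumption, not the ensemble analysed in the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

def random_breaking(n_pieces):
    """Weights of a randomly broken unit object: n-1 uniform cut points."""
    cuts = np.sort(rng.random(n_pieces - 1))
    w = np.diff(np.concatenate(([0.0], cuts, [1.0])))
    return np.sort(w)[::-1]   # descending: W1 >= W2 >= ...

# Monte Carlo sample of W1 (largest piece) and Y = sum of W_s^2
samples = np.array([random_breaking(5) for _ in range(20000)])
W1 = samples[:, 0]
Y = (samples ** 2).sum(axis=1)

# W1 and Y are both bounded below by 1/5 for five pieces; the singular
# points 1/n noted in the abstract sit at the kinks of their densities.
print(round(W1.min(), 3), round(Y.min(), 3))
```

Histogramming W1 or Y from such samples exhibits the non-analytic behaviour at 1/2, 1/3, ... that the paper derives exactly.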
Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.
Westgard, James O; Westgard, Sten A
2017-03-01
Six sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form for this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications. Copyright © 2016 Elsevier Inc. All rights reserved.
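The sigma-metric described above is computed from the allowable total error and the method's observed bias and imprecision at a medical decision level; a minimal sketch with illustrative (assumed) numbers, not values from the article:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma-metric for an analytical examination process.

    tea_pct  : allowable total error (%), the quality requirement
    bias_pct : observed bias of the method (%)
    cv_pct   : observed imprecision (CV, %)
    """
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical example: TEa = 6%, bias = 1%, CV = 1.25% at the decision level
print(round(sigma_metric(6.0, 1.0, 1.25), 2))  # → 4.0
```

Higher sigma values indicate lower risk and permit simpler SQC rules with fewer control measurements; lower values call for more stringent multirule SQC designs.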
Shewhart, Mark
1991-01-01
Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
Preliminary study of energy confinement data with a statistical analysis system in HL-2A tokamak
International Nuclear Information System (INIS)
Xu Yuan; Cui Zhengying; Ji Xiaoquan; Dong Chunfeng; Yang Qingwei; O J W F Kardaun
2010-01-01
Taking advantage of HL-2A experimental data, an energy confinement database oriented to the ITERL DB2.0 version has been established for the first time. For this database, the widely used statistical analysis system (SAS) has been adopted for the first time to analyze and evaluate the confinement data from HL-2A, and research on the scaling of energy confinement time with plasma density has been carried out, with some preliminary results achieved. Finally, by comparison with both the ITER scaling law and the previous ASDEX database, the L-mode confinement quality on HL-2A and the influence of temperature on the Spitzer resistivity are discussed. (authors)
International Nuclear Information System (INIS)
Anosova, Z.P.
1988-01-01
A statistical criterion is proposed for distinguishing between random and physical groupings of stars and galaxies. The criterion is applied to nearby wide multiple stars, triplets of galaxies in the list of Karachentsev, Karachentseva, and Shcherbanovskii, and double galaxies in the list of Dahari, in which the principal components are Seyfert galaxies. Systems that are almost certainly physical, probably physical, probably optical, and almost certainly optical are identified. The limiting difference between the radial velocities of the components of physical multiple galaxies is estimated
Quantum integrable systems related to lie algebras
International Nuclear Information System (INIS)
Olshanetsky, M.A.; Perelomov, A.M.
1983-01-01
Some quantum integrable finite-dimensional systems related to Lie algebras are considered. This review continues the previous review by the same authors (1981) devoted to the classical aspects of these systems. The dynamics of some of these systems is closely related to free motion in symmetric spaces. Using this connection with the theory of symmetric spaces, results such as the forms of spectra, wave functions, S-matrices, and quantum integrals of motion are derived. In specific cases the considered systems describe one-dimensional n-body systems interacting pairwise via potentials g^2 v(q) of the following five types: v_I(q) = q^(-2), v_II(q) = sinh^(-2) q, v_III(q) = sin^(-2) q, v_IV(q) = P(q), v_V(q) = q^(-2) + β^2 q^2. Here P(q) is the Weierstrass function, so that the first three cases are merely subcases of the fourth. The system characterized by the Toda nearest-neighbour potential exp(q_j - q_(j+1)) is moreover considered. This review presents, from a general and universal point of view, results obtained mainly over the past fifteen years. Besides, it contains some new results of both physical and mathematical interest. (orig.)
Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.
2014-12-01
An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in the literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the result of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is lack of data that can affect the rigor of LCAs. We have developed an approach to estimate environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.
Walz, M. A.; Donat, M.; Leckebusch, G. C.
2017-12-01
As extreme wind speeds are responsible for large socio-economic losses in Europe, a skillful prediction would be of great benefit for disaster prevention as well as for the actuarial community. Here we evaluate patterns of large-scale atmospheric variability and the seasonal predictability of extreme wind speeds (e.g. >95th percentile) in the European domain in the dynamical seasonal forecast system ECMWF System 4, and compare to the predictability based on a statistical prediction model. The dominant patterns of atmospheric variability show distinct differences between reanalysis and ECMWF System 4, with most patterns in System 4 extended downstream in comparison to ERA-Interim. The dissimilar manifestations of the patterns within the two models lead to substantially different drivers associated with the occurrence of extreme winds in the respective model. While the ECMWF System 4 is shown to provide some predictive power over Scandinavia and the eastern Atlantic, only very few grid cells in the European domain have significant correlations for extreme wind speeds in System 4 compared to ERA-Interim. In contrast, a statistical model predicts extreme wind speeds during boreal winter in better agreement with the observations. Our results suggest that System 4 does not seem to capture the potential predictability of extreme winds that exists in the real world, and therefore fails to provide reliable seasonal predictions for lead months 2-4. This is likely related to the unrealistic representation of large-scale patterns of atmospheric variability. Hence our study points to potential improvements of dynamical prediction skill by improving the simulation of large-scale atmospheric dynamics.
Identification of Crew-Systems Interactions and Decision Related Trends
Jones, Sharon Monica; Evans, Joni K.; Reveley, Mary S.; Withrow, Colleen A.; Ancel, Ersin; Barr, Lawrence
2013-01-01
NASA Vehicle System Safety Technology (VSST) project management uses systems analysis to identify key issues and maintain a portfolio of research leading to potential solutions to its three identified technical challenges. Statistical data and published safety priority lists from academic, industry and other government agencies were reviewed and analyzed by NASA Aviation Safety Program (AvSP) systems analysis personnel to identify issues and future research needs related to one of VSST's technical challenges, Crew Decision Making (CDM). The data examined in the study were obtained from the National Transportation Safety Board (NTSB) Aviation Accident and Incident Data System, Federal Aviation Administration (FAA) Accident/Incident Data System and the NASA Aviation Safety Reporting System (ASRS). In addition, this report contains the results of a review of safety priority lists, information databases and other documented references pertaining to aviation crew systems issues and future research needs. The specific sources examined were: Commercial Aviation Safety Team (CAST) Safety Enhancements Reserved for Future Implementation (SERFIs), Flight Deck Automation Issues (FDAI) and NTSB Most Wanted List and Open Recommendations. Various automation issues taxonomies and priority lists pertaining to human factors, automation and flight design were combined to create a list of automation issues related to CDM.
Tan, Yen Hock; Huang, He; Kihara, Daisuke
2006-08-15
Aligning distantly related protein sequences is a long-standing problem in bioinformatics, and a key for successful protein structure prediction. Its importance is increasing recently in the context of structural genomics projects because more and more experimentally solved structures are available as templates for protein structure modeling. Toward this end, recent structure prediction methods employ profile-profile alignments, and various ways of aligning two profiles have been developed. More fundamentally, a better amino acid similarity matrix can improve a profile itself; thereby resulting in more accurate profile-profile alignments. Here we have developed novel amino acid similarity matrices from knowledge-based amino acid contact potentials. Contact potentials are used because the contact propensity to the other amino acids would be one of the most conserved features of each position of a protein structure. The derived amino acid similarity matrices are tested on benchmark alignments at three different levels, namely, the family, the superfamily, and the fold level. Compared to BLOSUM45 and the other existing matrices, the contact potential-based matrices perform comparably in the family level alignments, but clearly outperform in the fold level alignments. The contact potential-based matrices perform even better when suboptimal alignments are considered. Comparing the matrices themselves with each other revealed that the contact potential-based matrices are very different from BLOSUM45 and the other matrices, indicating that they are located in a different basin in the amino acid similarity matrix space.
Statistical analysis of dispersion relations in turbulent solar wind fluctuations using Cluster data
Perschke, C.; Narita, Y.
2012-12-01
Multi-spacecraft measurements enable us to resolve three-dimensional spatial structures without assuming Taylor's frozen-in-flow hypothesis. This is very useful for studying the frequency-wave vector diagram in solar wind turbulence through direct determination of three-dimensional wave vectors. The existence and evolution of dispersion relations and their role in fully-developed plasma turbulence have been drawing the attention of physicists, in particular regarding whether solar wind turbulence represents the kinetic Alfvén or whistler mode as the carrier of spectral energy among different scales through wave-wave interactions. We investigate solar wind intervals of Cluster data for various flow velocities with a high-resolution wave vector analysis method, the Multi-point Signal Resonator technique, at a tetrahedral separation of about 100 km. Magnetic field data and ion data are used to determine the frequency-wave vector diagrams in the co-moving frame of the solar wind. We find primarily perpendicular wave vectors in solar wind turbulence, which justifies the earlier discussions about kinetic Alfvén or whistler waves. The frequency-wave vector diagrams confirm (a) wave vector anisotropy and (b) scattering in frequencies.
International Nuclear Information System (INIS)
Kim, Seung Kook
1990-01-01
The paper is based on the records of cancer patients at the Chun-nam National University Hospital from September 1985 to December 1988. The results are as follows: 1. Of the total 921,028 O.P.D. patients, 27,159 (2.95%) were patients of Therapeutic Radiology (which opened in September 1985): 186 in 1985, 2,388 in 1986, 10,511 in 1987, and 14,074 in 1988. 2. Among the 4,925 cancer patients, cervix and uterus cancer patients numbered 1,138 (23.10%), stomach cancer 592 (12.02%), brain and thyroid cancer 565 (11.47%), liver cancer 400 (8.12%), and lung cancer 355 (7.20%); the sex ratio was 1:1.13, so female patients slightly outnumbered male patients. 3. In the age distribution of cancer, ages 45-54 accounted for 1,244 (25.26%), ages 55-64 for 1,119 (22.72%), and ages 35-44 for 773 (15.70%); half of all cancer patients were aged 45-64. 4. Among the 2,519 cancer patients, 742 (29.46%) were in the uterus system, 620 (24.62%) in the brain and thyroid part, and 402 (15.96%) in the lungs; these three kinds of cancer make up 70%. 5. In the occupational distribution of 3,067 cancer patients (1987-88), housewives numbered 636 (20.73%), followed by farmers 622 (20.28%), public service personnel 193 (6.29%), salarymen 162 (5.28%), and businessmen 159 (5.18%)
Energy Technology Data Exchange (ETDEWEB)
Li, Ke; Tang, Jie [Department of Medical Physics, University of Wisconsin-Madison, 1111 Highland Avenue, Madison, Wisconsin 53705 (United States); Chen, Guang-Hong, E-mail: gchen7@wisc.edu [Department of Medical Physics, University of Wisconsin-Madison, 1111 Highland Avenue, Madison, Wisconsin 53705 and Department of Radiology, University of Wisconsin-Madison, 600 Highland Avenue, Madison, Wisconsin 53792 (United States)
2014-04-15
Purpose: To reduce radiation dose in CT imaging, the statistical model based iterative reconstruction (MBIR) method has been introduced for clinical use. Based on the principle of MBIR and its nonlinear nature, the noise performance of MBIR is expected to differ from that of the well-understood filtered backprojection (FBP) reconstruction method. The purpose of this work is to experimentally assess the unique noise characteristics of MBIR using a state-of-the-art clinical CT system. Methods: Three physical phantoms, including a water cylinder and two pediatric head phantoms, were scanned in axial scanning mode using a 64-slice CT scanner (Discovery CT750 HD, GE Healthcare, Waukesha, WI) at seven different mAs levels (5, 12.5, 25, 50, 100, 200, 300). At each mAs level, each phantom was repeatedly scanned 50 times to generate an image ensemble for noise analysis. Both the FBP method with a standard kernel and the MBIR method (Veo®, GE Healthcare, Waukesha, WI) were used for CT image reconstruction. Three-dimensional (3D) noise power spectrum (NPS), two-dimensional (2D) NPS, and zero-dimensional NPS (noise variance) were assessed both globally and locally. Noise magnitude, noise spatial correlation, noise spatial uniformity and their dose dependence were examined for the two reconstruction methods. Results: (1) At each dose level and at each frequency, the magnitude of the NPS of MBIR was smaller than that of FBP. (2) While the shape of the NPS of FBP was dose-independent, the shape of the NPS of MBIR was strongly dose-dependent; lower dose led to a “redder” NPS with a lower mean frequency value. (3) The noise standard deviation (σ) of MBIR and dose were found to be related through a power law σ ∝ (dose)^(−β) with exponent β ≈ 0.25, which violates the classical σ ∝ (dose)^(−0.5) power law of FBP. (4) With MBIR, noise reduction was most prominent for thin image slices. (5) MBIR led to better noise spatial
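The exponent β in such a power law is the negative slope of a log-log fit of noise against dose. The sketch below estimates it from synthetic σ values generated to follow the reported β ≈ 0.25; the mAs grid mirrors the scan protocol, but the noise numbers are made up for illustration:

```python
import numpy as np

# Hypothetical noise measurements following σ ∝ dose^(−β);
# the prefactor 80.0 and the σ values are illustrative, not measured data.
dose = np.array([5, 12.5, 25, 50, 100, 200, 300], dtype=float)
sigma = 80.0 * dose ** -0.25   # synthetic MBIR-like noise data

# β is the negative slope of log σ versus log dose
beta = -np.polyfit(np.log(dose), np.log(sigma), 1)[0]
print(round(beta, 2))  # → 0.25
```

For FBP the same fit on real data would recover β ≈ 0.5, which is one way the dose dependence of the two reconstruction methods can be compared quantitatively.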
Understanding Statistics - Cancer Statistics
Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.
Testing relativity with solar system dynamics
Hellings, R. W.
1984-01-01
A major breakthrough is described in the accuracy of Solar System dynamical tests of relativistic gravity. The breakthrough was achieved by factoring in ranging data from Viking Landers 1 and 2 from the surface of Mars. Other key data sources included optical transit circle observations, lunar laser ranging, planetary radar, and spacecraft (Mariner 9 to Mars and Mariner 10 to Mercury). The Solar System model which is used to fit the data and the process by which such fits are performed are explained and results are discussed. The results are fully consistent with the predictions of General Relativity.
Yasukawa, Kazutaka; Nakamura, Kentaro; Fujinaga, Koichiro; Ikehara, Minoru; Kato, Yasuhiro
2017-09-12
Multiple transient global warming events occurred during the early Palaeogene. Although these events, called hyperthermals, have been reported from around the globe, geologic records for the Indian Ocean are limited. In addition, the recovery processes from relatively modest hyperthermals are less constrained than those from the severest and well-studied hothouse called the Palaeocene-Eocene Thermal Maximum. In this study, we constructed a new and high-resolution geochemical dataset of deep-sea sediments clearly recording multiple Eocene hyperthermals in the Indian Ocean. We then statistically analysed the high-dimensional data matrix and extracted independent components corresponding to the biogeochemical responses to the hyperthermals. The productivity feedback commonly controls and efficiently sequesters the excess carbon in the recovery phases of the hyperthermals via an enhanced biological pump, regardless of the magnitude of the events. Meanwhile, this negative feedback is independent of nannoplankton assemblage changes generally recognised in relatively large environmental perturbations.
Ramkilowan, A.; Griffith, D. J.
2017-10-01
Surveillance modelling in terms of the standard Detect, Recognise and Identify (DRI) thresholds remains a key requirement for determining the effectiveness of surveillance sensors. With readily available computational resources it has become feasible to perform statistically representative evaluations of the effectiveness of these sensors. A new capability for performing this Monte-Carlo type analysis is demonstrated in the MORTICIA (Monte-Carlo Optical Rendering for Theatre Investigations of Capability under the Influence of the Atmosphere) software package developed at the Council for Scientific and Industrial Research (CSIR). This first-generation, python-based open-source integrated software package, currently in the alpha stage of development, aims to provide all the functionality required to perform statistical investigations of the effectiveness of optical surveillance systems in specific or generic deployment theatres. This includes modelling of the mathematical and physical processes that govern, amongst other components of a surveillance system, a sensor's detector and optical components, a target and its background, as well as the intervening atmospheric influences. In this paper we discuss integral aspects of the bespoke framework that are critical to the longevity of all subsequent modelling efforts. Additionally, some preliminary results are presented.
International Nuclear Information System (INIS)
Fukuda, Toshio; Mitsuoka, Toyokazu.
1985-01-01
The detection of leaks in piping systems is an important diagnostic technique for preventing accidents and planning maintenance, since the occurrence of a leak lowers productivity and causes environmental damage. The first step is to detect the occurrence of a leak without delay; as a second step, if the location of the leak in the piping system can be estimated, countermeasures become easier. Detection by pressure is usually used for large leaks, but because the pressure-based method is simple and advantageous, this study examines extending the pressure gradient method, combined with statistical analysis techniques, to the detection of smaller leaks in a pipeline in steady operation. Since the flow in a pipe varies irregularly during pumping, statistical means are required to detect a small leak from pressure data. The index for detecting leaks proposed in this paper is the difference between the pressure gradients at the two ends of the pipeline. Experimental results on water and air in nylon tubes are reported. (Kako, I.)
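A minimal sketch of such a pressure-gradient-difference index follows; the repeated gradient estimates at each end, the k-standard-error decision rule, and all numbers are illustrative assumptions, not the paper's experimental procedure:

```python
import numpy as np

def leak_detected(grad_in, grad_out, k=3.0):
    """Leak indicator from the difference of pressure gradients measured
    at the two ends of a pipeline (the index named in the abstract).

    grad_in, grad_out : arrays of repeated gradient estimates (Pa/m).
    Returns (index, detected): the mean gradient difference and whether
    it exceeds k standard errors of the mean, treating pumping-induced
    fluctuations as zero-mean noise (an assumed thresholding rule).
    """
    d = np.asarray(grad_in) - np.asarray(grad_out)
    index = d.mean()
    se = d.std(ddof=1) / np.sqrt(d.size)
    return index, abs(index) > k * se

rng = np.random.default_rng(1)
# No leak: both ends see the same mean gradient plus pumping noise
g_in = -50.0 + rng.normal(0, 2.0, 500)
g_out = -50.0 + rng.normal(0, 2.0, 500)
print(leak_detected(g_in, g_out)[1])

# Small leak: the gradient steepens upstream and flattens downstream
g_in_leak = g_in - 1.0
g_out_leak = g_out + 1.0
print(leak_detected(g_in_leak, g_out_leak)[1])
```

Averaging over many samples is what lets a gradient shift much smaller than the pumping fluctuations be resolved, which is the statistical point of the study.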
Official Statistics and Statistics Education: Bridging the Gap
Directory of Open Access Journals (Sweden)
Gal Iddo
2017-03-01
Full Text Available This article aims to challenge official statistics providers and statistics educators to ponder how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens’ access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding official statistics.
Li, Jin-Na; Er, Meng-Joo; Tan, Yen-Kheng; Yu, Hai-Bin; Zeng, Peng
2015-09-01
This paper investigates an adaptive sampling rate control scheme for networked control systems (NCSs) subject to packet disordering. The main objectives of the proposed scheme are (a) to avoid heavy packet disordering existing in communication networks and (b) to stabilize NCSs with packet disordering, transmission delay and packet loss. First, a novel sampling rate control algorithm based on statistical characteristics of disordering entropy is proposed; secondly, an augmented closed-loop NCS that consists of a plant, a sampler and a state-feedback controller is transformed into an uncertain and stochastic system, which facilitates the controller design. Then, a sufficient condition for stochastic stability in terms of Linear Matrix Inequalities (LMIs) is given. Moreover, an adaptive tracking controller is designed such that the sampling period tracks a desired sampling period, which represents a significant contribution. Finally, experimental results are given to illustrate the effectiveness and advantages of the proposed scheme. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
Statistical Requirements For Pass-Fail Testing Of Contraband Detection Systems
International Nuclear Information System (INIS)
Gilliam, David M.
2011-01-01
Contraband detection systems for homeland security applications are typically tested for probability of detection (PD) and probability of false alarm (PFA) using pass-fail testing protocols. Test protocols usually require specified values for PD and PFA to be demonstrated at a specified level of statistical confidence CL. Based on a recent more theoretical treatment of this subject [1], this summary reviews the definition of CL and provides formulas and spreadsheet functions for constructing tables of general test requirements and for determining the minimum number of tests required. The formulas and tables in this article may be generally applied to many other applications of pass-fail testing, in addition to testing of contraband detection systems.
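The required-sample-size calculation reduces to standard binomial demonstration-test logic. The sketch below uses the generic formulas (function names are mine, and this is not necessarily the exact tabulation of [1]): a zero-failure plan has a closed form, and allowing a small number of misses becomes a binomial tail search.

```python
import math

def min_tests_zero_failures(pd_spec, confidence):
    # After n straight detections, the one-sided confidence that
    # PD >= pd_spec is 1 - pd_spec**n; solve for the smallest n.
    return math.ceil(math.log(1.0 - confidence) / math.log(pd_spec))

def min_tests(pd_spec, confidence, failures=0):
    # General case allowing up to `failures` misses: the smallest n with
    # P(misses <= failures | PD = pd_spec) <= 1 - confidence,
    # where misses ~ Binomial(n, 1 - pd_spec).
    n = failures + 1
    while True:
        tail = sum(math.comb(n, k) * (1 - pd_spec) ** k * pd_spec ** (n - k)
                   for k in range(failures + 1))
        if tail <= 1.0 - confidence:
            return n
        n += 1

# Demonstrating PD >= 0.95 at 95 % confidence needs 59 straight
# detections, or 93 trials if a single miss is to be tolerated.
print(min_tests_zero_failures(0.95, 0.95), min_tests(0.95, 0.95, failures=1))
```

The same search works for PFA requirements by exchanging the roles of success and failure.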
STATISTIC MODEL OF DYNAMIC DELAY AND DROPOUT ON CELLULAR DATA NETWORKED CONTROL SYSTEM
Directory of Open Access Journals (Sweden)
MUHAMMAD A. MURTI
2017-07-01
Full Text Available Delay and dropout are important parameters that influence overall control performance in a Networked Control System (NCS). The goal of this research is to find a model of the delay and dropout of the data communication link in the NCS. Experiments were performed on water level control of a boiler tank, operated as an NCS over an internet connection using High Speed Packet Access (HSPA) cellular technology. From these experiments, the closed-loop system response was obtained together with the delay and dropout of the data packets. This research contributes a model of the NCS as a combination of the controlled plant and the data communication link, and a statistical model of delay and dropout on the NCS.
Statistical method application to knowledge base building for reactor accident diagnostic system
International Nuclear Information System (INIS)
Yoshida, Kazuo; Yokobayashi, Masao; Matsumoto, Kiyoshi; Kohsaka, Atsuo
1989-01-01
In the development of a knowledge-based expert system, one of the key issues is how to build the knowledge base (KB) efficiently while keeping it objective. To address this issue, an approach has been proposed to build a prototype KB systematically by a statistical method, factor analysis. To verify this approach, factor analysis was applied to build a prototype KB for the JAERI expert system DISKET. To this end, alarm and process information was generated by a PWR simulator, and factor analysis was applied to this information to define a taxonomy of accident hypotheses and to extract rules for each hypothesis. The prototype KB thus built was tested by inference against several types of transients, including double failures. In each diagnosis, the transient type was well identified. Furthermore, newly introduced standards for rule extraction clearly enhanced the performance of the prototype KB. (author)
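A rough numerical illustration of the idea (the alarm data and loading structure below are invented, not DISKET output, and principal-factor extraction from the correlation matrix stands in for the paper's factor analysis): variables that load heavily on the same factor are grouped under one accident hypothesis, and each group seeds a candidate diagnostic rule.

```python
import numpy as np

# Rows = simulated transients, columns = alarm/process variables.
rng = np.random.default_rng(1)
latent = rng.normal(size=(200, 2))               # two underlying accident types
mix = np.array([[1.0, 0.9, 0.8, 0.0, 0.0],       # factor 1 drives alarms 0-2
                [0.0, 0.0, 0.0, 1.0, 0.7]])      # factor 2 drives alarms 3-4
X = latent @ mix + 0.1 * rng.normal(size=(200, 5))

# Principal-factor extraction from the correlation matrix.
R = np.corrcoef(X, rowvar=False)
eigval, eigvec = np.linalg.eigh(R)
order = np.argsort(eigval)[::-1][:2]             # keep the two largest factors
loadings = eigvec[:, order] * np.sqrt(eigval[order])

# Assign each alarm to the factor it loads on most strongly; each
# resulting group becomes one accident-hypothesis rule candidate.
groups = np.argmax(np.abs(loadings), axis=1)
print(groups)
```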
STATISTICAL INVESTIGATION OF THE GROUNDWATER SYSTEM IN DARB EL-ARBAEIN, SOUTHWESTERN DESERT, EGYPT
Directory of Open Access Journals (Sweden)
Kashouty Mohamed El
2009-12-01
Full Text Available In Darb El Arbaein, groundwater is the only water resource. The aquifer system ranges from Paleozoic-Mesozoic to Upper Cretaceous sandstone rocks; it overlies the basement rocks, and the aquifer is confined. In the present research, the performance of statistical analyses in classifying groundwater samples by their chemical characteristics has been tested. The hydrogeological and hydrogeochemical data of 92 groundwater samples were obtained from the GARPAD authority in northern, central, and southern Darb El Arbaein. A robust classification scheme for partitioning groundwater chemistry into homogeneous groups is an important tool for characterizing the Nubian sandstone aquifer. We tested the performance of many available graphical and statistical methodologies used to classify water samples: R-mode, Q-mode, correlation analysis, and principal component analysis were investigated. All the methods were discussed and compared as to their ability to cluster, ease of use, and ease of interpretation. The correlation investigation clarifies the relationship among lithology, hydrogeology, and anthropogenic activity. Factor investigation revealed three factors, namely: the evaporation process-agricultural impact-lithogenic dissolution, the hydrogeological characteristics of the aquifer system, and the surface meteoric water that recharges the aquifer system. Two main clusters, subdivided into four subclusters, were identified in the groundwater system based on hydrogeological and hydrogeochemical data. They reflect the impact of geomedia, hydrogeology, geographic position, and agricultural wastewater. The groundwater is undersaturated with respect to most selected minerals, but supersaturated with respect to iron minerals in northern and southern Darb El Arbaein. The partial pressure of CO2 of the groundwater versus the saturation index of calcite shows the gradual change in PCO2 from atmospheric to the present aquifer
International Nuclear Information System (INIS)
Ishii, Kyoko; Matsumiya, Hisato; Horie, Hideki; Miyagi, Kazumi
2009-01-01
The purpose of this work is to evaluate quantitatively and statistically the safety performance of the Super-Safe, Small, and Simple reactor (4S) by analysis with the ARGO code, a plant dynamics code for a sodium-cooled fast reactor. In this evaluation, an Anticipated Transient Without Scram (ATWS) is assumed, and an Unprotected Loss of Flow (ULOF) event is selected as a typical ATWS case. After a metric concerned with safety design is defined as a performance factor, a Phenomena Identification Ranking Table (PIRT) is produced in order to select the plausible phenomena that affect the metric. A sensitivity analysis is then performed for the parameters related to the selected plausible phenomena. Finally, statistical methods are used to evaluate whether the metric satisfies the given safety acceptance criteria. The result is as follows: the Cumulative Damage Fraction (CDF) for the cladding is defined as the metric, and the statistical estimate of the one-sided upper tolerance limit of 95 percent probability at a 95 percent confidence level in CDF is within the safety acceptance criterion CDF < 0.1. The result shows that the 4S safety performance is acceptable in the ULOF event. (author)
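The 95 %/95 % one-sided upper tolerance limit quoted above is conventionally obtained with Wilks' nonparametric formula: with n code runs and the sample maximum taken as the bound, the smallest n satisfying 1 − 0.95ⁿ ≥ 0.95 is 59. A quick Monte Carlo check of that criterion (generic, not tied to ARGO or to the CDF metric):

```python
import numpy as np

# With n = 59 runs and the sample maximum taken as the upper limit, the
# limit covers the true 95th percentile with probability
# 1 - 0.95**59 ~ 0.95, whatever the output distribution is.
rng = np.random.default_rng(42)
n_runs, trials = 59, 2000
covered = 0
for _ in range(trials):
    sample = rng.random(n_runs)      # outputs expressed through their own CDF
    if sample.max() >= 0.95:         # does the max cover the 95th percentile?
        covered += 1
print(round(covered / trials, 3))    # close to the theoretical 0.9515
```

Because the argument works through the CDF, uniform samples suffice for the check; 58 runs would fall just short of the criterion.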
Group & Intergroup Relations in Living Human Systems.
1980-06-01
In organizational diagnosis, the group is itself a living human system. A group may be underbounded, overbounded, or optimally bounded, and the state of a group's boundaries is very important to understand and to use in order to conduct organizational diagnosis using group methods (Alderfer, 1977b). Reference: Alderfer, Boundary Relations and Organizational Diagnosis, in H. Meltzer and F.W. Wickert (eds.), Humanizing Organizational Behavior. Springfield, Illinois: Thomas.
Energy Technology Data Exchange (ETDEWEB)
Jha, Sumit Kumar [University of Central Florida, Orlando; Pullum, Laura L [ORNL; Ramanathan, Arvind [ORNL
2016-01-01
Embedded intelligent systems ranging from tiny implantable biomedical devices to large swarms of autonomous unmanned aerial systems are becoming pervasive in our daily lives. While we depend on the flawless functioning of such intelligent systems, and often take their behavioral correctness and safety for granted, it is notoriously difficult to generate test cases that expose subtle errors in the implementations of machine learning algorithms. Hence, the validation of intelligent systems is usually achieved by studying their behavior on representative data sets, using methods such as cross-validation and bootstrapping. In this paper, we present a new testing methodology for studying the correctness of intelligent systems. Our approach uses symbolic decision procedures coupled with statistical hypothesis testing. We also use our algorithm to analyze the robustness of a human detection algorithm built using the OpenCV open-source computer vision library. We show that the human detection implementation can fail to detect humans in perturbed video frames even when the perturbations are so small that the corresponding frames look identical to the naked eye.
Statistical evaluation of information reported to ISI and ISKO systems from a safety point of view
International Nuclear Information System (INIS)
Alonso Pallares, C.
1993-01-01
This paper describes the percentages of events attributable to the main systems or equipment groups that caused, or were directly linked to, incidents. Command and protection systems, primary-circuit equipment (BPC, VPC, volume compensator), safety systems, the reactor installation, and electrical supply systems are analyzed. Moreover, the main causes of notified events are highlighted; operating experience from WWER-type nuclear power plants shows that an important part of safety-related incidents is due to personnel errors
Analysis of TCE Fate and Transport in Karst Groundwater Systems Using Statistical Mixed Models
Anaya, A. A.; Padilla, I. Y.
2012-12-01
Karst groundwater systems are highly productive and provide an important fresh water resource for human development and ecological integrity. Their high productivity is often associated with conduit flow and high matrix permeability. The same characteristics that make these aquifers productive also make them highly vulnerable to contamination and a likely route for contaminant exposure. Of particular interest are trichloroethylene (TCE) and di-(2-ethylhexyl) phthalate (DEHP). These chemicals have been identified as potential precursors of pre-term birth, a leading cause of neonatal complications with a significant health and societal cost. Both of these contaminants have been found in the karst groundwater formations in this area of the island. The general objectives of this work are to: (1) develop fundamental knowledge and determine the processes controlling the release, mobility, persistence, and possible pathways of contaminants in karst groundwater systems, and (2) characterize transport processes in conduit- and diffusion-dominated flow under base-flow and storm-flow conditions. The work presented herein focuses on the use of geo-hydro statistical tools to characterize flow and transport processes under different flow regimes, and their application in the analysis of the fate and transport of TCE. Multidimensional, laboratory-scale Geo-Hydrobed models (GHM) were used for this purpose. The models consist of stainless-steel tanks containing karstified limestone blocks collected from the karst aquifer formation of northern Puerto Rico. The models integrate a network of sampling wells to monitor flow, pressure, and solute concentrations temporally and spatially. Experimental work entails injecting dissolved CaCl2 tracers and TCE at the upstream boundary of the GHM while monitoring TCE and tracer concentrations spatially and temporally in the limestone under different groundwater flow regimes. Analysis of the temporal and spatial concentration distributions of solutes
Statistical mechanics of a time-homogeneous system of money and antimoney
Schmitt, Matthias; Schacker, Andreas; Braun, Dieter
2014-03-01
Financial crises appear throughout human history. While there are many schools of thought on what the actual causes of such crises are, it has been suggested that the creation of credit money might be a source of financial instability. We discuss how the credit mechanism in a system of fractional reserve banking leads to non-local transfers of purchasing power that also affect non-involved agents. To overcome this issue, we impose the local symmetry of time homogeneity on the monetary system. A bi-currency system of non-bank assets (money) and bank assets (antimoney) is considered. A payment is either made by passing on money or by receiving antimoney. As a result, a free floating exchange rate between non-bank assets and bank assets is established. Credit creation is replaced by the simultaneous transfer of money and antimoney at a negotiated exchange rate. This is in contrast to traditional discussions of full reserve banking, which stalls creditary lending. With money and antimoney, the problem of credit crunches is mitigated while a full time symmetry of the monetary system is maintained. As a test environment for such a monetary system, we discuss an economy of random transfers. Random transfers are a strong criterion to probe the stability of monetary systems. The analysis using statistical physics provides analytical solutions and confirms that a money-antimoney system could be functional. Equally important to the probing of the stability of such a monetary system is the question of how to implement the credit default dynamics. This issue remains open.
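A minimal version of such a random-transfer test environment can be sketched as follows. This is the standard money-only toy model (parameters invented; the antimoney channel and the negotiated exchange rate of the paper are deliberately omitted): each step moves a fixed amount from a random payer to a random payee, with balances kept non-negative and total money conserved.

```python
import numpy as np

# Random-transfer economy: a strong, simple probe of monetary stability.
# Statistical physics predicts that repeated random transfers drive the
# balance distribution toward an exponential (Boltzmann-Gibbs) form.
rng = np.random.default_rng(7)
agents, steps, amount = 500, 200_000, 1.0
money = np.full(agents, 100.0)          # everyone starts with equal balance

for _ in range(steps):
    i, j = rng.integers(agents, size=2)
    if i != j and money[i] >= amount:   # payer must be able to pay
        money[i] -= amount
        money[j] += amount

print(round(money.mean(), 1))           # total conserved: the mean stays 100.0
```

The paper's bi-currency system replaces the non-negativity constraint with simultaneous money/antimoney transfers, which is what restores time homogeneity.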
Niemann, Brand Lee
A major field program to study beta-mesoscale transport and dispersion over complex mountainous terrain was conducted during 1969 with the cooperation of three government agencies at the White Sands Missile Range in central Utah. The purpose of the program was to measure simultaneously on a large number of days the synoptic and mesoscale wind fields, the relative dispersion between pairs of particle trajectories and the rate of small scale turbulence dissipation. The field program included measurements during more than 60 days in the months of March, June, and November. The large quantity of data generated from this program has been processed and analyzed to provide case studies and statistics to evaluate and refine Lagrangian variable trajectory models. The case studies selected to illustrate the complexities of mesoscale transport and dispersion over complex terrain include those with terrain blocking, lee waves, and stagnation, as well as those with large vertical wind shears and horizontal wind field deformation. The statistics of relative particle dispersion were computed and compared to the classical theories of Richardson and Batchelor and the more recent theories of Lin and Kao among others. The relative particle dispersion was generally found to increase with travel time in the alongwind and crosswind directions, but in a more oscillatory than sustained or even accelerated manner as predicted by most theories, unless substantial wind shears or finite vertical separations between particles were present. The relative particle dispersion in the vertical was generally found to be small and bounded even when substantial vertical motions due to lee waves were present because of the limiting effect of stable temperature stratification. The data show that velocity shears have a more significant effect than turbulence on relative particle dispersion and that sufficient turbulence may not always be present above the planetary boundary layer for "wind direction shear
Tayurskii, Dmitrii; Abe, Sumiyoshi; Alexandre Wang, Q.
2012-11-01
The 3rd International Workshop on Statistical Physics and Mathematics for Complex Systems (SPMCS2012) was held on 25-30 August at Kazan (Volga Region) Federal University, Kazan, Russian Federation. The workshop was jointly organized by Kazan Federal University and the Institut Supérieur des Matériaux et Mécaniques Avancées (ISMANS), France. The series of SPMCS workshops was created in 2008 with the aim of being an interdisciplinary incubator for the worldwide exchange of innovative ideas and information about the latest results. The first workshop was held at ISMANS, Le Mans (France) in 2008, and the second at Huazhong Normal University, Wuhan (China) in 2010. At SPMCS2012, we wished to bring together a broad community of researchers from the different branches of the rapidly developing complexity science to discuss the fundamental theoretical challenges (geometry/topology, number theory, statistical physics, dynamical systems, etc) as well as experimental and applied aspects of many practical problems (condensed matter, disordered systems, financial markets, chemistry, biology, geoscience, etc). The program of SPMCS2012 was prepared in three categories: (i) physical and mathematical studies (quantum mechanics, generalized nonequilibrium thermodynamics, nonlinear dynamics, condensed matter physics, nanoscience); (ii) natural complex systems (physical, geophysical, chemical and biological); (iii) social, economical, political agent systems and man-made complex systems. The conference attracted 64 participants from 10 countries. There were 10 invited lectures, 12 invited talks and 28 regular oral talks in the morning and afternoon sessions. The book of Abstracts is available from the conference website (http://www.ksu.ru/conf/spmcs2012/?id=3). A round table was also held, the topic of which was 'Recent and Anticipated Future Progress in Science of Complexity', discussing a variety of questions and opinions important for the understanding of the concept of
Management of Uncertainty by Statistical Process Control and a Genetic Tuned Fuzzy System
Directory of Open Access Journals (Sweden)
Stephan Birle
2016-01-01
Full Text Available In the food industry, bioprocesses like fermentation are often a crucial part of the manufacturing process and decisive for final product quality. In general, they are characterized by highly nonlinear dynamics and uncertainties that make it difficult to control these processes with traditional control techniques. In this context, fuzzy logic controllers offer quite a straightforward way to control processes that are affected by nonlinear behavior and uncertain process knowledge. However, in order to maintain process safety and product quality it is necessary to specify the controller performance and to tune the controller parameters. In this work, an approach is presented to establish an intelligent control system for oxidoreductive yeast propagation as a representative process biased by the aforementioned uncertainties. The presented approach is based on statistical process control and fuzzy logic feedback control. As the cognitive uncertainty among different experts about the limits that define the control performance as still acceptable may differ a lot, a data-driven design method is performed. Based upon a historic data pool, statistical process corridors are derived for the controller inputs control error and change in control error. This approach follows the hypothesis that if the control performance criteria stay within predefined statistical boundaries, the final process state meets the required quality definition. In order to keep the process on its optimal growth trajectory (model-based reference trajectory), a fuzzy logic controller is used that adjusts the process temperature. Additionally, in order to stay within the process corridors, a genetic algorithm was applied to tune the input and output fuzzy sets of a preliminarily parameterized fuzzy controller. The presented experimental results show that the genetically tuned fuzzy controller is able to keep the process within its allowed limits. The average absolute error to the
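The corridor construction described above can be sketched numerically. The data below are invented stand-ins for the historic data pool, and the ±3σ band is an assumed choice of statistical boundary (the paper derives its corridors from expert-accepted batches):

```python
import numpy as np

# Rows = historical batches judged acceptable, columns = time points,
# values = control error e(t).  The corridor at each time point is the
# batch mean plus/minus three batch standard deviations.
rng = np.random.default_rng(3)
history = rng.normal(0.0, 0.5, size=(40, 100))     # 40 "good" batches

mu = history.mean(axis=0)
sigma = history.std(axis=0, ddof=1)
upper, lower = mu + 3 * sigma, mu - 3 * sigma

def within_corridor(batch):
    # A running batch stays "in control" while its error trajectory
    # remains inside the corridor at every time point observed so far.
    t = len(batch)
    return bool(np.all((batch >= lower[:t]) & (batch <= upper[:t])))
```

Leaving the corridor is then the trigger for corrective action by the fuzzy controller.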
Improving the Statistical Modeling of the TRMM Extreme Precipitation Monitoring System
Demirdjian, L.; Zhou, Y.; Huffman, G. J.
2016-12-01
This project improves upon an existing extreme precipitation monitoring system based on the Tropical Rainfall Measuring Mission (TRMM) daily product (3B42) using new statistical models. The proposed system utilizes a regional modeling approach, where data from similar grid locations are pooled to increase the quality and stability of the resulting model parameter estimates to compensate for the short data record. The regional frequency analysis is divided into two stages. In the first stage, the region defined by the TRMM measurements is partitioned into approximately 27,000 non-overlapping clusters using a recursive k-means clustering scheme. In the second stage, a statistical model is used to characterize the extreme precipitation events occurring in each cluster. Instead of utilizing the block-maxima approach used in the existing system, where annual maxima are fit to the Generalized Extreme Value (GEV) probability distribution at each cluster separately, the present work adopts the peak-over-threshold (POT) method of classifying points as extreme if they exceed a pre-specified threshold. Theoretical considerations motivate the use of the Generalized-Pareto (GP) distribution for fitting threshold exceedances. The fitted parameters can be used to construct simple and intuitive average recurrence interval (ARI) maps which reveal how rare a particular precipitation event is given its spatial location. The new methodology eliminates much of the random noise that was produced by the existing models due to a short data record, producing more reasonable ARI maps when compared with NOAA's long-term Climate Prediction Center (CPC) ground based observations. The resulting ARI maps can be useful for disaster preparation, warning, and management, as well as increased public awareness of the severity of precipitation events. Furthermore, the proposed methodology can be applied to various other extreme climate records.
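A minimal POT sketch of the second-stage model (synthetic data, not the TRMM 3B42 record; the threshold quantile and record length are arbitrary choices): exceedances over a high threshold are fitted with a Generalized Pareto distribution, and the fit is inverted to turn a return period into a return level for an ARI map.

```python
import numpy as np
from scipy.stats import genpareto

# Synthetic "daily rainfall" record standing in for one cluster.
rng = np.random.default_rng(0)
rain = rng.gamma(shape=0.4, scale=12.0, size=20 * 365)   # ~20 years of days

threshold = np.quantile(rain, 0.98)                      # POT threshold
excess = rain[rain > threshold] - threshold
c, loc, scale = genpareto.fit(excess, floc=0.0)          # fix location at 0

rate = excess.size / 20.0                                # exceedances per year

def return_level(years):
    # Level exceeded on average once per `years` years under the GP fit.
    return threshold + genpareto.ppf(1.0 - 1.0 / (years * rate), c, 0.0, scale)

print(round(return_level(10.0), 1))
```

In the monitoring system this fit is repeated per cluster, and observed events are placed on the resulting return-level curve.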
International Nuclear Information System (INIS)
Kwag, Shinyoung; Gupta, Abhinav
2017-01-01
Highlights: • This study presents the development of a Bayesian framework for probabilistic risk assessment (PRA) of structural systems under multiple hazards. • The concepts of Bayesian network and Bayesian inference are combined by mapping the traditionally used fault trees into a Bayesian network. • The proposed mapping allows for consideration of dependencies as well as correlations between events. • Incorporation of Bayesian inference permits a novel way to explore scenarios that are likely to result in a system level “vulnerability.” - Abstract: Conventional probabilistic risk assessment (PRA) methodologies (USNRC, 1983; IAEA, 1992; EPRI, 1994; Ellingwood, 2001) conduct risk assessment for different external hazards by considering each hazard separately and independently of the others. The risk metric for a specific hazard is evaluated by a convolution of the fragility and the hazard curves. The fragility curve for a basic event is obtained by using empirical, experimental, and/or numerical simulation data for a particular hazard. Treating each hazard independently can be inappropriate in some cases, as certain hazards are statistically correlated or dependent. Examples of such correlated events include, but are not limited to, flooding-induced fire, seismically induced internal or external flooding, or even seismically induced fire. In current practice, system-level risk and consequence sequences are typically calculated using logic trees to express the causative relationship between events. In this paper, we present the results from a study on multi-hazard risk assessment that is conducted using a Bayesian network (BN) with Bayesian inference. The framework can consider statistical dependencies among risks from multiple hazards, allows updating by considering newly available data/information at any level, and provides a novel way to explore alternative failure scenarios that may exist due to vulnerabilities.
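A toy calculation illustrates why mapping the fault tree into a BN matters (all probabilities below are invented for illustration): when two basic events share a common hazard, multiplying their marginals, as an independence assumption would, badly understates the joint failure probability.

```python
from itertools import product

# Top event: joint failure of two redundant trains, T = A AND B.
# Both basic events are driven by a common hazard H (say, an earthquake
# that can damage either train).  Exact enumeration over the three
# binary nodes gives the BN answer.
p_h = 0.01                        # P(H): hazard occurs
p_a = {True: 0.5, False: 0.001}   # P(A | H)
p_b = {True: 0.4, False: 0.002}   # P(B | H)

def prob(event):
    total = 0.0
    for h, a, b in product([True, False], repeat=3):
        p = p_h if h else 1 - p_h
        p *= p_a[h] if a else 1 - p_a[h]
        p *= p_b[h] if b else 1 - p_b[h]
        if event(h, a, b):
            total += p
    return total

p_joint = prob(lambda h, a, b: a and b)
naive = prob(lambda h, a, b: a) * prob(lambda h, a, b: b)
print(round(p_joint / naive, 1))  # the common cause inflates joint failure ~56x
```

The BN also supports the updating described above: conditioning the same enumeration on observed evidence (e.g. H = True) re-weights every scenario.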
DHLAS: A web-based information system for statistical genetic analysis of HLA population data.
Thriskos, P; Zintzaras, E; Germenis, A
2007-03-01
DHLAS (database HLA system) is a user-friendly, web-based information system for the analysis of human leukocyte antigen (HLA) data from population studies. DHLAS has been developed using Java and the R system; it runs on a Java Virtual Machine, and its web-based user interface is powered by the Tomcat servlet engine. It utilizes Struts, a Model-View-Controller framework, and uses several GNU packages to perform several of its tasks. The database engine it relies upon for fast access is MySQL, but others can be used as well. The system estimates metrics, performs statistical testing, and produces graphs required for HLA population studies: (i) Hardy-Weinberg equilibrium (calculated using both asymptotic and exact tests), (ii) genetic distances (Euclidean or Nei), (iii) phylogenetic trees using the unweighted pair group method with averages and the neighbor-joining method, (iv) linkage disequilibrium (pairwise and overall, including variance estimates), (v) haplotype frequencies (estimated using the expectation-maximization algorithm), and (vi) discriminant analysis. The main merit of DHLAS is the incorporation of a database, so the data can be stored and manipulated along with the integrated genetic data analysis procedures. In addition, it has an open architecture allowing the inclusion of other functions and procedures.
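As a sketch of the first listed computation, the asymptotic Hardy-Weinberg chi-square test for a biallelic locus fits in a few lines (the exact test and multi-allelic HLA loci need more machinery; the function name is mine, not a DHLAS API):

```python
import math

def hardy_weinberg_chi2(n_aa, n_ab, n_bb):
    # Asymptotic chi-square test for Hardy-Weinberg equilibrium at a
    # biallelic locus; returns (chi2, p_value) with 1 degree of freedom.
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)                 # allele frequency of A
    q = 1.0 - p
    expected = (n * p * p, 2 * n * p * q, n * q * q)
    chi2 = sum((obs - exp) ** 2 / exp
               for obs, exp in zip((n_aa, n_ab, n_bb), expected))
    # Survival function of chi-square with 1 df: P(X > x) = erfc(sqrt(x/2)).
    return chi2, math.erfc(math.sqrt(chi2 / 2.0))
```

Genotype counts in perfect HWE proportions give a statistic of zero; a complete heterozygote deficit is rejected decisively.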
Directory of Open Access Journals (Sweden)
VIMALA C.
2015-05-01
Full Text Available In recent years, speech technology has become a vital part of our daily lives. Various techniques have been proposed for developing Automatic Speech Recognition (ASR) systems and have achieved great success in many applications. Among them, template matching techniques like Dynamic Time Warping (DTW), statistical pattern matching techniques such as Hidden Markov Models (HMM) and Gaussian Mixture Models (GMM), and machine learning techniques such as Neural Networks (NN), Support Vector Machines (SVM), and Decision Trees (DT) are the most popular. The main objective of this paper is to design and develop a speaker-independent isolated speech recognition system for the Tamil language using the above speech recognition techniques. The background of ASR systems, the steps involved in ASR, the merits and demerits of the conventional and machine learning algorithms, and the observations made based on the experiments are presented in this paper. For the developed system, the highest word recognition accuracy was achieved with the HMM technique: 100% during training and 97.92% during testing.
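Of the listed techniques, DTW is the most compact to illustrate. The sketch below is a textbook implementation for 1-D feature sequences (not the paper's Tamil system; a real recognizer would compare MFCC vectors with a vector distance and pick the nearest template's label):

```python
import math

def dtw_distance(a, b):
    # Classic dynamic-time-warping distance: finds the cheapest monotone
    # alignment between two sequences, so utterances spoken at different
    # speeds can still match the same template.
    n, m = len(a), len(b)
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])              # local distance
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]
```

Note that a time-stretched copy of a sequence is at distance zero: the warping path absorbs the tempo difference.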
Statistical mechanics of few-particle systems: exact results for two useful models
Miranda, Enrique N.
2017-11-01
The statistical mechanics of small clusters (n ≈ 10-50 elements) of harmonic oscillators and two-level systems is studied exactly, following the microcanonical, canonical and grand canonical formalisms. For clusters with several hundred particles, the results from the three formalisms coincide with those found in the thermodynamic limit. However, for clusters formed by a few tens of elements, the three ensembles yield different results. For a cluster with a few tens of harmonic oscillators, when the heat capacity per oscillator is evaluated within the canonical formalism, it reaches a limit value equal to k_B, as in the thermodynamic case, while within the microcanonical formalism the limit value is k_B(1 - 1/n). This difference could be measured experimentally. For a cluster with a few tens of two-level systems, the heat capacity evaluated within the canonical and microcanonical ensembles also presents differences that could be detected experimentally. Both the microcanonical and grand canonical formalisms show that the entropy is non-additive for systems this small, while the canonical ensemble reaches the opposite conclusion. These results suggest that the microcanonical ensemble is the most appropriate for dealing with systems with tens of particles.
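The microcanonical limit value quoted above follows from a short classical calculation for n harmonic oscillators (a standard derivation consistent with the abstract, not reproduced from the paper). The phase-space volume, entropy, and temperature give:

```latex
\Phi(E) \propto E^{n}, \qquad
\Omega(E) = \frac{d\Phi}{dE} \propto E^{\,n-1}, \qquad
S(E) = k_B \ln \Omega(E) = (n-1)\,k_B \ln E + \text{const},
\\[6pt]
\frac{1}{T} = \frac{\partial S}{\partial E} = \frac{(n-1)\,k_B}{E}
\quad\Longrightarrow\quad E = (n-1)\,k_B T
\quad\Longrightarrow\quad \frac{C}{n} = k_B\!\left(1 - \frac{1}{n}\right).
```

The canonical ensemble instead gives E = n k_B T and hence C/n = k_B; the two results agree only as n → ∞, which is the measurable discrepancy the abstract points to.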
Mansouri, Majdi; Nounou, Mohamed N; Nounou, Hazem N
2017-09-01
In our previous work, we demonstrated the effectiveness of the linear multiscale principal component analysis (PCA)-based moving window (MW)-generalized likelihood ratio test (GLRT) technique over the classical PCA and multiscale principal component analysis (MSPCA)-based GLRT methods. The developed fault detection algorithm provided optimal properties by maximizing the detection probability for a particular false alarm rate (FAR) with different window sizes. However, most real systems are nonlinear, and the linear PCA method cannot handle this nonlinearity to a great extent. Thus, in this paper, we first apply a nonlinear PCA to obtain an accurate principal component of a set of data and handle a wide range of nonlinearities using the kernel principal component analysis (KPCA) model. KPCA is among the most popular nonlinear statistical methods. Second, we extend the MW-GLRT technique to one that applies exponential weights to the residuals in the moving window (instead of equal weighting), as this might further improve fault detection performance by reducing the FAR through an exponentially weighted moving average (EWMA). The developed detection method, called EWMA-GLRT, provides improved properties, such as smaller missed detection rates, smaller FARs and a smaller average run length. The idea behind the developed EWMA-GLRT is to compute a new GLRT statistic that integrates current and previous data information in a decreasing exponential fashion, giving more weight to the more recent data. This provides a more accurate estimation of the GLRT statistic and a stronger memory that enables better decision making with respect to fault detection. Therefore, in this paper, a KPCA-based EWMA-GLRT method is developed and applied in practice to improve fault detection in biological phenomena modeled by S-systems and to enhance monitoring of the process mean. The idea behind a KPCA-based EWMA-GLRT fault detection algorithm is to
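The exponential-weighting idea can be sketched with a univariate EWMA chart on synthetic residuals. This is a simplified stand-in for the paper's KPCA-based EWMA-GLRT, not the authors' algorithm; the noise model, shift size and control limit below are all illustrative:

```python
import numpy as np

def ewma_chart(residuals, lam=0.2, L=4.0):
    """Flag samples whose EWMA statistic leaves the +/- L*sigma_z band."""
    sigma = np.std(residuals[:200])             # noise level from fault-free data
    sigma_z = sigma * np.sqrt(lam / (2 - lam))  # asymptotic EWMA standard deviation
    z, alarms = 0.0, []
    for r in residuals:
        z = lam * r + (1 - lam) * z             # exponentially decaying memory
        alarms.append(abs(z) > L * sigma_z)
    return np.array(alarms)

rng = np.random.default_rng(0)
r = rng.normal(0, 1, 400)
r[300:] += 2.0                                  # small mean shift (a "fault")
alarms = ewma_chart(r)
assert not alarms[:250].any()                   # no false alarms before the fault
assert alarms[320:].mean() > 0.5                # the shift is caught quickly
```

The smoothing parameter lam trades memory length against responsiveness: small lam averages over many past residuals and catches small persistent shifts; lam = 1 recovers a memoryless Shewhart-type chart.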
Statistical distribution of the local purity in a large quantum system
International Nuclear Information System (INIS)
De Pasquale, A; Pascazio, S; Facchi, P; Giovannetti, V; Parisi, G; Scardicchio, A
2012-01-01
The local purity of large many-body quantum systems can be studied by following a statistical mechanical approach based on a random matrix model. Restricting the analysis to the case of global pure states, this method proved to be successful, and a full characterization of the statistical properties of the local purity was obtained by computing the partition function of the problem. Here we generalize these techniques to the case of global mixed states. In this context, by uniformly sampling the phase space of states with assigned global mixedness, we determine the exact expression of the first two moments of the local purity and a general expression for the moments of higher order. This generalizes previous results obtained for globally pure configurations. Furthermore, through the introduction of a partition function for a suitable canonical ensemble, we compute the approximate expression of the first moment of the marginal purity in the high-temperature regime. In the process, we establish a formal connection with the theory of quantum twirling maps that provides an alternative, possibly fruitful, way of performing the calculation. (paper)
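The globally pure baseline that this work generalizes can be checked numerically: for Haar-random pure states on a d_A x d_B bipartite system, the known first moment of the local purity is (d_A + d_B)/(d_A·d_B + 1). A small Monte Carlo sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
dA = dB = 2
samples = []
for _ in range(4000):
    # Haar-random global pure state: normalized complex Gaussian vector
    psi = rng.normal(size=dA * dB) + 1j * rng.normal(size=dA * dB)
    psi /= np.linalg.norm(psi)
    M = psi.reshape(dA, dB)
    rhoA = M @ M.conj().T                       # reduced state (partial trace over B)
    samples.append(np.trace(rhoA @ rhoA).real)  # local purity Tr(rhoA^2)

mean_purity = np.mean(samples)
# Known first moment for globally pure states: (dA + dB) / (dA*dB + 1) = 0.8 here
assert abs(mean_purity - 0.8) < 0.02
```

For globally mixed states the abstract's exact moments replace this pure-state value; the sampling scheme would then draw uniformly at fixed global mixedness instead of from the Haar measure on pure states.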
Marimon, Maria Paula C; Roisenberg, Ari; Suhogusoff, Alexandra V; Viero, Antonio Pedro
2013-06-01
High fluoride concentrations (up to 11 mg/L) have been reported in the groundwater of the Guarani Aquifer System (Santa Maria Formation) in the central region of the state of Rio Grande do Sul, Southern Brazil. In this area, dental fluorosis is an endemic disease. This paper presents the geochemical data and combines statistical analysis (principal component and cluster analyses) with geochemical modeling to characterize the hydrogeochemistry of the groundwater, and discusses the possible origin of the fluoride. The groundwater from the Santa Maria Formation comprises four different geochemical groups. The first group corresponds to a sodium chloride groundwater which evolves into the second, a sodium bicarbonate water; both contain fluoride anomalies. The third group is represented by calcium bicarbonate groundwater, and in the fourth, magnesium is the distinctive parameter. The statistical and geochemical analyses, supported by isotopic measurements, indicated that the groundwater may have originated from mixtures with deeper aquifers, and that the fluoride concentrations could be derived from rock/water interactions (e.g., desorption from clay minerals).
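The PCA-plus-clustering workflow can be sketched on a toy hydrochemical table. All concentrations below are invented for illustration, not data from the Santa Maria Formation:

```python
import numpy as np

# Hypothetical ion concentrations (mg/L) for six wells; columns are
# Na+, Cl-, HCO3-, Ca2+, Mg2+, F- (values invented for illustration only)
X = np.array([
    [210., 250.,  80., 15.,  5., 6.0],   # Na-Cl water, high F
    [190., 230.,  95., 18.,  6., 5.1],
    [180.,  60., 320., 20.,  7., 4.2],   # Na-HCO3 water, high F
    [170.,  55., 300., 22.,  8., 3.8],
    [ 40.,  30., 210., 90., 12., 0.3],   # Ca-HCO3 water, low F
    [ 35.,  25., 190., 85., 10., 0.2],
])

Z = (X - X.mean(0)) / X.std(0)           # standardize before PCA
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)          # variance explained per component
scores = Z @ Vt.T                        # principal-component scores per well

# A few components capture most of the variance in this toy table, so the
# water types separate cleanly in the reduced score space
assert explained[:2].sum() > 0.8
```

In the real study the component loadings (rows of Vt) would be inspected to see which ions drive each group, and a cluster analysis on the scores would formalize the grouping.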
DEFF Research Database (Denmark)
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work...... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit...... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved......
Laurinaviciene, Aida; Plancoulaine, Benoit; Baltrusaityte, Indra; Meskauskas, Raimundas; Besusparis, Justinas; Lesciute-Krilaviciene, Daiva; Raudeliunas, Darius; Iqbal, Yasir; Herlin, Paulette; Laurinavicius, Arvydas
2014-01-01
Digital immunohistochemistry (IHC) is one of the most promising applications brought by new generation image analysis (IA). While conventional IHC staining quality is monitored by semi-quantitative visual evaluation of tissue controls, IA may require more sensitive measurement. We designed an automated system to digitally monitor IHC multi-tissue controls, based on SQL-level integration of laboratory information system with image and statistical analysis tools. Consecutive sections of TMA containing 10 cores of breast cancer tissue were used as tissue controls in routine Ki67 IHC testing. Ventana slide label barcode ID was sent to the LIS to register the serial section sequence. The slides were stained and scanned (Aperio ScanScope XT), IA was performed by the Aperio/Leica Colocalization and Genie Classifier/Nuclear algorithms. SQL-based integration ensured automated statistical analysis of the IA data by the SAS Enterprise Guide project. Factor analysis and plot visualizations were performed to explore slide-to-slide variation of the Ki67 IHC staining results in the control tissue. Slide-to-slide intra-core IHC staining analysis revealed rather significant variation of the variables reflecting the sample size, while Brown and Blue Intensity were relatively stable. To further investigate this variation, the IA results from the 10 cores were aggregated to minimize tissue-related variance. Factor analysis revealed association between the variables reflecting the sample size detected by IA and Blue Intensity. Since the main feature to be extracted from the tissue controls was staining intensity, we further explored the variation of the intensity variables in the individual cores. MeanBrownBlue Intensity ((Brown+Blue)/2) and DiffBrownBlue Intensity (Brown-Blue) were introduced to better contrast the absolute intensity and the colour balance variation in each core; relevant factor scores were extracted. Finally, tissue-related factors of IHC staining variance were
Statistical properties of spectra in harmonically trapped spin-orbit coupled systems
DEFF Research Database (Denmark)
V. Marchukov, O.; G. Volosniev, A.; V. Fedorov, D.
2014-01-01
We compute single-particle energy spectra for a one-body Hamiltonian consisting of a two-dimensional deformed harmonic oscillator potential, the Rashba spin-orbit coupling and the Zeeman term. To investigate the statistical properties of the obtained spectra as functions of deformation, spin......-orbit and Zeeman strengths we examine the distributions of the nearest neighbor spacings. We find that the shapes of these distributions depend strongly on the three potential parameters. We show that the obtained shapes in some cases can be well approximated with the standard Poisson, Brody and Wigner...... distributions. The Brody and Wigner distributions characterize irregular motion and help identify quantum chaotic systems. We present special choices of deformation and spin-orbit strengths, without the Zeeman term, which provide a fair reproduction of the fourth-power repelling Wigner distribution. By adding......
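The three reference distributions for nearest-neighbor spacings can be written down and checked directly; the Brody form interpolates between the Poisson (regular) and Wigner (chaotic) limits. A sketch, assuming the standard normalizations with unit mean spacing:

```python
import numpy as np
from math import gamma

s = np.linspace(0, 10, 4001)
ds = s[1] - s[0]

poisson = np.exp(-s)                                  # uncorrelated (regular) spectra
wigner = (np.pi / 2) * s * np.exp(-np.pi * s**2 / 4)  # GOE Wigner surmise (chaotic)

def brody(s, q):
    """Brody distribution: q=0 gives Poisson, q=1 gives the Wigner surmise."""
    b = gamma((q + 2) / (q + 1)) ** (q + 1)
    return (q + 1) * b * s**q * np.exp(-b * s**(q + 1))

# All three are normalized with unit mean spacing (an unfolded spectrum)
for p in (poisson, wigner, brody(s, 0.5)):
    assert abs((p * ds).sum() - 1) < 1e-2     # unit normalization
    assert abs((s * p * ds).sum() - 1) < 1e-2 # unit mean spacing

# Level repulsion: Wigner vanishes at s = 0, while Poisson peaks there
assert wigner[0] == 0.0 and poisson[0] == 1.0
```

Fitting the Brody parameter q to an empirical spacing histogram is a common way to place a spectrum on the regular-to-chaotic scale.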
Directory of Open Access Journals (Sweden)
Frank Pega
2013-01-01
Objective. Effectively addressing health disparities experienced by sexual minority populations requires high-quality official data on sexual orientation. We developed a conceptual framework of sexual orientation to improve the quality of sexual orientation data in New Zealand’s Official Statistics System. Methods. We reviewed conceptual and methodological literature, culminating in a draft framework. To improve the framework, we held focus groups and key-informant interviews with sexual minority stakeholders and producers and consumers of official statistics. An advisory board of experts provided additional guidance. Results. The framework proposes working definitions of the sexual orientation topic and measurement concepts, describes dimensions of the measurement concepts, discusses variables framing the measurement concepts, and outlines conceptual grey areas. Conclusion. The framework proposes standard definitions and concepts for the collection of official sexual orientation data in New Zealand. It presents a model for producers of official statistics in other countries, who wish to improve the quality of health data on their citizens.
Directory of Open Access Journals (Sweden)
Fiser Ondrej
2011-01-01
Long-term monthly and annual statistics of the attenuation of electromagnetic waves that have been obtained from 6 years of measurements on a free space optical path, 853 meters long, with a wavelength of 850 nm and on a precisely parallel radio path with a frequency of 58 GHz are presented. All the attenuation events observed are systematically classified according to the hydrometeor type causing the particular event. Monthly and yearly propagation statistics on the free space optical path and radio path are obtained. The influence of individual hydrometeors on attenuation is analysed. The obtained propagation statistics are compared to the calculated statistics using ITU-R models. The calculated attenuation statistics both at 850 nm and 58 GHz underestimate the measured statistics for higher attenuation levels. The availability performance of a simulated hybrid FSO/RF system is analysed based on the measured data.
Dispersion relations in three-particle systems
International Nuclear Information System (INIS)
Grach, I.L.; Harodetskij, I.M.; Shmatikov, M.Zh.
1979-01-01
Positions of all dynamical singularities of the triangular nonrelativistic diagram are calculated including the form factors. The jumps of the amplitude are written in an analytical form. The dispersion method predictions for bound states in the three-particle system are compared with the results of the Amado exactly solvable model. It is shown that the one-channel N/D method is equivalent to the pole approximation in the Amado model, and that the three-particle s channel unitarity should be taken into account when calculating (in the dispersion method) the ground and excited states of the three-particle system. The relation of the three-particle unitary contribution to the Thomas theorem and the Efimov effect is briefly discussed
Harrou, Fouzi
2017-09-18
This study reports the development of an innovative fault detection and diagnosis scheme to monitor the direct current (DC) side of photovoltaic (PV) systems. Towards this end, we propose a statistical approach that exploits the advantages of the one-diode model and those of univariate and multivariate exponentially weighted moving average (EWMA) charts to better detect faults. Specifically, we generate the array's residuals of current, voltage and power using measured temperature and irradiance. These residuals capture the difference between the measurements and the one-diode model's maximum power point (MPP) predictions of current, voltage and power, and are used as fault indicators. Then, we apply the multivariate EWMA (MEWMA) monitoring chart to the residuals to detect faults. However, a MEWMA scheme cannot identify the type of fault. Once a fault is detected in the MEWMA chart, univariate EWMA charts based on the current and voltage indicators are used to identify the type of fault (e.g., short-circuit, open-circuit and shading faults). We applied this strategy to real data from the grid-connected PV system installed at the Renewable Energy Development Center, Algeria. Results show the capacity of the proposed strategy to monitor the DC side of PV systems and detect partial shading.
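The MEWMA detection stage can be sketched on synthetic residuals. This is an illustrative stand-in, not the authors' implementation; the fault injection, covariance estimate and control limit below are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical residuals r = measurement - one-diode-model prediction for
# (current, voltage, power); fault-free noise first, then a shading-like shift
R = rng.normal(0, 1, size=(500, 3))
R[400:, 0] += 3.0                    # current residual shifts
R[400:, 2] += 3.0                    # power residual shifts

lam = 0.25
Sigma = np.cov(R[:300].T)            # noise covariance from fault-free data
Sz_inv = np.linalg.inv((lam / (2 - lam)) * Sigma)   # asymptotic MEWMA covariance

z = np.zeros(3)
T2 = np.empty(len(R))
for t, r in enumerate(R):
    z = lam * r + (1 - lam) * z      # exponentially weighted residual vector
    T2[t] = z @ Sz_inv @ z           # Hotelling-type MEWMA statistic

limit = 25.0                         # illustrative control limit
assert T2[:350].max() < limit        # in control before the fault
assert T2[420:].min() > limit        # clearly out of control after it
```

Once the joint statistic T2 signals, inspecting the individual (univariate) residual charts indicates which of current, voltage or power moved, which is the diagnosis step the abstract describes.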
International Nuclear Information System (INIS)
Wang Yuming; Cao Hao; Chen Junhong; Zhang Tengfei; Yu Sijie; Zheng Huinan; Shen Chenglong; Wang, S.; Zhang Jie
2010-01-01
In this paper, we present an automated system, which has the capability to catch and track solar limb prominences based on observations from the extreme-ultraviolet (EUV) 304 A passband. The characteristic parameters and their evolution, including height, position angle, area, length, and brightness, are obtained without manual interventions. By applying the system to the STEREO-B/SECCHI/EUVI 304 A data during 2007 April-2009 October, we obtain a total of 9477 well-tracked prominences and a catalog of these events available online. A detailed analysis of these prominences suggests that the system has a rather good performance. We have obtained several interesting statistical results based on the catalog. Most prominences appear below a latitude of 60° and at a height of about 26 Mm above the solar surface. Most of them are quite stable during the period they are tracked. Nevertheless, some prominences have an upward speed of more than 100 km s⁻¹, and some others show significant downward and/or azimuthal speeds. There are strong correlations among the brightness, area, and height. The expansion of a prominence is probably one major cause of its fading during the rising or erupting process.
Statistical properties of highly excited quantum eigenstates of a strongly chaotic system
International Nuclear Information System (INIS)
Aurich, R.; Steiner, F.
1992-06-01
Statistical properties of highly excited quantal eigenstates are studied for the free motion (geodesic flow) on a compact surface of constant negative curvature (hyperbolic octagon) which represents a strongly chaotic system (K-system). The eigenstates are expanded in a circular-wave basis, and it turns out that the expansion coefficients behave as Gaussian pseudo-random numbers. It is shown that this property leads to a Gaussian amplitude distribution P(ψ) in the semiclassical limit, i.e. the wavefunctions behave as Gaussian random functions. This behaviour, which should hold for chaotic systems in general, is nicely confirmed for eigenstates lying 10000 states above the ground state thus probing the semiclassical limit. In addition, the autocorrelation function and the path-correlation function are calculated and compared with a crude semiclassical Bessel-function approximation. Agreement with the semiclassical prediction is only found, if a local averaging is performed over roughly 1000 de Broglie wavelengths. On smaller scales, the eigenstates show much more structure than predicted by the first semiclassical approximation. (orig.)
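The Gaussian-random-function behavior (Berry's random-wave picture) can be probed numerically by superposing many plane waves of fixed wavenumber with random directions and phases and testing the amplitude distribution for Gaussianity; the grid and wave counts below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
M = 400                                        # number of superposed plane waves
k = 50.0                                       # fixed wavenumber (one energy shell)
theta = rng.uniform(0, 2 * np.pi, M)           # random propagation directions
phase = rng.uniform(0, 2 * np.pi, M)

# Sample the model wavefunction on a region much larger than the wavelength
x, y = np.meshgrid(np.linspace(0, 10, 200), np.linspace(0, 10, 200))
psi = np.zeros_like(x)
for t, p in zip(theta, phase):
    psi += np.cos(k * (x * np.cos(t) + y * np.sin(t)) + p)
psi /= np.sqrt(M / 2)                          # unit-variance normalization

vals = psi.ravel() - psi.mean()
skew = np.mean(vals**3) / np.std(vals)**3
kurt = np.mean(vals**4) / np.std(vals)**4
assert abs(skew) < 0.1 and abs(kurt - 3) < 0.2  # Gaussian amplitude statistics
```

The central limit theorem drives the pointwise amplitudes toward a Gaussian as M grows, mirroring the Gaussian amplitude distribution P(ψ) reported for the hyperbolic-octagon eigenstates in the semiclassical limit.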
International Nuclear Information System (INIS)
Lin, C.W.; Li, D.L.
1987-01-01
A statistical study is conducted to determine the effect of input time history duration on the response of systems supported by the structure. The model used in the study is a one-degree-of-freedom system mass supported by a one-degree-of-freedom structure mass. The input used is generated from a Monte-Carlo simulation procedure with a prescribed power spectral density such that the input response spectrum matched the Reg. Guide 1.60 response spectrum. The models were analyzed for different combinations of mass ratios and frequency ratios (ratios of the system versus the supporting structure). Time history inputs used vary from 5 to 20 seconds. Only the 20-second time history matched the Reg. Guide 1.60 response spectrum; time history inputs shorter than 20 seconds were simply truncated at the tail end. The results of the study indicate that it is necessary to increase the response magnitude by about 20% if a 5-second time history is to be used. For a 10-second input, an increase of 10% will suffice, whereas for a 15-second input no adjustment is necessary. (orig./HP)
Statistical analysis and dimensioning of a wind farm energy storage system
Directory of Open Access Journals (Sweden)
Waśkowicz Bartosz
2017-06-01
The growth in renewable power generation and more strict local regulations regarding power quality indices will make it necessary to use energy storage systems with renewable power plants in the near future. The capacity of storage systems can be determined using different methods, most of which can be divided into either deterministic or stochastic. Deterministic methods are often complicated, with numerous parameters and complex models for long-term prediction, often incorporating meteorological data. Stochastic methods use statistics for ESS (Energy Storage System) sizing, which is somewhat intuitive for dealing with the random element of wind speed variation. The method proposed in this paper stabilizes output power over one-minute intervals to reduce the negative influence of the wind farm on the power grid and so meet local regulations. This paper shows the process of sizing the ESS for two selected wind farms, based on their levels of variation in generated power, and also, for each, how the negative influences on the power grid, in the form of voltage variation and the short-term flicker factor, are decreased.
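The stochastic sizing idea, stabilizing to one-minute averages and sizing the store from the statistics of the resulting energy swings, can be sketched as follows. The power trace is synthetic; a real study would use measured wind-farm output:

```python
import numpy as np

rng = np.random.default_rng(4)
dt = 1.0                                       # sample interval, seconds
# Hypothetical 1-second wind-farm power trace, MW, over one hour
P = 10 + np.cumsum(rng.normal(0, 0.05, 3600))

# Target output: the one-minute block average; the ESS covers the difference
target = np.repeat(P.reshape(-1, 60).mean(axis=1), 60)
ess_energy = np.cumsum((target - P) * dt / 3600)   # state of charge, MWh

# Stochastic sizing: cover e.g. 99% of observed energy excursions rather than
# the absolute worst case, which shrinks the required capacity
swing = ess_energy.max() - ess_energy.min()
p99 = np.percentile(ess_energy, 99.5) - np.percentile(ess_energy, 0.5)
assert 0 < p99 <= swing
```

Choosing a percentile instead of the full range is the statistical step: the store is sized for almost all observed variation, accepting rare clipping events.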
Energy Technology Data Exchange (ETDEWEB)
Rodriguez Garcia, Alfredo; De la Torre Vega, Eli [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)
2009-07-01
The integration of the first large-scale wind farm (La Venta II) into the National Interconnected System requires taking into account the random and intermittent nature of wind energy. An important tool for this task is a system for short-term wind energy forecasting. For this reason, the Instituto de Investigaciones Electricas (IIE) developed a statistical model to produce this forecast. The prediction is made through an adaptive linear combination of alternative competing models, where the weights given to each model are based on its most recent forecast quality. The application results of the forecasting system are also presented and analyzed.
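The adaptive linear combination can be sketched with inverse-MSE weights over a recent window. This is one plausible reading of the weighting scheme, not the IIE model itself; the two competing "models" below are synthetic:

```python
import numpy as np

def combine_forecasts(preds, actual, window=24):
    """Adaptive linear combination: weight each competing model by the inverse
    of its mean squared error over the most recent `window` steps."""
    T, M = preds.shape
    combined = np.empty(T)
    w = np.full(M, 1.0 / M)                    # equal weights until history exists
    for t in range(T):
        combined[t] = preds[t] @ w
        if t >= window:
            mse = ((preds[t - window:t] - actual[t - window:t, None]) ** 2).mean(0)
            w = (1 / mse) / (1 / mse).sum()    # recent accuracy sets the weights
    return combined

rng = np.random.default_rng(5)
actual = np.sin(np.linspace(0, 20, 300)) * 5 + 10       # toy wind-power series
good = actual + rng.normal(0, 0.3, 300)                 # accurate competing model
bad = actual + rng.normal(0, 2.0, 300) + 1.0            # noisy, biased model
combo = combine_forecasts(np.column_stack([good, bad]), actual)

err = lambda f: np.mean((f[50:] - actual[50:]) ** 2)
assert err(combo) < err(bad)                  # combination beats the weak model
assert err(combo) < 2 * err(good)             # and tracks the strong one closely
```

Because the weights are recomputed from recent errors, the combination automatically shifts toward whichever model has been performing best lately.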
On Designing Multicore-Aware Simulators for Systems Biology Endowed with OnLine Statistics
Directory of Open Access Journals (Sweden)
Marco Aldinucci
2014-01-01
The paper discusses enabling methodologies for the design of a fully parallel, online, interactive tool aiming to support bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are shown on a simulation tool that performs the modeling, tuning and sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories, which turn into big data that should be analysed by statistical and data-mining tools. In the considered approach the two stages are pipelined in such a way that the simulation stage streams out the partial results of all simulation trajectories to the analysis stage, which immediately produces a partial result. The simulation-analysis workflow is validated for performance and for effectiveness of the online analysis in capturing biological systems behavior, on a multicore platform and on representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming, which provide key features to software designers such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems exhibiting multistable and oscillatory behavior are used as a testbed.
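The pipelined simulate-then-analyse pattern can be sketched sequentially in a few lines (FastFlow would run the two stages in parallel); the analysis stage uses Welford's online algorithm, so each streamed trajectory updates the statistics immediately. The toy random-walk model is invented for illustration:

```python
import random

def simulate(n_traj, n_steps, seed=0):
    """Simulation stage: streams each trajectory's endpoint out as soon as it is done."""
    rng = random.Random(seed)
    for _ in range(n_traj):
        x = 1.0
        for _ in range(n_steps):
            x += rng.gauss(0, 0.1)
        yield x

class OnlineStats:
    """Analysis stage: Welford's online mean/variance, updated per partial result."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
    def push(self, x):
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += d * (x - self.mean)
    @property
    def variance(self):
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

# The two stages are pipelined: the analysis consumes results while the
# simulation is still producing them, so no trajectory set is ever stored whole
stats = OnlineStats()
for endpoint in simulate(n_traj=2000, n_steps=100):
    stats.push(endpoint)

assert abs(stats.mean - 1.0) < 0.1     # endpoints fluctuate around the start value
assert abs(stats.variance - 1.0) < 0.2 # Var = n_steps * 0.1**2 = 1.0
```

The key property is constant memory in the analysis stage: thousands of trajectories stream through without ever being materialized as a dataset, which is what makes the interactive, online use case feasible.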
On designing multicore-aware simulators for systems biology endowed with OnLine statistics.
Aldinucci, Marco; Calcagno, Cristina; Coppo, Mario; Damiani, Ferruccio; Drocco, Maurizio; Sciacca, Eva; Spinella, Salvatore; Torquati, Massimo; Troina, Angelo
2014-01-01
Luna, Andrew L.
1998-01-01
The purpose of this study was to determine trends and difficulties concerning student incident reports within the residence halls as they relate to the incident reporting system from the Department of Housing and Residential Life at a Southeastern Doctoral I Granting Institution. This study used the frequency distributions of each classified…
Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun
2018-01-01
The objective was to implement an online statistical analysis function in the information system for air pollution and health impact monitoring, and to obtain data analysis results in real time. Descriptive statistics, time-series analysis and multivariate regression analysis were implemented online on top of the database software using SQL and visual tools. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts of each data part online, with interactive connection to the database; and generates sheets that can be exported directly to R, SAS and SPSS. The information system for air pollution and health impact monitoring thus implements the statistical analysis function online and can provide real-time analysis results to its users.
Directory of Open Access Journals (Sweden)
Haruhiko Madarame
2018-02-01
AIMS To advance knowledge of the long-term development of basketball players, this study investigated age and sex differences in the game-related statistics which discriminate winners from losers in World Basketball Championships held after the 2010 rule change. METHODS A total of 935 games from six categories (under-17, under-19 and open age, for both men and women) were analyzed. All games were classified into three types (balanced, unbalanced and very unbalanced) according to point differential by a k-means cluster analysis. A discriminant analysis was performed to identify the game-related statistics which discriminate winners from losers in each game type. An absolute value of a structural coefficient (SC) equal to or above 0.30 was considered relevant for the discrimination. RESULTS In balanced games, assists discriminated winners from losers in open games (men, |SC| = 0.32; women, |SC| = 0.34), whereas successful free throws did so in under-17 games (men, |SC| = 0.30; women, |SC| = 0.31). Successful 2-point field goals discriminated winners from losers only in women’s games (under-19, |SC| = 0.38; open, |SC| = 0.36). CONCLUSION There were three novel findings in balanced games: (1) successful free throws, but not assists, discriminated winners from losers in under-17 games; (2) successful 2-point field goals discriminated winners from losers in women’s games but not in men’s games; and (3) the discriminating power of successful 3-point field goals was extremely small in women’s games. These results may be related to the new rules for the shot clock and the 3-point distance.
Measuring Relative Coupling Strength in Circadian Systems.
Schmal, Christoph; Herzog, Erik D; Herzel, Hanspeter
2018-02-01
Modern imaging techniques allow the monitoring of circadian rhythms of single cells. Coupling between these single cellular circadian oscillators can generate coherent periodic signals on the tissue level that subsequently orchestrate physiological outputs. The strength of coupling in such systems of oscillators is often unclear. In particular, effects on coupling strength by varying cell densities, by knockouts, and by inhibitor applications are debated. In this study, we suggest to quantify the relative coupling strength via analyzing period, phase, and amplitude distributions in ensembles of individual circadian oscillators. Simulations of different oscillator networks show that period and phase distributions become narrower with increasing coupling strength. Moreover, amplitudes can increase due to resonance effects. Variances of periods and phases decay monotonically with coupling strength, and can serve therefore as measures of relative coupling strength. Our theoretical predictions are confirmed by studying recently published experimental data from PERIOD2 expression in slices of the suprachiasmatic nucleus during and after the application of tetrodotoxin (TTX). On analyzing the corresponding period, phase, and amplitude distributions, we can show that treatment with TTX can be associated with a reduced coupling strength in the system of coupled oscillators. Analysis of an oscillator network derived directly from the data confirms our conclusions. We suggest that our approach is also applicable to quantify coupling in fibroblast cultures and hepatocyte networks, and for social synchronization of circadian rhythmicity in rodents, flies, and bees.
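The central claim, that phase distributions narrow as coupling grows, can be reproduced with a toy mean-field Kuramoto model; the parameters below are illustrative, not fitted to SCN data:

```python
import numpy as np

def phase_spread(K, n=50, steps=4000, dt=0.01, seed=6):
    """Circular variance of final phases in a mean-field Kuramoto model."""
    rng = np.random.default_rng(seed)
    omega = rng.normal(0, 0.5, n)              # heterogeneous natural frequencies
    theta = rng.uniform(0, 2 * np.pi, n)
    for _ in range(steps):
        mf = np.exp(1j * theta).mean()         # complex order parameter
        # each oscillator is pulled toward the ensemble mean phase
        theta += dt * (omega + K * np.abs(mf) * np.sin(np.angle(mf) - theta))
    return 1 - np.abs(np.exp(1j * theta).mean())   # 0 = perfect synchrony

weak, strong = phase_spread(K=0.2), phase_spread(K=3.0)
assert strong < weak          # stronger coupling -> narrower phase distribution
```

Reading the logic backwards is the paper's proposal: an observed narrowing of the phase (or period) distribution across conditions, e.g. before versus during TTX, is evidence of a change in relative coupling strength.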
Maximum entropy approach to H-theory: Statistical mechanics of hierarchical systems.
Vasconcelos, Giovani L; Salazar, Domingos S P; Macêdo, A M S
2018-02-01
A formalism, called H-theory, is applied to the problem of statistical equilibrium of a hierarchical complex system with multiple time and length scales. In this approach, the system is formally treated as being composed of a small subsystem-representing the region where the measurements are made-in contact with a set of "nested heat reservoirs" corresponding to the hierarchical structure of the system, where the temperatures of the reservoirs are allowed to fluctuate owing to the complex interactions between degrees of freedom at different scales. The probability distribution function (pdf) of the temperature of the reservoir at a given scale, conditioned on the temperature of the reservoir at the next largest scale in the hierarchy, is determined from a maximum entropy principle subject to appropriate constraints that describe the thermal equilibrium properties of the system. The marginal temperature distribution of the innermost reservoir is obtained by integrating over the conditional distributions of all larger scales, and the resulting pdf is written in analytical form in terms of certain special transcendental functions, known as the Fox H functions. The distribution of states of the small subsystem is then computed by averaging the quasiequilibrium Boltzmann distribution over the temperature of the innermost reservoir. This distribution can also be written in terms of H functions. The general family of distributions reported here recovers, as particular cases, the stationary distributions recently obtained by Macêdo et al. [Phys. Rev. E 95, 032315 (2017)10.1103/PhysRevE.95.032315] from a stochastic dynamical approach to the problem.
Directory of Open Access Journals (Sweden)
Gaião, C.
2012-06-01
This paper presents an expert system to support the inspection and diagnosis of partition walls or wall coverings mounted using the Drywall (DW) construction method. This system includes a classification of anomalies in DW and their probable causes.
This inspection system was used in a field work that included the observation of 121 DWs. This paper includes a statistical analysis of the anomalies observed during these inspections and their probable causes. The correlation between anomalies and causes in the sample is also thoroughly analyzed. Anomalies are also evaluated for area affected, size, repair urgency and aesthetic value of the affected area.
The conclusions of the statistical analysis allowed the creation of an inventory of preventive measures to be implemented in the design, execution and use phases in order to lessen the magnitude or eradicate the occurrence of anomalies in DW. These measures could directly help improve the quality of construction.
International Nuclear Information System (INIS)
Abdelgadir, O. M.
2002-09-01
In this study, 587 Sudanese women referred to gynaecological clinics as infertility cases were investigated. Hormonal assays of prolactin (PRL), follicle stimulating hormone (FSH) and luteinizing hormone (LH) were performed at the Sudan Atomic Energy Commission (SAEC) radioimmunoassay (RIA) laboratory using the RIA method. The objective of this study was to examine the relation between age and hyperprolactinaemia and polycystic ovary syndrome (PCOS). Statistical analysis was carried out with the SPSS computer program. Of the 587 patients, 39.2% had a high prolactin level (hyperprolactinaemia, >370 mU/l), and 10% of these were between 25 and 30 years old. Patients between 30 and 35 years old most frequently presented high FSH levels (>8 mU/l); 29.1% of the patients were found to have a high LH/FSH ratio, a clear indication of polycystic ovary syndrome (PCOS). (Author)
The product composition control system at Savannah River: Statistical process control algorithm
International Nuclear Information System (INIS)
Brown, K.G.
1994-01-01
The Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) will be used to immobilize the approximately 130 million liters of high-level nuclear waste currently stored at the site in 51 carbon steel tanks. Waste handling operations separate this waste into highly radioactive insoluble sludge and precipitate and less radioactive water soluble salts. In DWPF, precipitate (PHA) is blended with insoluble sludge and ground glass frit to produce melter feed slurry which is continuously fed to the DWPF melter. The melter produces a molten borosilicate glass which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in a geologic repository. Described here is the Product Composition Control System (PCCS) process control algorithm. The PCCS is the amalgam of computer hardware and software intended to ensure that the melt will be processable and that the glass wasteform produced will be acceptable. Within PCCS, the Statistical Process Control (SPC) Algorithm is the means by which control of the DWPF process is guided. The SPC Algorithm is necessary to control the multivariate DWPF process in the face of uncertainties arising from the process, its feeds, sampling, modeling, and measurement systems. This article describes the functions performed by the SPC Algorithm: characterization of DWPF prior to making product, accounting for prediction uncertainty, accounting for measurement uncertainty, monitoring an SME batch, incorporating process information, and the advantages of the algorithm. 9 refs., 6 figs
Farrington, C. Paddy; Noufaily, Angela; Andrews, Nick J.; Charlett, Andre
2016-01-01
A large-scale multiple surveillance system for infectious disease outbreaks has been in operation in England and Wales since the early 1990s. Changes to the statistical algorithm at the heart of the system were proposed and the purpose of this paper is to compare two new algorithms with the original algorithm. Test data to evaluate performance are created from weekly counts of the number of cases of each of more than 2000 diseases over a twenty-year period. The time series of each disease is separated into one series giving the baseline (background) disease incidence and a second series giving disease outbreaks. One series is shifted forward by twelve months and the two are then recombined, giving a realistic series in which it is known where outbreaks have been added. The metrics used to evaluate performance include a scoring rule that appropriately balances sensitivity against specificity and is sensitive to variation in probabilities near 1. In the context of disease surveillance, a scoring rule can be adapted to reflect the size of outbreaks and this was done. Results indicate that the two new algorithms are comparable to each other and better than the algorithm they were designed to replace. PMID:27513749
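The evaluation idea above, scoring weekly alarm probabilities against the weeks where outbreaks are known to have been injected, can be sketched with a simple proper scoring rule. The Brier score, the truth vector and both algorithms' outputs below are invented for illustration; the paper itself uses a related scoring rule adapted to outbreak size.

```python
# Hedged sketch: compare two outbreak-detection algorithms by scoring
# their weekly alarm probabilities against known injected outbreaks.
# All data and the "old"/"new" algorithms are invented assumptions.

def brier(probs, outcomes):
    """Mean squared difference between alarm probabilities and the known
    0/1 outbreak indicator; lower is better."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

# known positions of the injected outbreaks (1 = outbreak week)
truth = [0, 0, 1, 0, 0, 1, 1, 0]

# two hypothetical algorithms' weekly alarm probabilities
old_algo = [0.2, 0.1, 0.6, 0.3, 0.2, 0.5, 0.4, 0.1]
new_algo = [0.1, 0.0, 0.9, 0.1, 0.1, 0.8, 0.7, 0.1]

print(brier(old_algo, truth) > brier(new_algo, truth))  # True: new scores better
```

A proper scoring rule of this kind rewards calibrated probabilities, penalizing both missed outbreaks and false alarms in a single number, which is why it can balance sensitivity against specificity.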
Levine-Wissing, Robin
2012-01-01
All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep
Lee, Jae Eun; Sung, Jung Hye; Malouhi, Mohamad
2015-12-22
There is abundant evidence that neighborhood characteristics are significantly linked to the health of the inhabitants of a given space within a given time frame. This study is to statistically validate a web-based GIS application designed to support cardiovascular-related research developed by the NIH funded Research Centers in Minority Institutions (RCMI) Translational Research Network (RTRN) Data Coordinating Center (DCC) and discuss its applicability to cardiovascular studies. Geo-referencing, geocoding and geospatial analyses were conducted for 500 randomly selected home addresses in a U.S. southeastern Metropolitan area. The correlation coefficient, factor analysis and Cronbach's alpha (α) were estimated to quantify measures of the internal consistency, reliability and construct/criterion/discriminant validity of the cardiovascular-related geospatial variables (walk score, number of hospitals, fast food restaurants, parks and sidewalks). Cronbach's α for CVD GEOSPATIAL variables was 95.5%, implying successful internal consistency. Walk scores were significantly correlated with number of hospitals (r = 0.715; p < 0.0001), fast food restaurants (r = 0.729; p < 0.0001), parks (r = 0.773; p < 0.0001) and sidewalks (r = 0.648; p < 0.0001) within a mile from homes. Geospatial data generated by the web-based application were internally consistent and demonstrated satisfactory validity. Therefore, the GIS application may be useful to apply to cardiovascular-related studies aimed to investigate potential impact of geospatial factors on diseases and/or the long-term effect of clinical trials.
A concept of customer–provider relation monitoring system solution
Directory of Open Access Journals (Sweden)
Naděžda Chalupová
2008-01-01
Full Text Available The contribution deals with the design of a customer–provider relationship monitoring system solution with regard to the needs of business managers and analysts and to the possibilities of contemporary information and communication technologies. Attention is given to targeted modelling, which makes it possible to gain a better overview of what is taking place in the relationship. The paper then describes the functionality of the analytical systems that produce these strategically valuable models, the so-called business intelligence tools. It further deals with modern technologies conducive to the implementation of the above-mentioned system: the Ajax concept and several XML applications, namely PMML for handling analytical models, XSLT for transforming XML data into various formats, SVG for representing statistical graphs, and MathML for describing mathematical formulas created in analytical systems. On this basis it proposes a technological solution for some parts of a client–provider relationship monitoring and evaluation system and discusses its potential advantages and the problems which can occur.
Traffic and related self-driven many-particle systems
Helbing, Dirk
2001-10-01
Since the subject of traffic dynamics has captured the interest of physicists, many surprising effects have been revealed and explained. Some of the questions now understood are the following: Why are vehicles sometimes stopped by ``phantom traffic jams'' even though drivers all like to drive fast? What are the mechanisms behind stop-and-go traffic? Why are there several different kinds of congestion, and how are they related? Why do most traffic jams occur considerably before the road capacity is reached? Can a temporary reduction in the volume of traffic cause a lasting traffic jam? Under which conditions can speed limits speed up traffic? Why do pedestrians moving in opposite directions normally organize into lanes, while similar systems ``freeze by heating''? All of these questions have been answered by applying and extending methods from statistical physics and nonlinear dynamics to self-driven many-particle systems. This article considers the empirical data and then reviews the main approaches to modeling pedestrian and vehicle traffic. These include microscopic (particle-based), mesoscopic (gas-kinetic), and macroscopic (fluid-dynamic) models. Attention is also paid to the formulation of a micro-macro link, to aspects of universality, and to other unifying concepts, such as a general modeling framework for self-driven many-particle systems, including spin systems. While the primary focus is upon vehicle and pedestrian traffic, applications to biological or socio-economic systems such as bacterial colonies, flocks of birds, panics, and stock market dynamics are touched upon as well.
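The microscopic (particle-based) model class reviewed above can be given a minimal sketch: each driver relaxes toward an "optimal velocity" set by the gap to the car ahead, on a circular road. The functional form and all parameter values below are assumptions for illustration, not taken from the article.

```python
# Toy car-following (microscopic) traffic model: each car accelerates
# toward an optimal velocity determined by its headway. Parameters are
# invented; real optimal-velocity models are calibrated to data.

import math

def optimal_velocity(gap, v_max=30.0, d0=25.0):
    # monotone gap->speed relation: crawl when close, approach v_max when free
    return v_max * (math.tanh((gap - d0) / 10.0) + math.tanh(d0 / 10.0)) / (1 + math.tanh(d0 / 10.0))

def step(pos, vel, tau=0.5, dt=0.1, road=1000.0):
    n = len(pos)
    new_vel = []
    for i in range(n):
        gap = (pos[(i + 1) % n] - pos[i]) % road  # circular road
        a = (optimal_velocity(gap) - vel[i]) / tau
        new_vel.append(vel[i] + a * dt)
    new_pos = [(p + v * dt) % road for p, v in zip(pos, new_vel)]
    return new_pos, new_vel

# 10 cars evenly spaced, starting from rest, relaxing toward steady flow
pos = [i * 100.0 for i in range(10)]
vel = [0.0] * 10
for _ in range(2000):
    pos, vel = step(pos, vel)
print(all(v > 0 for v in vel))  # every car ends up moving
```

From a homogeneous start the flow relaxes to free driving; perturbing one car's gap is the standard way such models are made to exhibit the stop-and-go waves discussed in the article.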
Version E2 from Dimco-System for the statistical calculation of components
International Nuclear Information System (INIS)
Moreno Gonzalez, A.
1981-01-01
A short description of the general Dimco system, together with a detailed description of the E2 version, is presented. The E2 version is a two-dimensional finite element structural code. To illustrate the possibilities of the E2 version, some results obtained with this new version are presented. These results concern the following material behaviours: a) elastic, b) thermo-elastic, c) plastic and d) creep. (author)
Rodríguez, Yeinzon; Beltrán Almeida, Juan P.; Valenzuela-Toledo, César A.
2013-04-01
We present the different consistency relations that can be seen as variations of the well known Suyama-Yamaguchi (SY) consistency relation τNL ≥ ((6/5)fNL)², the latter involving the levels of non-gaussianity fNL and τNL in the primordial curvature perturbation ζ. It has been (implicitly) claimed that the following variation: τNL(k1,k3) ≥ (6/5)²fNL(k1)fNL(k3), which we call ``the fourth variety'', in the collapsed (for τNL) and squeezed (for fNL) limits is always satisfied independently of any physics; however, the proof depends sensitively on the assumption of scale-invariance (expressing in this way the fourth variety of the SY consistency relation as τNL ≥ ((6/5)fNL)²), which only applies for cosmological models involving Lorentz-invariant scalar fields (at least at tree level), leaving room for a strong violation of this variety of the consistency relation when non-trivial degrees of freedom, for instance vector fields, are in charge of the generation of the primordial curvature perturbation. With this in mind as a motivation, we explicitly state, in the first part of this work, under which conditions the SY consistency relation has been claimed to hold in its different varieties (implicitly) presented in the literature since its inception back in 2008; as a result, we show for the first time that the variety τNL(k1,k1) ≥ ((6/5)fNL(k1))², which we call ``the fifth variety'', is always satisfied even when there is strong scale-dependence and high levels of statistical anisotropy, as long as statistical homogeneity holds: thus, an observed violation of this specific variety would prevent the comparison between theory and observation, shaking in this way the foundations of cosmology as a science. In the second part, we are concerned with the existence of non-trivial degrees of freedom, concretely vector fields, for which the levels of non-gaussianity have been calculated for very few models; among them, and by making use of the δN formalism at tree level, we study a class
International Nuclear Information System (INIS)
Rodríguez, Yeinzon; Almeida, Juan P. Beltrán; Valenzuela-Toledo, César A.
2013-01-01
We present the different consistency relations that can be seen as variations of the well known Suyama-Yamaguchi (SY) consistency relation τNL ≥ ((6/5)fNL)², the latter involving the levels of non-gaussianity fNL and τNL in the primordial curvature perturbation ζ. It has been (implicitly) claimed that the following variation: τNL(k1,k3) ≥ (6/5)²fNL(k1)fNL(k3), which we call ''the fourth variety'', in the collapsed (for τNL) and squeezed (for fNL) limits is always satisfied independently of any physics; however, the proof depends sensitively on the assumption of scale-invariance (expressing in this way the fourth variety of the SY consistency relation as τNL ≥ ((6/5)fNL)²), which only applies for cosmological models involving Lorentz-invariant scalar fields (at least at tree level), leaving room for a strong violation of this variety of the consistency relation when non-trivial degrees of freedom, for instance vector fields, are in charge of the generation of the primordial curvature perturbation. With this in mind as a motivation, we explicitly state, in the first part of this work, under which conditions the SY consistency relation has been claimed to hold in its different varieties (implicitly) presented in the literature since its inception back in 2008; as a result, we show for the first time that the variety τNL(k1,k1) ≥ ((6/5)fNL(k1))², which we call ''the fifth variety'', is always satisfied even when there is strong scale-dependence and high levels of statistical anisotropy, as long as statistical homogeneity holds: thus, an observed violation of this specific variety would prevent the comparison between theory and observation, shaking in this way the foundations of cosmology as a science. In the second part, we are concerned with the existence of non-trivial degrees of freedom, concretely vector fields, for which the levels of non-gaussianity have been calculated for very few models; among them, and by making use of
Directory of Open Access Journals (Sweden)
Rui Xu
2013-01-01
Full Text Available Minimum description length (MDL) based group-wise registration was a state-of-the-art method to determine the corresponding points of 3D shapes for the construction of statistical shape models (SSMs). However, it suffered from the problem that the determined corresponding points did not spread uniformly over the original shapes, since corresponding points were obtained by uniformly sampling the aligned shape on the parameterized space of the unit sphere. We proposed a particle-system based method to obtain adaptive sampling positions on the unit sphere to resolve this problem. Here, a set of particles was placed on the unit sphere to construct a particle system whose energy was related to the distortions of the parameterized meshes. By minimizing this energy, each particle was moved on the unit sphere. When the system became steady, particles were treated as vertices to build a spherical mesh, which was then relaxed to slightly adjust vertices to obtain optimal sampling positions. We used 47 cases of lungs (left and right) and 50 cases of livers, kidneys (left and right), and spleens for evaluation. Experiments showed that the proposed method was able to resolve the problem of the original MDL method, and that it performed better in the generalization and specificity tests.
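The particle-system idea can be illustrated with a toy sketch: points on the unit sphere repel one another and are re-projected onto the sphere, spreading toward a more uniform sampling. The 1/r repulsive potential and the step size below are assumptions; in the paper the energy is tied to the distortion of the parameterized meshes rather than to plain repulsion.

```python
# Toy particle system on the unit sphere: pairwise repulsion followed by
# re-projection spreads the points out, increasing the minimum spacing.
# Energy function, particle count and step size are illustrative only.

import math, random

random.seed(0)

def normalize(p):
    n = math.sqrt(sum(c * c for c in p))
    return tuple(c / n for c in p)

def min_pairwise_dist(pts):
    return min(math.dist(a, b) for i, a in enumerate(pts) for b in pts[i + 1:])

pts = [normalize((random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1)))
       for _ in range(20)]
before = min_pairwise_dist(pts)

for _ in range(200):
    forces = []
    for i, p in enumerate(pts):
        f = [0.0, 0.0, 0.0]
        for j, q in enumerate(pts):
            if i == j:
                continue
            d = [a - b for a, b in zip(p, q)]
            r2 = sum(c * c for c in d) + 1e-12
            for k in range(3):
                f[k] += d[k] / r2          # 1/r repulsive potential
        forces.append(f)
    pts = [normalize(tuple(c + 0.002 * fc for c, fc in zip(p, f)))
           for p, f in zip(pts, forces)]

after = min_pairwise_dist(pts)
print(after > before)
```

In the paper the steady-state particles then become vertices of a spherical mesh used for resampling; here the point is only that energy minimization on the sphere yields adaptive, more uniform sampling positions.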
International Nuclear Information System (INIS)
Max Morris
2001-01-01
Recent advances in sensor technology and engineering have made it possible to assemble many related sensors in a common array, often of small physical size. Sensor arrays may report an entire vector of measured values in each data collection cycle, typically one value per sensor per sampling time. The larger quantities of data provided by larger arrays certainly contain more information; however, in some cases experience suggests that dramatic increases in array size do not always lead to corresponding improvements in the practical value of the data. The work leading to this report was motivated by the need to develop computational planning tools to approximate the relative effectiveness of arrays of different size (or scale) in a wide variety of contexts. The basis of the work is a statistical model of a generic sensor array. It includes features representing measurement error, both common to all sensors and independent from sensor to sensor, and the stochastic relationships between the quantities to be measured by the sensors. The model can be used to assess the effectiveness of hypothetical arrays in classifying objects or events from two classes. A computer program is presented for evaluating the misclassification rates which can be expected when arrays are calibrated using a given number of training samples, or the number of training samples required to attain a given level of classification accuracy. The program is also available via email from the first author for a limited time
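The report's generic model, measurement error partly common to the whole array and partly independent per sensor, can be mimicked in a toy simulation. Averaging more sensors cancels only the independent part, so the common noise caps the benefit of scaling up the array; all distributions and numbers below are invented.

```python
# Toy two-class sensor-array model: reading = class signal + common
# noise + sensor-specific noise; classify by thresholding the array
# mean. Illustrates why a bigger array helps only up to a point.

import random

random.seed(1)

def misclass_rate(n_sensors, common_sd=0.5, indep_sd=1.0, trials=4000):
    errors = 0
    for _ in range(trials):
        true_class = random.choice([0, 1])
        signal = 1.0 if true_class else 0.0
        common = random.gauss(0, common_sd)      # shared across the array
        reads = [signal + common + random.gauss(0, indep_sd)
                 for _ in range(n_sensors)]
        decided = 1 if sum(reads) / n_sensors > 0.5 else 0
        errors += decided != true_class
    return errors / trials

small, large = misclass_rate(2), misclass_rate(50)
print(small > large)        # bigger array helps...
print(large > 0.1)          # ...but common noise keeps error well above zero
```

With 50 sensors the independent noise averages down to a standard deviation of about 0.14, but the common component of 0.5 remains, so the error rate plateaus; this is the qualitative effect the report's planning tools quantify.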
Directory of Open Access Journals (Sweden)
Jordan MI
2006-05-01
Full Text Available Abstract Background The statistical modeling of biomedical corpora could yield integrated, coarse-to-fine views of biological phenomena that complement discoveries made from analysis of molecular sequence and profiling data. Here, the potential of such modeling is demonstrated by examining the 5,225 free-text items in the Caenorhabditis Genetic Center (CGC) Bibliography using techniques from statistical information retrieval. Items in the CGC biomedical text corpus were modeled using the Latent Dirichlet Allocation (LDA) model. LDA is a hierarchical Bayesian model which represents a document as a random mixture over latent topics; each topic is characterized by a distribution over words. Results An LDA model estimated from CGC items had better predictive performance than two standard models (unigram and mixture of unigrams) trained using the same data. To illustrate the practical utility of LDA models of biomedical corpora, a trained CGC LDA model was used for a retrospective study of nematode genes known to be associated with life span modification. Corpus-, document-, and word-level LDA parameters were combined with terms from the Gene Ontology to enhance the explanatory value of the CGC LDA model, and to suggest additional candidates for age-related genes. A novel, pairwise document similarity measure based on the posterior distribution on the topic simplex was formulated and used to search the CGC database for "homologs" of a "query" document discussing the life span-modifying clk-2 gene. Inspection of these document homologs enabled and facilitated the production of hypotheses about the function and role of clk-2. Conclusion Like other graphical models for genetic, genomic and other types of biological data, LDA provides a method for extracting unanticipated insights and generating predictions amenable to subsequent experimental validation.
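A document-similarity measure on the topic simplex, in the spirit of the pairwise measure described, can be sketched with the Hellinger distance between topic mixtures. The paper's measure is based on the posterior distribution over the simplex; the topic vectors below are invented for illustration.

```python
# Hedged sketch: rank documents by the Hellinger distance between their
# LDA topic distributions. Topic mixtures here are made up; in practice
# they come from a fitted LDA model's per-document posterior.

import math

def hellinger(p, q):
    """Hellinger distance between two discrete distributions
    (0 = identical, 1 = disjoint support)."""
    return math.sqrt(0.5 * sum((math.sqrt(a) - math.sqrt(b)) ** 2
                               for a, b in zip(p, q)))

query   = [0.70, 0.20, 0.05, 0.05]   # e.g. a clk-2 / life-span document
homolog = [0.60, 0.25, 0.10, 0.05]   # similar topic mixture
other   = [0.05, 0.05, 0.20, 0.70]   # unrelated document

print(hellinger(query, homolog) < hellinger(query, other))  # True
```

Sorting a corpus by this distance from a query document gives a ranked list of "document homologs" of the kind the study inspected for clk-2.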
A Framework for Structural Systems Based on the Principles of Statistical Mechanics
Directory of Open Access Journals (Sweden)
Rabindranath Andujar
2014-11-01
Full Text Available A framework is proposed in which certain well-known concepts of statistical mechanics and thermodynamics can be used and applied to characterize structural systems of interconnected Timoshenko beam elements. We first make the assimilation to a network of nodes linked by potential energy functions that are derived from the stiffness properties of the beams. Then we define a series of thermodynamic quantities inherent to a given structure (i.e., internal energy, heat, pressure, temperature, entropy, and kinetic energy). With the exception of entropy, all of them have the dimensions of energy. In order to test this new framework, a series of experiments was performed on four structural specimens within the elastic regime. Their configurations were taken from the seismic regulations known as Eurocode 8 in order to have a well-founded reference for our comparisons. The results are then explained within this new framework. Very interesting correlations have been found between the parameters given in the code and our concepts.
Information Graph Flow: A Geometric Approximation of Quantum and Statistical Systems
Vanchurin, Vitaly
2018-05-01
Given a quantum (or statistical) system with a very large number of degrees of freedom and a preferred tensor product factorization of the Hilbert space (or of a space of distributions) we describe how it can be approximated with a very low-dimensional field theory with geometric degrees of freedom. The geometric approximation procedure consists of three steps. The first step is to construct weighted graphs (which we call information graphs) with vertices representing subsystems (e.g., qubits or random variables) and edges representing mutual information (or the flow of information) between subsystems. The second step is to deform the adjacency matrices of the information graphs to that of a (locally) low-dimensional lattice using the graph flow equations introduced in the paper. (Note that the graph flow produces very sparse adjacency matrices and thus might also be used, for example, in machine learning or network science where the task of graph sparsification is of central importance.) The third step is to define an emergent metric and to derive an effective description of the metric and possibly other degrees of freedom. To illustrate the procedure we analyze (numerically and analytically) two information graph flows with geometric attractors (towards locally one- and two-dimensional lattices) and metric perturbations obeying a geometric flow equation. Our analysis also suggests a possible approach to (a non-perturbative) quantum gravity in which the geometry (a secondary object) emerges directly from a quantum state (a primary object) due to the flow of the information graphs.
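The first step of the procedure, building an information graph whose edge weights are the mutual information between subsystems, can be sketched for three binary "subsystems". The data and variable names are invented, and a simple plug-in estimate of mutual information is used.

```python
# Toy information graph over three binary random variables: X and Y are
# strongly correlated, Z is independent, so the X-Y edge carries much
# more mutual information than the X-Z edge. All data are synthetic.

import math, random
from collections import Counter

random.seed(2)

def mutual_information(xs, ys):
    """Plug-in mutual information estimate in nats (>= 0)."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        pj = c / n
        mi += pj * math.log(pj / ((px[x] / n) * (py[y] / n)))
    return mi

X = [random.choice([0, 1]) for _ in range(5000)]
Y = [x if random.random() < 0.9 else 1 - x for x in X]   # strongly tied to X
Z = [random.choice([0, 1]) for _ in range(5000)]         # independent of X

# edge weights of the information graph over subsystems {X, Y, Z}
print(mutual_information(X, Y) > mutual_information(X, Z))  # True
```

The resulting weighted adjacency matrix is what the paper's graph flow equations would then deform toward a locally low-dimensional lattice.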
Jegadeeshwaran, R.; Sugumaran, V.
2015-02-01
Hydraulic brakes in automobiles are important components for the safety of passengers; therefore, the brakes are a good subject for condition monitoring. The condition of the brake components can be monitored by using their vibration characteristics. On-line condition monitoring using a machine learning approach is proposed in this paper as a possible solution to such problems. The vibration signals for both good and faulty conditions of the brakes were acquired from a hydraulic brake test setup with the help of a piezoelectric transducer and a data acquisition system. Descriptive statistical features were extracted from the acquired vibration signals, and feature selection was carried out using the C4.5 decision tree algorithm. There is no specific method to find the right number of features required for classification for a given problem; hence an extensive study is needed to find the optimum number of features. The effect of the number of features was also studied, using the decision tree as well as Support Vector Machines (SVM). The selected features were classified using the C-SVM and Nu-SVM with different kernel functions. The results are discussed and the conclusion of the study is presented.
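The descriptive statistical features mentioned, of the kind typically extracted from vibration signals before feature selection, can be sketched as follows. The feature set and the signals are illustrative assumptions, not the paper's data.

```python
# Sketch: descriptive statistical features of a vibration signal.
# A "faulty" signal with impulsive spikes raises kurtosis relative to
# a smooth sine; the actual paper selects among such features with C4.5.

import math

def features(signal):
    n = len(signal)
    mean = sum(signal) / n
    var = sum((x - mean) ** 2 for x in signal) / n
    sd = math.sqrt(var)
    skew = sum((x - mean) ** 3 for x in signal) / (n * sd ** 3)
    kurt = sum((x - mean) ** 4 for x in signal) / (n * sd ** 4)
    rms = math.sqrt(sum(x * x for x in signal) / n)
    return {"mean": mean, "sd": sd, "skewness": skew,
            "kurtosis": kurt, "rms": rms}

# a healthy-sounding sine vs. a "faulty" signal with impulsive spikes
good = [math.sin(0.1 * t) for t in range(1000)]
faulty = [x + (3.0 if t % 100 == 0 else 0.0) for t, x in enumerate(good)]

print(features(faulty)["kurtosis"] > features(good)["kurtosis"])  # spikes raise kurtosis
```

Feature vectors like these, one per signal window, are what a decision tree or SVM would then classify into good and faulty brake conditions.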
Full counting statistics in a serially coupled double quantum dot system with spin-orbit coupling
Wang, Qiang; Xue, Hai-Bin; Xie, Hai-Qing
2018-04-01
We study the full counting statistics of electron transport through a serially coupled double quantum dot (QD) system with spin-orbit coupling (SOC) weakly coupled to two electrodes. We demonstrate that the spin polarizations of the source and drain electrodes determine whether the shot noise maintains a super-Poissonian distribution, and whether the sign transitions of the skewness from positive to negative values and of the kurtosis from negative to positive values take place. In particular, the interplay between the spin polarizations of the source and drain electrodes and the magnitude of the external magnetic field can give rise to a gate-voltage-tunable strong negative differential conductance (NDC), and the shot noise in this NDC region is significantly enhanced. Importantly, for a given SOC parameter, the obvious variation of the high-order current cumulants as a function of the energy-level detuning in a certain range, especially the dip position of the Fano factor of the skewness, can be used to qualitatively extract information about the magnitude of the SOC.
SSD for R: A Comprehensive Statistical Package to Analyze Single-System Data
Auerbach, Charles; Schudrich, Wendy Zeitlin
2013-01-01
The need for statistical analysis in single-subject designs presents a challenge, as analytical methods that are applied to group comparison studies are often not appropriate in single-subject research. "SSD for R" is a robust set of statistical functions with wide applicability to single-subject research. It is a comprehensive package…
Long-term strategy for the statistical design of a forest health monitoring system
Hans T. Schreuder; Raymond L. Czaplewski
1993-01-01
A conceptual framework is given for a broad-scale survey of forest health that accomplishes three objectives: generate descriptive statistics; detect changes in such statistics; and simplify analytical inferences that identify, and possibly establish cause-effect relationships. Our paper discusses the development of sampling schemes to satisfy these three objectives,...
Danel, Isabella; Bortman, Marcelo
2008-01-01
Vital records, the registration of births, deaths, marriages and divorces, and the vital statistics derived from these records serve two important purposes. Firstly, vital records are legal documents; secondly, and this is the focus of this review, vital records are used to create the demographic and epidemiological statistics that are used in monitoring trends and developing health policies and programs....
Ishihara, Masamichi
2018-04-01
We studied the effects of nonextensivity on the phase transition for a system of finite volume V in the ϕ⁴ theory in the Tsallis nonextensive statistics of entropic parameter q and temperature T, when the deviation from the Boltzmann-Gibbs (BG) statistics, |q − 1|, is small. We calculated the condensate and the effective mass to order q − 1 with the normalized q-expectation value under the free particle approximation with zero bare mass. The following facts were found. The condensate Φ divided by v, Φ/v, at q (where v is the value of the condensate at T = 0) is smaller than that at q′ for q > q′ as a function of Tph/v, the physical temperature Tph divided by v. The physical temperature Tph is related to the variation of the Tsallis entropy and the variation of the internal energies, and Tph at q = 1 coincides with T. The effective mass decreases, reaches a minimum, and then increases as Tph increases. The effective mass at q > 1 is lighter than the effective mass at q = 1 at low physical temperature and heavier than the effective mass at q = 1 at high physical temperature. The effects of the nonextensivity on a physical quantity as a function of Tph become strong as |q − 1| increases. The results indicate the significance of the definition of the expectation value, the definition of the physical temperature, and the constraints on the density operator, when the terms including the volume of the system are not negligible.
2017-12-08
STATISTICAL LINEAR TIME-VARYING SYSTEM MODEL OF HIGH GRAZING ANGLE SEA CLUTTER FOR COMPUTING INTERFERENCE POWER. 1. INTRODUCTION. Statistical linear time... beam. We can approximate one of the sinc factors using the Dirichlet kernel to facilitate computation of the integral in (6) as follows: |sinc(WB... plotted in Figure 4. The resultant autocorrelation can then be found by substituting (18) into (28). The Python code used to generate Figures 1-4 is found...
Cochran, Susan D; Drescher, Jack; Kismödi, Eszter; Giami, Alain; García-Moreno, Claudia; Atalla, Elham; Marais, Adele; Vieira, Elisabeth Meloni; Reed, Geoffrey M
2014-09-01
The World Health Organization is developing the 11th revision of the International Statistical Classification of Diseases and Related Health Problems (ICD-11), planned for publication in 2017. The Working Group on the Classification of Sexual Disorders and Sexual Health was charged with reviewing and making recommendations on disease categories related to sexuality in the chapter on mental and behavioural disorders in the 10th revision (ICD-10), published in 1990. This chapter includes categories for diagnoses based primarily on sexual orientation even though ICD-10 states that sexual orientation alone is not a disorder. This article reviews the scientific evidence and clinical rationale for continuing to include these categories in the ICD. A review of the evidence published since 1990 found little scientific interest in these categories. In addition, the Working Group found no evidence that they are clinically useful: they neither contribute to health service delivery or treatment selection nor provide essential information for public health surveillance. Moreover, use of these categories may create unnecessary harm by delaying accurate diagnosis and treatment. The Working Group recommends that these categories be deleted entirely from ICD-11. Health concerns related to sexual orientation can be better addressed using other ICD categories.
The κ parameter and κ-distribution in κ-deformed statistics for the systems in an external field
International Nuclear Information System (INIS)
Guo, Lina; Du, Jiulin
2007-01-01
It is a natural and important question to ask under what physical circumstances the κ-deformed statistics is suitable for the statistical description of a system, and what the κ parameter stands for. In this Letter, a formula expression for the κ parameter is derived on the basis of the κ-H theorem, the κ-velocity distribution and the generalized Boltzmann equation in the framework of κ-deformed statistics. We thus obtain a physical interpretation for the parameter κ ≠ 0 with regard to the temperature gradient and the external force field. We show that, like the q-statistics based on Tsallis entropy, the κ-deformed statistics may also be a candidate suitable for the statistical description of systems in external fields in a nonequilibrium stationary state, but with different physical characteristics. Namely, the κ-distribution is found to describe the nonequilibrium stationary state of a system in which the external force is perpendicular to the temperature gradient
Cohen, Alan A; Milot, Emmanuel; Yong, Jian; Seplaki, Christopher L; Fülöp, Tamàs; Bandeen-Roche, Karen; Fried, Linda P
2013-03-01
Previous studies have identified many biomarkers that are associated with aging and related outcomes, but the relevance of these markers for underlying processes and their relationship to hypothesized systemic dysregulation is not clear. We address this gap by presenting a novel method for measuring dysregulation via the joint distribution of multiple biomarkers and assessing associations of dysregulation with age and mortality. Using longitudinal data from the Women's Health and Aging Study, we selected a 14-marker subset from 63 blood measures: those that diverged from the baseline population mean with age. For the 14 markers and all combinatorial sub-subsets we calculated a multivariate distance called the Mahalanobis distance (MHBD) for all observations, indicating how "strange" each individual's biomarker profile was relative to the baseline population mean. In most models, MHBD correlated positively with age, MHBD increased within individuals over time, and higher MHBD predicted higher risk of subsequent mortality. Predictive power increased as more variables were incorporated into the calculation of MHBD. Biomarkers from multiple systems were implicated. These results support hypotheses of simultaneous dysregulation in multiple systems and confirm the need for longitudinal, multivariate approaches to understanding biomarkers in aging. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
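The MHBD idea can be reduced to a two-marker sketch: the Mahalanobis distance measures how "strange" a biomarker profile is relative to a reference population, accounting for the variance of and correlation between markers. The reference data and the two-marker restriction are assumptions; the study uses up to 14 markers.

```python
# Toy Mahalanobis distance for two correlated biomarkers: how far an
# individual's profile sits from the baseline population mean, in units
# that respect the joint distribution. Reference data are invented.

def mean(xs):
    return sum(xs) / len(xs)

def mahalanobis2(point, a, b):
    """Mahalanobis distance for 2 markers (2x2 covariance inverse written out)."""
    ma, mb, n = mean(a), mean(b), len(a)
    saa = sum((x - ma) ** 2 for x in a) / (n - 1)
    sbb = sum((y - mb) ** 2 for y in b) / (n - 1)
    sab = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (n - 1)
    det = saa * sbb - sab ** 2
    dx, dy = point[0] - ma, point[1] - mb
    d2 = (sbb * dx * dx - 2 * sab * dx * dy + saa * dy * dy) / det
    return d2 ** 0.5

# hypothetical baseline population: two biomarkers per person
marker1 = [4.1, 4.5, 3.9, 4.2, 4.4, 4.0, 4.3, 4.1]
marker2 = [120, 135, 118, 124, 132, 119, 128, 122]

typical = mahalanobis2((4.2, 125), marker1, marker2)
strange = mahalanobis2((3.0, 160), marker1, marker2)
print(strange > typical)  # the dysregulated profile is farther from the norm
```

Because the distance uses the joint covariance, a profile can be flagged as dysregulated even when each marker alone looks unremarkable, which is the study's argument for a multivariate rather than marker-by-marker approach.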
Jelly Views : Extending Relational Database Systems Toward Deductive Database Systems
Directory of Open Access Journals (Sweden)
Igor Wojnicki
2004-01-01
Full Text Available This paper regards the Jelly View technology, which provides a new, practical methodology for knowledge decomposition, storage, and retrieval within Relational Database Management Systems (RDBMS). Intensional Knowledge clauses (rules) are decomposed and stored in the RDBMS, forming reusable components. The results of the rule-based processing are visible as regular views, accessible through SQL. From the end-user's point of view, the processing capability becomes unlimited (arbitrarily complex queries can be constructed using Intensional Knowledge), while the outermost queries are expressed in standard SQL. The RDBMS functionality thus becomes extended toward that of Deductive Databases
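As an illustration of the general idea (not the Jelly View implementation itself), a recursive rule such as `ancestor(X,Z) :- parent(X,Y), ancestor(Y,Z)` can be exposed to SQL users as an ordinary view; this sketch uses Python's built-in sqlite3 with hypothetical table and view names:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE parent(child TEXT, parent TEXT);
INSERT INTO parent VALUES ('c', 'b'), ('b', 'a');

-- The rule set  ancestor(X,Y) :- parent(X,Y).
--               ancestor(X,Z) :- parent(X,Y), ancestor(Y,Z).
-- exposed to SQL users as a regular view:
CREATE VIEW ancestor AS
WITH RECURSIVE anc(child, anc_of) AS (
    SELECT child, parent FROM parent
    UNION
    SELECT p.child, a.anc_of FROM parent p JOIN anc a ON p.parent = a.child
)
SELECT child, anc_of FROM anc;
""")

rows = sorted(con.execute("SELECT child, anc_of FROM ancestor").fetchall())
print(rows)
```

The end user queries `ancestor` with plain SQL and never sees the recursion, which mirrors the paper's point that rule-based processing can hide behind regular views.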
Directory of Open Access Journals (Sweden)
Jae Eun Lee
2015-12-01
Full Text Available Purpose: There is abundant evidence that neighborhood characteristics are significantly linked to the health of the inhabitants of a given space within a given time frame. This study statistically validates a web-based GIS application designed to support cardiovascular-related research, developed by the NIH-funded Research Centers in Minority Institutions (RCMI) Translational Research Network (RTRN) Data Coordinating Center (DCC), and discusses its applicability to cardiovascular studies. Methods: Geo-referencing, geocoding and geospatial analyses were conducted for 500 randomly selected home addresses in a U.S. southeastern metropolitan area. The correlation coefficient, factor analysis and Cronbach's alpha (α) were estimated to quantify the internal consistency, reliability and construct/criterion/discriminant validity of the cardiovascular-related geospatial variables (walk score; numbers of hospitals, fast food restaurants, parks and sidewalks). Results: Cronbach's α for the CVD geospatial variables was 95.5%, implying successful internal consistency. Walk scores were significantly correlated with the number of hospitals (r = 0.715; p < 0.0001), fast food restaurants (r = 0.729; p < 0.0001), parks (r = 0.773; p < 0.0001) and sidewalks (r = 0.648; p < 0.0001) within a mile from homes. They were also significantly associated with the diversity index (r = 0.138, p = 0.0023), median household income (r = −0.181; p < 0.0001), and owner-occupancy rate (r = −0.440; p < 0.0001). However, no significant correlation was found with median age, vulnerability, unemployment rate, labor force, or population growth rate. Conclusion: Our data demonstrate that the geospatial data generated by the web-based application were internally consistent and showed satisfactory validity. Therefore, the GIS application may be useful in cardiovascular-related studies that aim to investigate the potential impact of geospatial factors on diseases and/or the long
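Cronbach's α, used above to quantify internal consistency, is computed from the item variances and the variance of the summed scale. A minimal sketch with illustrative synthetic data (not the study's):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Items that share a common signal should be internally consistent (alpha near 1).
rng = np.random.default_rng(1)
signal = rng.normal(size=(200, 1))
items = signal + 0.1 * rng.normal(size=(200, 4))
alpha = cronbach_alpha(items)
print(round(alpha, 3))
```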
Directory of Open Access Journals (Sweden)
Mark Frogley
2013-01-01
Full Text Available To reduce maintenance cost, avoid catastrophic failure, and improve wind turbine transmission system reliability, an online condition monitoring system is critically important. In real applications, many rotating mechanical faults, such as bearing surface defects, gear tooth cracks and chipped gear teeth, generate impulsive signals. When such faults develop inside rotating machinery, each time the rotating components pass over the damage point an impact force is generated. The impact force causes a ringing of the support structure at its structural natural frequency. By effectively detecting those periodic impulse signals, one group of rotating machine faults can be detected and diagnosed. However, in real wind turbine operations, impulsive fault signals are usually weak relative to the background noise and the vibration signals generated by other, healthy components such as shafts, blades and gears. Moreover, wind turbine transmission systems work under dynamic operating conditions, which further increases the difficulty of fault detection and diagnostics. Therefore, advanced signal processing methods to enhance the impulsive signals are in great need. In this paper, an adaptive filtering technique is applied to enhance the signal-to-noise ratio of fault impulses in wind turbine gear transmission systems. Multiple statistical features designed to quantify the impulsive content of the processed signal are extracted for bearing fault detection. The multi-dimensional features are then transformed into a one-dimensional feature. A minimum-error-rate classifier is designed based on the compressed feature to identify gear transmission systems with defects. Real wind turbine vibration signals are used to demonstrate the effectiveness of the presented methodology.
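Two classic statistical features for quantifying impulsive content are the crest factor and kurtosis, both of which grow when periodic impacts are present. A minimal sketch with synthetic signals (illustrative only; the paper's actual feature set and classifier are not reproduced here):

```python
import numpy as np

def impulse_features(x):
    """Features that grow when a signal contains impulsive events."""
    x = np.asarray(x, dtype=float)
    rms = np.sqrt(np.mean(x ** 2))
    crest = np.max(np.abs(x)) / rms                     # crest factor
    kurt = np.mean((x - x.mean()) ** 4) / x.var() ** 2  # kurtosis (about 3 for Gaussian noise)
    return crest, kurt

rng = np.random.default_rng(2)
healthy = rng.normal(size=4096)        # background noise only
faulty = healthy.copy()
faulty[::512] += 8.0                   # periodic impacts, e.g. a bearing defect
print(impulse_features(healthy)[1], impulse_features(faulty)[1])
```

Even a few strong impacts per revolution push the kurtosis well above its Gaussian baseline, which is why such moments are popular fault indicators.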
Davidson, Norman
2003-01-01
Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody
Rose, Michael Benjamin
A novel trajectory and attitude control and navigation analysis tool for powered ascent is developed. The tool is capable of rapid trade-space analysis and is designed to ultimately reduce turnaround time for launch vehicle design, mission planning, and redesign work. It is streamlined to quickly determine trajectory and attitude control dispersions, propellant dispersions, orbit insertion dispersions, and navigation errors and their sensitivities to sensor errors, actuator execution uncertainties, and random disturbances. The tool is developed by applying both Monte Carlo and linear covariance analysis techniques to a closed-loop, launch vehicle guidance, navigation, and control (GN&C) system. The nonlinear dynamics and flight GN&C software models of a closed-loop, six-degree-of-freedom (6-DOF), Monte Carlo simulation are formulated and developed. The nominal reference trajectory (NRT) for the proposed lunar ascent trajectory is defined and generated. The Monte Carlo truth models and GN&C algorithms are linearized about the NRT, the linear covariance equations are formulated, and the linear covariance simulation is developed. The performance of the launch vehicle GN&C system is evaluated using both Monte Carlo and linear covariance techniques and their trajectory and attitude control dispersion, propellant dispersion, orbit insertion dispersion, and navigation error results are validated and compared. Statistical results from linear covariance analysis are generally within 10% of Monte Carlo results, and in most cases the differences are less than 5%. This is an excellent result given the many complex nonlinearities that are embedded in the ascent GN&C problem. Moreover, the real value of this tool lies in its speed, where the linear covariance simulation is 1036.62 times faster than the Monte Carlo simulation. Although the application and results presented are for a lunar, single-stage-to-orbit (SSTO), ascent vehicle, the tools, techniques, and mathematical
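The comparison of the two techniques can be illustrated on a toy problem: for linear dynamics with Gaussian initial dispersion, the analytically propagated covariance should match the Monte Carlo sample covariance to within sampling error. A minimal sketch (illustrative dynamics, not the ascent GN&C models):

```python
import numpy as np

# Linear discrete dynamics x1 = A x0 with an uncertain initial state.
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])    # e.g. one position/velocity propagation step
P0 = np.diag([0.5, 0.2])      # initial state covariance

# Linear covariance analysis: dispersions propagate as one matrix product.
P_lin = A @ P0 @ A.T

# Monte Carlo: propagate many random initial states, estimate the covariance.
rng = np.random.default_rng(3)
samples = rng.multivariate_normal(np.zeros(2), P0, size=200_000)
P_mc = np.cov(samples @ A.T, rowvar=False)

err = np.max(np.abs(P_mc - P_lin))  # should be within sampling error
print(err)
```

The speed advantage reported above comes from exactly this structure: linear covariance needs one deterministic propagation per step, while Monte Carlo needs one per sample.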
DEFF Research Database (Denmark)
Moilanen, A.; Sundström, L.; Pedersen, Jes Søe
2004-01-01
breeding system, mating system, parentage analysis, paternity assignment, polyandry, social insects...
System's flips in climate-related energy (CRE) systems
Ramos, Maria-Helena; Creutin, Jean-Dominique; Engeland, Kolbjørn; François, Baptiste; Renard, Benjamin
2014-05-01
Several modern environmental questions invite us to explore the complex relationships between natural phenomena and human behaviour at a range of space and time scales. This usually involves a number of cause-effect (causal) relationships linking actions and events. In lay terms, 'effect' can be defined as 'what happened' and 'cause' as 'why something happened.' In a changing world, or merely in moving from one scale to another, shifts in perspective are expected, bringing some phenomena into the foreground and pushing others to the background. Systems can thus flip from one set of causal structures to another in response to environmental perturbations and human innovations or behaviours, for instance as space-time signatures are modified. The identification of these flips helps in better understanding and predicting how societies and stakeholders react to a shift in perspective. In this study, our motivation is to investigate possible consequences of the shift to a low-carbon economy in terms of socio-technical systems' flips. The focus is on the regional production of Climate-Related Energy (CRE) (hydro-, wind- and solar power). We search for information on historic shifts that may help define the forcing conditions of abrupt changes and extreme situations. We identify and present a series of examples in which we try to distinguish the various tipping points, thresholds, breakpoints and regime shifts that are characteristic of complex systems in the CRE production domain. We expect that these examples will enrich our comprehension of the question, providing the elements needed to better validate modeling attempts and to predict and manage flips of complex CRE production systems. The work presented is part of the FP7 project COMPLEX (Knowledge based climate mitigation systems for a low carbon economy; http://www.complex.ac.uk/).
Balancing the books - a statistical theory of prospective budgets in Earth System science
O'Kane, J. Philip
An honest declaration of the error in a mass, momentum or energy balance, ɛ, simply raises the question of its acceptability: "At what value of ɛ is the attempted balance to be rejected?" Answering this question requires a reference quantity against which to compare ɛ. This quantity must be a mathematical function of all the data used in making the balance. To deliver this function, a theory grounded in a workable definition of acceptability is essential. A distinction must be drawn between a retrospective balance and a prospective budget in relation to any natural space-filling body. Balances look to the past; budgets look to the future. The theory is built on the application of classical sampling theory to the measurement and closure of a prospective budget. It satisfies R.A. Fisher's "vital requirement that the actual and physical conduct of experiments should govern the statistical procedure of their interpretation". It provides a test, which rejects, or fails to reject, the hypothesis that the closing error on the budget, when realised, was due to sampling error only. By increasing the number of measurements, the discrimination of the test can be improved, controlling both the precision and accuracy of the budget and its components. The cost-effective design of such measurement campaigns is discussed briefly. This analysis may also show when campaigns to close a budget on a particular space-filling body are not worth the effort for either scientific or economic reasons. Other approaches, such as those based on stochastic processes, lack this finality, because they fail to distinguish between different types of error in the mismatch between a set of realisations of the process and the measured data.
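Under the sampling-theory view described above, the test reduces to comparing the realised closing error with the standard error implied by the component measurements. A minimal sketch, assuming independent component estimates with known sampling variances (the budget terms and numbers are illustrative, not from the paper):

```python
import math

def closing_error_test(components, variances, z_crit=1.96):
    """Is a budget's closing error explained by sampling error alone?

    components : signed budget terms (inflows positive, outflows negative).
    variances  : sampling variance of each component estimate (assumed independent).
    Returns (epsilon, z, reject); reject=True means the closing error is too
    large to attribute to sampling error (two-sided test at the 5% level).
    """
    eps = sum(components)
    se = math.sqrt(sum(variances))
    z = eps / se
    return eps, z, abs(z) > z_crit

# Hypothetical water budget: precipitation - evaporation - runoff - storage change
eps, z, reject = closing_error_test([100.0, -60.0, -35.0, -4.0],
                                    [4.0, 2.0, 2.0, 1.0])
print(eps, z, reject)
```

Adding measurements shrinks the component variances, which tightens the reference standard error and sharpens the discrimination of the test, as the abstract notes.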
Luo, Li; Cheng, Xiaohua; Wang, Shiyuan; Zhang, Junxue; Zhu, Wenbo; Yang, Jiaying; Liu, Pei
2017-09-19
Blended learning that combines a modular object-oriented dynamic learning environment (Moodle) with face-to-face teaching was applied to a medical statistics course to improve learning outcomes and to evaluate the impact factors of students' knowledge, attitudes and practices (KAP) relating to e-learning. The same real-name questionnaire was administered before and after the intervention. The summed scores of each part (knowledge, attitude and practice) were calculated using the entropy method. A mixed linear model was fitted using the SAS PROC MIXED procedure to analyse the impact factors of KAP. Educational reform, self-perceived character, registered permanent residence and hours spent online per day were significant impact factors of e-learning knowledge. Introversion and middle-type respondents' average scores were higher than those of extroversion-type respondents. Regarding e-learning attitudes, educational reform, community number, Internet age and hours spent online per day had a significant impact. Specifically, participants whose Internet age was no greater than 6 years scored 7.00 points lower than those whose Internet age was greater than 10 years. Regarding e-learning behaviour, educational reform and parents' literacy had a significant impact, as the average score increased by 10.05 points. Blended learning thus improved students' e-learning KAP. Additionally, this type of blended course can be implemented in many other curricula.
The derivation and application of a risk related value of the spend for saving a statistical life
International Nuclear Information System (INIS)
Jackson, D; Stone, D; Butler, G G; McGlynn, G
2004-01-01
The concept of a risk-related value of the spend for saving a statistical life (VSSSL) is advanced for use in cost-benefit studies across the power generation sector, and the nuclear industry in particular. For illustrative purposes, a best-estimate VSSSL is set, based on HSE guidance, at £2M. Above a risk of 10⁻³ y⁻¹ it is assumed that the VSSSL may approach this maximum sustainable value. As the risk reduces, so does the VSSSL. At a risk level of 10⁻⁶ y⁻¹ a VSSSL of £0.5M is applied. For risks below 10⁻⁹ y⁻¹ the value of further risk reduction approaches zero, although a nominal VSSSL of £10k is applied as a pragmatic way forward in this study. The implications of adopting this concept as an aid to decision-making in determining the spend on radiological dose reduction measures are illustrated through a worked example with a banded approach to estimating collective dose
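One way to operationalise such a risk-banded VSSSL is to interpolate between the quoted anchor points on a logarithmic risk scale; the interpolation scheme below is an assumption for illustration, not taken from the paper:

```python
import math

# Anchor points quoted above: (annual individual risk, VSSSL in pounds).
ANCHORS = [(1e-9, 10_000), (1e-6, 500_000), (1e-3, 2_000_000)]

def vsssl(risk_per_year):
    """Risk-related VSSSL, interpolated linearly in log10(risk) between bands."""
    if risk_per_year <= ANCHORS[0][0]:
        return ANCHORS[0][1]           # nominal floor value
    if risk_per_year >= ANCHORS[-1][0]:
        return ANCHORS[-1][1]          # capped at the maximum sustainable value
    for (r0, v0), (r1, v1) in zip(ANCHORS, ANCHORS[1:]):
        if r0 <= risk_per_year <= r1:
            t = math.log10(risk_per_year / r0) / math.log10(r1 / r0)
            return v0 + t * (v1 - v0)

print(vsssl(1e-3), vsssl(1e-6), vsssl(1e-9))
```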
Gross, D. H. E.
1997-01-01
This review is addressed to colleagues working in different fields of physics who are interested in the concepts of microcanonical thermodynamics, its relation and contrast to ordinary canonical or grandcanonical thermodynamics, and who want a first taste of the wide area of new applications of thermodynamical concepts like hot nuclei, hot atomic clusters and gravitating systems. Microcanonical thermodynamics describes how the volume of the N-body phase space depends on the globally conserved quantities like energy, angular momentum, mass, charge, etc. Due to these constraints the microcanonical ensemble can behave quite differently from the conventional canonical or grandcanonical ensemble in many important physical systems. Microcanonical systems become inhomogeneous at first-order phase transitions, or with rising energy, or with external or internal long-range forces like Coulomb, centrifugal or gravitational forces. Thus, fragmentation of the system into a spatially inhomogeneous distribution of various regions of different densities and/or of different phases is a genuine characteristic of the microcanonical ensemble. In these cases, which are realized by the majority of realistic systems in nature, the microcanonical approach is the natural statistical description. We investigate this most fundamental form of thermodynamics in four different nontrivial physical cases: (I) Microcanonical phase transitions of first and second order are studied within the Potts model. The total energy per particle is a nonfluctuating order parameter which controls the phase the system is in. In contrast to the canonical form, the microcanonical ensemble allows one to tune the system continuously from one phase to the other through the region of coexisting phases by changing the energy smoothly. The configurations of coexisting phases carry important information about the nature of the phase transition. This is all the more remarkable as the canonical ensemble is blind against these
Directory of Open Access Journals (Sweden)
Svetlana V. Smirnova
2013-01-01
Full Text Available The features of using information technologies in applied statistics in psychology are considered in the article. Requirements for the statistical preparation of psychology students in the conditions of the information society are analyzed.
MacLean, Adam L.; Harrington, Heather A.; Stumpf, Michael P. H.; Byrne, Helen M.
2015-01-01
mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since
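A minimal example of the kind of ODE model meant here is a negatively autoregulated gene, dx/dt = β/(1 + (x/K)^n) − γx. The sketch below is illustrative (hypothetical parameters, simple Euler integration), not a model from the paper; it finds the steady-state expression level:

```python
def simulate_autorepressor(beta=10.0, K=1.0, n=2, gamma=1.0,
                           x0=0.0, dt=0.001, t_end=20.0):
    """Euler integration of dx/dt = beta/(1 + (x/K)**n) - gamma*x,
    a gene whose product represses its own production (Hill repression)."""
    x = x0
    for _ in range(int(t_end / dt)):
        dx = beta / (1.0 + (x / K) ** n) - gamma * x
        x += dt * dx
    return x

x_ss = simulate_autorepressor()
# At steady state, production balances degradation: x*(1 + x**2) = 10, so x = 2.
residual = 10.0 / (1.0 + x_ss ** 2) - x_ss
print(x_ss, residual)
```

Steady states like this are the testable predictions such ODE models generate: perturbing β or γ shifts the fixed point in a way that can be checked experimentally.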
Digital Repository Service at National Institute of Oceanography (India)
Srinivas, K.; Revichandran, C.; DineshKumar, P.K.
Three different statistical forecasting techniques - autoregressive, sinusoidal and exponentially weighted moving average (EWMA) were used to forecast monthly values of meteorological and oceanographic (met-ocean) parameters viz. sea surface...
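Of the three techniques, EWMA is the simplest to sketch: each one-step-ahead forecast blends the newest observation with the previous forecast, so the weights on past data decay exponentially. The series and smoothing constant below are illustrative, not the study's data:

```python
def ewma_forecast(series, alpha=0.3):
    """One-step-ahead EWMA forecast: blend the newest observation with
    the previous forecast; weights on older data decay exponentially."""
    forecast = series[0]               # initialise with the first observation
    for y in series[1:]:
        forecast = alpha * y + (1 - alpha) * forecast
    return forecast

sst = [28.1, 28.4, 28.9, 29.3, 29.1, 28.7]   # hypothetical monthly SST, deg C
f = ewma_forecast(sst)
print(round(f, 3))
```

Larger α tracks recent months more closely; smaller α smooths harder, which is the single tuning choice an EWMA forecaster has to make.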
Photon statistics in an N-level (N-1)-mode system
International Nuclear Information System (INIS)
Kozierowski, M.; Shumovskij, A.S.
1987-01-01
The characteristic and photon number distribution functions, the statistical moments of photon numbers and the correlations of modes are studied. The normally ordered variances of the photon numbers and the cross-correlation functions are calculated
AutoBayes: A System for Generating Data Analysis Programs from Statistical Models
Fischer, Bernd; Schumann, Johann
2003-01-01
Data analysis is an important scientific task which is required whenever information needs to be extracted from raw data. Statistical approaches to data analysis, which use methods from probability theory and numerical analysis, are well-founded but difficult to implement: the development of a statistical data analysis program for any given application is time-consuming and requires substantial knowledge and experience in several areas. In this paper, we describe AutoBayes, a program synthesis...
Random Matrix Theory of the Energy-Level Statistics of Disordered Systems at the Anderson Transition
Canali, C. M.
1995-01-01
We consider a family of random matrix ensembles (RME) invariant under similarity transformations and described by the probability density $P({\\bf H})= \\exp[-{\\rm Tr}V({\\bf H})]$. Dyson's mean field theory (MFT) of the corresponding plasma model of eigenvalues is generalized to the case of weak confining potential, $V(\\epsilon)\\sim {A\\over 2}\\ln ^2(\\epsilon)$. The eigenvalue statistics derived from MFT are shown to deviate substantially from the classical Wigner-Dyson statistics when $A
2010-07-01
... Administrator, Law Enforcement Assistance Administration; the Director, Bureau of Justice Statistics; or the... environmental coordinator shall be designated in the Bureau of Justice Statistics, the Law Enforcement.... 1451, et seq.; and other environmental review laws and executive orders. 7. Actions planned by private...
Hernández-Borges, A A; Macías-Cervi, P; Gaspar-Guardado, M A; Torres-Alvarez de Arcaya, M L; Ruiz-Rabaza, A; Jiménez-Sosa, A
1999-01-01
The Internet offers a great number of health-related websites, but concern has been raised about their reliability. Several subjective evaluation criteria and website rating systems have been proposed to help Internet users distinguish among web resources of differing quality, but their efficacy has not been proven. The aims were to evaluate the agreement among a subset of Internet rating systems' editorial boards regarding their evaluations of a sample of pediatric websites, and to assess certain website characteristics as possible quality indicators for pediatric websites. The design was a comparative survey of the results of systematic evaluations of the contents and formal aspects of a sample of pediatric websites, together with the number of daily visits to those websites, the time since their last update, the impact factor of their authors or editors, and the number of websites linked to them. 363 websites were compiled from eight rating systems. Only 25 were indexed and evaluated by at least two rating systems. This subset included more frequently updated and more heavily linked websites. There was no correlation among the results of the evaluation of these 25 websites by the rating systems. The number of inbound links to the websites correlated significantly with their updating frequency, suggesting both could serve as quality indicators. On the other hand, citation analysis on the Web, through the quantification of inbound links to medical websites, could be an objective and feasible tool for rating large numbers of websites.
Assessment of Literature Related to Combustion Appliance Venting Systems
Energy Technology Data Exchange (ETDEWEB)
Rapp, Vi H. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Singer, Brett C. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Stratton, Chris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wray, Craig P. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)
2012-06-01
In many residential building retrofit programs, air tightening to increase energy efficiency is constrained by concerns about related impacts on the safety of naturally vented combustion appliances. Tighter housing units more readily depressurize when exhaust equipment is operated, making combustion appliances more prone to backdrafting or spillage. Several test methods purportedly assess the potential for depressurization-induced backdrafting and spillage, but these tests are not robustly reliable and repeatable predictors of venting performance, in part because they do not fully capture weather effects on venting performance. The purpose of this literature review is to investigate combustion safety diagnostics in existing codes, standards, and guidelines related to combustion appliances. The review summarizes existing combustion safety test methods and evaluations of those test methods, and also discusses research related to wind effects and the simulation of vent system performance. Current codes and standards related to combustion appliance installation provide little information on assessing backdrafting or spillage potential. A substantial amount of research has been conducted to assess combustion appliance backdrafting and spillage test methods, but it focuses primarily on comparing short-term (stress-induced) tests with monitoring results. Monitoring, typically performed over one week, indicated that the combinations of environmental and house-operation characteristics most conducive to combustion spillage were rare. Research has, to an extent, assessed existing combustion safety diagnostics for house depressurization, but the objectives of the diagnostics, both stress tests and monitoring, are not clearly defined. More research is also needed to quantify the frequency of test “failure” occurrence throughout the building stock and to assess the statistical effects of weather (especially wind) on house depressurization and, in turn, on combustion appliance venting
Koorehdavoudi, Hana; Bogdan, Paul
2016-06-01
Biological systems are frequently categorized as complex systems due to their capability of generating spatio-temporal structures from apparently random decisions. Despite extensive research on analyzing biological systems, we lack a quantifiable framework for measuring their complexity. To fill this gap, in this paper we develop a new paradigm to study a collective group of N agents moving and interacting in a three-dimensional space. Our paradigm helps to identify the spatio-temporal states of the motion of the group and their associated transition probabilities. This framework enables the estimation of the free-energy landscape corresponding to the identified states. Based on the energy landscape, we quantify missing information, emergence, self-organization and complexity for a collective motion. We show that the collective motion of the group of agents evolves to reach the most probable state, with a relatively low energy level and less missing information compared to other possible states. Our analysis demonstrates that natural groups of animals exhibit a higher degree of emergence, self-organization and complexity over time. Consequently, this algorithm can be integrated into new frameworks to engineer collective motions that achieve given degrees of emergence, self-organization and complexity.
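The "missing information" of a set of identified states is the Shannon entropy of their occupancy probabilities. A minimal sketch with hypothetical state sequences (not the paper's estimator, which works on the free-energy landscape):

```python
import math
from collections import Counter

def shannon_entropy(states):
    """Missing information (bits) of an observed sequence of discrete states."""
    counts = Counter(states)
    n = len(states)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A group hopping uniformly between four motion states carries more missing
# information than one that has settled into a single dominant state.
disordered = ['A', 'B', 'C', 'D'] * 25
ordered = ['A'] * 90 + ['B'] * 10
print(shannon_entropy(disordered), shannon_entropy(ordered))
```

Lower entropy here corresponds to the paper's "most probable state with lowest missing information": the collective has organized into a dominant pattern.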
Mbondji, Peter Ebongue; Kebede, Derege; Soumbey-Alley, Edoh William; Zielinski, Chris; Kouvividila, Wenceslas; Lusamba-Dikassa, Paul-Samson
2014-05-01
To identify key data sources of health information and describe their availability in countries of the World Health Organization (WHO) African Region. An analytical review on the availability and quality of health information data sources in countries; from experience, observations, literature and contributions from countries. Forty-six Member States of the WHO African Region. No participants. The state of data sources, including censuses, surveys, vital registration and health care facility-based sources. In almost all countries of the Region, there is a heavy reliance on household surveys for most indicators, with more than 121 household surveys having been conducted in the Region since 2000. Few countries have civil registration systems that permit adequate and regular tracking of mortality and causes of death. Demographic surveillance sites function in several countries, but the data generated are not integrated into the national health information system because of concerns about representativeness. Health management information systems generate considerable data, but the information is rarely used because of concerns about bias, quality and timeliness. To date, 43 countries in the Region have initiated Integrated Disease Surveillance and Response. A multitude of data sources are used to track progress towards health-related goals in the Region, with heavy reliance on household surveys for most indicators. Countries need to develop comprehensive national plans for health information that address the full range of data needs and data sources and that include provision for building national capacities for data generation, analysis, dissemination and use. © The Royal Society of Medicine.
Tang, Hui-Yi; Wang, Jian-Hui; Ma, Yong-Li
2014-06-01
For a small system at a low temperature, thermal fluctuations and quantum effects play important roles in quantum thermodynamics. Starting from the micro-canonical ensemble, we generalize the Boltzmann-Gibbs statistical factor from infinite to finite systems, whether or not the interactions between particles are considered. This generalized factor, similar to Tsallis's q-form as a power-law distribution, carries the restriction of a finite energy spectrum and includes the nonextensivities of small systems. We derive the exact expression for the distribution of average particle numbers in interacting classical and quantum nonextensive systems within a generalized canonical ensemble. This expression for almost independent or elementary-excitation quantum finite systems is similar to the corresponding one obtained from the conventional grand-canonical ensemble. In reconstructing the statistical theory of small systems, we present the entropy of the equilibrium systems and the equation of total thermal energy. When we investigate the thermodynamics of interacting nonextensive systems, we obtain the system-bath heat exchange and the "uncompensated heat", which are at the thermodynamical level and independent of the details of the system-bath coupling. For ideal finite systems with different traps and boundary conditions, we calculate some thermodynamic quantities, such as the specific heat, entropy, and equation of state. In particular, at low temperatures for small systems, we predict some novel behaviors in quantum thermodynamics, including internal entropy production, heat exchanges between the system and its surroundings, and finite-size effects on the free energy.
Directory of Open Access Journals (Sweden)
Ye Yazoume
2012-09-01
Full Text Available Abstract Background In the developed world, information on vital events is routinely collected nationally to inform population and health policies. However, in many low- and middle-income countries, especially those in sub-Saharan Africa (SSA), there is a lack of an effective and comprehensive national civil registration and vital statistics system. In the past decades, the number of Health and Demographic Surveillance Systems (HDSSs) has increased throughout SSA. An HDSS monitors births, deaths, causes of death, migration, and other health and socio-economic indicators within a defined population over time. Currently, the International Network for the Continuous Demographic Evaluation of Populations and Their Health (INDEPTH) brings together 38 member research centers which run 44 HDSS sites from 20 countries in Africa, Asia and Oceania. Thirty-two of these HDSS sites are in SSA. Discussion This paper argues that, in the absence of an adequate national CRVS, HDSSs should be more effectively utilised to generate relevant public health data, and also to create local capacity for longitudinal data collection and management systems in SSA. If HDSSs are strategically located to cover different geographical regions in a country, data from these sites could be used to provide a more complete national picture of the health of the population. They provide useful data that can be extrapolated for national estimates if their regional coverage is well planned. HDSSs are, however, resource-intensive. Efforts are being put towards linking them to local or national policy contexts and reducing their dependence on external funding. Increasing their number in SSA to cover a critical proportion of the population, especially urban populations, must be carefully planned. Strategic planning is needed at national levels to geographically locate HDSS sites and to support these through national funding mechanisms. Summary The paper does not suggest that HDSSs should be
Ye, Yazoume; Wamukoya, Marilyn; Ezeh, Alex; Emina, Jacques B O; Sankoh, Osman
2012-09-05
In the developed world, information on vital events is routinely collected nationally to inform population and health policies. However, in many low- and middle-income countries, especially those in sub-Saharan Africa (SSA), there is a lack of an effective and comprehensive national civil registration and vital statistics system. In the past decades, the number of Health and Demographic Surveillance Systems (HDSSs) has increased throughout SSA. An HDSS monitors births, deaths, causes of death, migration, and other health and socio-economic indicators within a defined population over time. Currently, the International Network for the Continuous Demographic Evaluation of Populations and Their Health (INDEPTH) brings together 38 member research centers which run 44 HDSS sites from 20 countries in Africa, Asia and Oceania. Thirty-two of these HDSS sites are in SSA. This paper argues that, in the absence of an adequate national CRVS, HDSSs should be more effectively utilised to generate relevant public health data, and also to create local capacity for longitudinal data collection and management systems in SSA. If HDSSs are strategically located to cover different geographical regions in a country, data from these sites could be used to provide a more complete national picture of the health of the population. They provide useful data that can be extrapolated for national estimates if their regional coverage is well planned. HDSSs are, however, resource-intensive. Efforts are being put towards linking them to local or national policy contexts and reducing their dependence on external funding. Increasing their number in SSA to cover a critical proportion of the population, especially urban populations, must be carefully planned. Strategic planning is needed at national levels to geographically locate HDSS sites and to support these through national funding mechanisms. The paper does not suggest that HDSSs should be seen as a replacement for civil registration systems
Safety-related control air systems - approved 1977
International Nuclear Information System (INIS)
Anon.
1978-01-01
This standard applies to those portions of the control air system that furnish air required to support, control, or operate systems or portions of systems that are safety related in nuclear power plants. This standard relates only to the air supply system(s) for safety-related air operated devices and does not apply to the safety-related air operated device or to air operated actuators for such devices. The objectives of this standard are to provide (1) minimum system design requirements for equipment, piping, instruments, controls, and wiring that constitute the air supply system; and (2) the system and component testing and maintenance requirements
DEFF Research Database (Denmark)
Schaarup-Jensen, Kjeld; Rasmussen, Michael R.; Thorndahl, Søren
2008-01-01
In urban drainage modeling, long-term extreme statistics have become an important basis for decision-making, e.g. in connection with renovation projects. It is therefore of great importance to minimize the uncertainties in long-term predictions of maximum water levels and combined sewer...... overflow (CSO) in drainage systems. These uncertainties originate from large uncertainties regarding rainfall inputs, parameters, and assessment of return periods. This paper investigates how the choice of rainfall time series influences the extreme event statistics of maximum water levels in manholes and CSO...... gauges are located at a distance of at most 20 kilometers from the catchment. All gauges are included in the Danish national rain gauge system, which was launched in 1976. The paper describes to what extent the extreme event statistics based on these 9 series diverge from each other and how this diversity......
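The return-period notion underlying abstracts such as the one above can be sketched briefly. This is an illustrative toy, not the authors' simulation setup: it estimates empirical return periods from a (hypothetical) series of simulated annual-maximum water levels using the Weibull plotting position T = (n+1)/rank.

```python
# Illustrative sketch (not the authors' drainage model): empirical return
# periods for annual-maximum water levels from a long-term simulation.
def empirical_return_periods(annual_maxima):
    """Return (level, T) pairs using the Weibull plotting position T = (n+1)/rank."""
    n = len(annual_maxima)
    ranked = sorted(annual_maxima, reverse=True)  # rank 1 = largest event
    return [(level, (n + 1) / rank) for rank, level in enumerate(ranked, start=1)]

# Hypothetical 10-year series of maximum water levels (metres) in one manhole:
maxima = [1.2, 0.8, 1.5, 0.9, 1.1, 2.0, 0.7, 1.3, 1.0, 1.6]
for level, T in empirical_return_periods(maxima)[:3]:
    print(f"level {level:.1f} m  ~ return period {T:.1f} yr")
```

Longer simulated series give more reliable estimates of the large return periods that design criteria reference, which is why the choice of rainfall input series matters.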
DEFF Research Database (Denmark)
Schaarup-Jensen, Kjeld; Rasmussen, Michael R.; Thorndahl, Søren
2009-01-01
In urban drainage modelling, long-term extreme statistics have become an important basis for decision-making, e.g. in connection with renovation projects. It is therefore of great importance to minimize the uncertainties in long-term predictions of maximum water levels and combined sewer...... overflow (CSO) in drainage systems. These uncertainties originate from large uncertainties regarding rainfall inputs, parameters, and assessment of return periods. This paper investigates how the choice of rainfall time series influences the extreme event statistics of maximum water levels in manholes and CSO...... gauges are located at a distance of at most 20 kilometers from the catchment. All gauges are included in the Danish national rain gauge system, which was launched in 1976. The paper describes to what extent the extreme event statistics based on these 9 series diverge from each other and how this diversity......
Energy Technology Data Exchange (ETDEWEB)
Wurtz, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kaplan, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2015-10-28
Pulse shape discrimination (PSD) is a type of statistical classifier, and fully realized statistical classifiers rely on a comprehensive set of tools for their design, construction, and implementation. Advances in PSD depend on improvements to the implemented algorithm and can draw on conventional statistical-classifier and machine-learning methods. This paper provides the reader with a glossary of classifier-building elements and their functions within a fully designed and operational classifier framework, which can be used to discover opportunities for improving PSD classifier projects. This paper recommends reporting the PSD classifier's receiver operating characteristic (ROC) curve and its behavior at a gamma rejection rate (GRR) relevant to realistic applications.
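The ROC/GRR reporting recommended above can be sketched with a toy discriminant. The score distributions below are invented stand-ins for real detector pulses, not data from the paper:

```python
# Hedged sketch: evaluating a PSD discriminant score at a fixed gamma
# rejection rate (GRR). Scores are synthetic, e.g. a tail/total charge ratio.
import random

random.seed(0)
# Synthetic discriminant scores: gammas cluster low, neutrons high.
gammas = [random.gauss(0.20, 0.05) for _ in range(5000)]
neutrons = [random.gauss(0.35, 0.05) for _ in range(5000)]

def roc_point(threshold):
    """Classify score > threshold as 'neutron'; return (gamma rejection, neutron acceptance)."""
    g_rej = sum(s <= threshold for s in gammas) / len(gammas)
    n_acc = sum(s > threshold for s in neutrons) / len(neutrons)
    return g_rej, n_acc

# Sweep thresholds and report neutron acceptance at >= 99% gamma rejection.
for t in [round(0.05 * k, 2) for k in range(2, 10)]:
    g_rej, n_acc = roc_point(t)
    if g_rej >= 0.99:
        print(f"threshold {t:.2f}: gamma rejection {g_rej:.3f}, neutron acceptance {n_acc:.3f}")
        break
```

Sweeping the threshold over its full range traces the ROC curve; quoting the operating point at an application-relevant GRR summarises classifier performance in one number.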
Amorisco, N. C.; Monachesi, A.; Agnello, A.; White, S. D. M.
2018-04-01
We use data from the HST Coma Cluster Treasury program to assess the richness of the globular cluster systems (GCSs) of 54 Coma ultra-diffuse galaxies (UDGs), 18 of which have a half-light radius exceeding 1.5 kpc. We use a hierarchical Bayesian method, tested on a large number of mock data sets, to account consistently for the high and spatially varying background counts in Coma. These include both background galaxies and intra-cluster globular clusters (ICGCs), which are disentangled from the population of member globular clusters (GCs) in a probabilistic fashion. We find no candidate for a GCS as rich as that of the Milky Way; our sample has GCSs typical of dwarf galaxies. For the standard relation between GCS richness and halo mass, 33 galaxies have a virial mass Mvir ≤ 1011 M⊙ at 90 per cent probability. Only three have Mvir > 1011 M⊙ with the same confidence. The mean colour and spread in colour of the UDG GCs are indistinguishable from those of the abundant population of ICGCs. The majority of UDGs in our sample are consistent with the relation between stellar mass and GC richness of `normal' dwarf galaxies. Nine systems, however, display GCSs that are richer by a factor of 3 or more (at 90 per cent probability). Six of these have sizes ≲1.4 kpc. Our results imply that the physical mechanisms responsible for the extended size of the UDGs and for the enhanced GC richness of some cluster dwarfs are at most weakly correlated.
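The probabilistic disentangling of member GCs from background counts can be illustrated with a deliberately simplified two-component mixture. The densities below are hypothetical and this is not the paper's hierarchical model:

```python
# Toy sketch (assumed numbers, not the published model): posterior probability
# that a detected GC candidate belongs to the galaxy rather than to a
# spatially uniform background of ICGCs and faint galaxies.
def membership_prob(member_density, background_density):
    """P(member | detection) when both densities are in candidates per arcmin^2."""
    return member_density / (member_density + background_density)

# Hypothetical local surface densities at two projected radii from a UDG centre:
print(membership_prob(2.0, 0.5))  # near the galaxy, membership is likely
print(membership_prob(0.2, 0.5))  # far out, the background dominates
```

A hierarchical treatment additionally propagates the uncertainty in both densities, which matters when, as here, the background is high and spatially varying.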
Pivato, Marcus
2013-01-01
We show that, in a sufficiently large population satisfying certain statistical regularities, it is often possible to accurately estimate the utilitarian social welfare function, even if we only have very noisy data about individual utility functions and interpersonal utility comparisons. In particular, we show that it is often possible to identify an optimal or close-to-optimal utilitarian social choice using voting rules such as the Borda rule, approval voting, relative utilitarianism, or a...
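The claim above, that simple voting rules can recover a near-utilitarian choice from noisy individual data, can be illustrated with a small simulation. The utilities and noise level are invented for illustration:

```python
# Hedged illustration (invented utilities): with many voters, the Borda rule
# tends to select the utilitarian optimum even from noisy individual rankings.
import random

random.seed(1)
true_utility = {"A": 0.6, "B": 0.5, "C": 0.3}  # hypothetical common utilities
alternatives = list(true_utility)

def borda_winner(num_voters, noise=0.2):
    scores = {a: 0 for a in alternatives}
    for _ in range(num_voters):
        # Each voter ranks alternatives by a noisy private estimate of utility.
        perceived = {a: true_utility[a] + random.gauss(0, noise) for a in alternatives}
        ranking = sorted(alternatives, key=perceived.get, reverse=True)
        for points, a in zip(range(len(alternatives) - 1, -1, -1), ranking):
            scores[a] += points
    return max(scores, key=scores.get)

print(borda_winner(10_000))  # with many voters this tends to pick "A"
```

Individual rankings are wrong fairly often here, yet aggregation over a large electorate concentrates the Borda score on the alternative with the highest true average utility.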