WorldWideScience

Sample records for statistical multisource-multitarget information

  1. Advances in statistical multisource-multitarget information fusion

    CERN Document Server

    Mahler, Ronald PS

    2014-01-01

    This is the sequel to the 2007 Artech House bestselling title, Statistical Multisource-Multitarget Information Fusion. That earlier book was a comprehensive resource for an in-depth understanding of finite-set statistics (FISST), a unified, systematic, and Bayesian approach to information fusion. The cardinalized probability hypothesis density (CPHD) filter, which was first systematically described in the earlier book, has since become a standard multitarget detection and tracking technique, especially in research and development. Since 2007, FISST has inspired a considerable amount of research

  2. Information theory and statistics

    CERN Document Server

    Kullback, Solomon

    1968-01-01

    Highly useful text studies logarithmic measures of information and their application to testing statistical hypotheses. Includes numerous worked examples and problems. References. Glossary. Appendix. 1968 2nd, revised edition.
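
The "logarithmic measures of information" Kullback studies are what is now called the Kullback-Leibler divergence. A minimal sketch for two discrete distributions (the function name and example values are ours, not from the book):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]   # hypothesis under test
q = [0.9, 0.1]   # reference hypothesis
print(round(kl_divergence(p, q), 4))  # → 0.5108
```

Note that D(p || q) is asymmetric and equals zero only when the two distributions coincide, which is what makes it usable as a measure for testing statistical hypotheses.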

  3. Information in statistical physics

    OpenAIRE

    Balian, R.

    2005-01-01

    We review with a tutorial scope the information theory foundations of quantum statistical physics. Only a small proportion of the variables that characterize a system at the microscopic scale can be controlled, for both practical and theoretical reasons, and a probabilistic description involving the observers is required. The criterion of maximum von Neumann entropy is then used for making reasonable inferences. It means that no spurious information is introduced besides the known data. Its o...

  4. Statistical Information:. a Bayesian Perspective

    Science.gov (United States)

    Stern, R. B.; Pereira, C. A. De B.

    2012-12-01

    We explore the concept of information in statistics: information about unknown quantities of interest, the parameters. We discuss intuitive ideas of what information in statistics should be. Our approach to information is divided into two scenarios: observed data and the planning of an experiment. In the first scenario, we discuss the Sufficiency Principle, the Conditionality Principle, the Likelihood Principle and their relationship with trivial experiments. We also provide applications of some measures of information to an intuitive example. In the second scenario, the definition and new applications of Blackwell Sufficiency are presented. We discuss a new relationship between Blackwell Equivalence and the Likelihood Principle. Finally, the expected values of some measures of information are calculated.

  5. Textual information access statistical models

    CERN Document Server

    Gaussier, Eric

    2013-01-01

    This book presents statistical models that have recently been developed within several research communities to access information contained in text collections. The problems considered are linked to applications aiming at facilitating information access: information extraction and retrieval; text classification and clustering; opinion mining; comprehension aids (automatic summarization, machine translation, visualization). In order to give the reader as complete a description as possible, the focus is placed on the probability models used in the applications.

  6. Information geometry and sufficient statistics

    Czech Academy of Sciences Publication Activity Database

    Ay, N.; Jost, J.; Le, Hong-Van; Schwachhöfer, L.

    2015-01-01

    Vol. 162, No. 1-2 (2015), pp. 327-364 ISSN 0178-8051 Institutional support: RVO:67985840 Keywords: Fisher quadratic form * Amari-Chentsov tensor * sufficient statistic Subject RIV: BA - General Mathematics Impact factor: 2.204, year: 2015 http://link.springer.com/article/10.1007/s00440-014-0574-8

  7. Statistical Information: A Bayesian Perspective

    Directory of Open Access Journals (Sweden)

    Carlos A. de B. Pereira

    2012-11-01

    Full Text Available We explore the meaning of information about quantities of interest. Our approach is divided in two scenarios: the analysis of observations and the planning of an experiment. First, we review the Sufficiency, Conditionality and Likelihood principles and how they relate to trivial experiments. Next, we review Blackwell Sufficiency and show that sampling without replacement is Blackwell Sufficient for sampling with replacement. Finally, we unify the two scenarios presenting an extension of the relationship between Blackwell Equivalence and the Likelihood Principle.

  8. Time Series, Statistics, and Information

    Science.gov (United States)

    1990-06-01

    the sample spectral density. Values of delta are used to describe music and how the brain works! U. S. News and World Report, June 11, 1990, p. 62...writes: "Surprisingly, the same mathematical formula that characterizes the ebb and flow of music has been discovered to exist widely in nature, from the... Ishiguro, M., and Kitagawa, G. (1983). Akaike Information Criterion Statistics, D. Reidel: Boston. Smith, C. R. and Erickson, G. J. (1987). Maximum

  9. Statistical Symbolic Execution with Informed Sampling

    Science.gov (United States)

    Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco

    2014-01-01

    Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis, and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
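
The estimation step described above can be illustrated with a toy stand-in: random sampling of program inputs (in place of symbolic path sampling) combined with a Beta-Bernoulli posterior for the probability of reaching the target event. The example program and all names are hypothetical; Symbolic PathFinder's actual machinery is far more involved.

```python
import random

def run_program(x):
    # Toy stand-in for one program execution: the "target event"
    # (e.g., an assert violation) fires when this branch is taken.
    return x * x > 50

def bayesian_estimate(samples, prior=(1, 1)):
    """Posterior mean of a Beta-Bernoulli model for the hit probability."""
    a, b = prior
    return (a + sum(samples)) / (a + b + len(samples))

random.seed(0)
samples = [run_program(random.uniform(0, 10)) for _ in range(10_000)]
print(round(bayesian_estimate(samples), 2))  # close to the true value (10 - 50**0.5) / 10 ≈ 0.29
```

The Beta prior also yields credible intervals, which is what supports hypothesis testing about the reachability probability in the approach described above.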

  10. A quantum information approach to statistical mechanics

    International Nuclear Information System (INIS)

    Cuevas, G.

    2011-01-01

    The field of quantum information and computation harnesses and exploits the properties of quantum mechanics to perform tasks more efficiently than their classical counterparts, or that may uniquely be possible in the quantum world. Its findings and techniques have been applied to a number of fields, such as the study of entanglement in strongly correlated systems, new simulation techniques for many-body physics or, generally, to quantum optics. This thesis aims at broadening the scope of quantum information theory by applying it to problems in statistical mechanics. We focus on classical spin models, which are toy models used in a variety of systems, ranging from magnetism, neural networks, to quantum gravity. We tackle these models using quantum information tools from three different angles. First, we show how the partition function of a class of widely different classical spin models (models in different dimensions, different types of many-body interactions, different symmetries, etc) can be mapped to the partition function of a single model. We prove this by first establishing a relation between partition functions and quantum states, and then transforming the corresponding quantum states to each other. Second, we give efficient quantum algorithms to estimate the partition function of various classical spin models, such as the Ising or the Potts model. The proof is based on a relation between partition functions and quantum circuits, which allows us to determine the quantum computational complexity of the partition function by studying the corresponding quantum circuit. Finally, we outline the possibility of applying quantum information concepts and tools to certain models of discrete quantum gravity. The latter provide a natural route to generalize our results, insofar as the central quantity has the form of a partition function, and as classical spin models are used as toy models of matter. (author)
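
As a point of comparison for the quantum algorithms mentioned above, the partition function of a small classical spin model can be computed classically by brute-force enumeration. A sketch for an open 1D Ising chain, checked against the known closed form Z = 2^n cosh(βJ)^(n-1) (the function name is ours):

```python
import math
from itertools import product

def ising_partition_function(n, beta, J=1.0):
    """Brute-force Z = sum over spin configurations of exp(-beta * H),
    with H(s) = -J * sum_i s_i * s_{i+1} for an open 1D chain."""
    Z = 0.0
    for spins in product([-1, 1], repeat=n):
        energy = -J * sum(spins[i] * spins[i + 1] for i in range(n - 1))
        Z += math.exp(-beta * energy)
    return Z

# Check against the exact open-chain result Z = 2^n * cosh(beta*J)^(n-1)
n, beta = 8, 0.5
exact = 2**n * math.cosh(beta) ** (n - 1)
print(abs(ising_partition_function(n, beta) - exact) < 1e-8)  # → True
```

The 2^n cost of this enumeration is exactly what motivates both transfer-matrix methods and the quantum estimation algorithms the thesis develops.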

  11. Medicaid Statistical Information System (MSIS) Tables

    Data.gov (United States)

    U.S. Department of Health & Human Services — The 25 MSIS statistical tables contain national state-by-state data. These tables contain high-level aggregated statistics relating to Medicaid eligibility and...

  12. Annual statistical information 1996; Informe estatistico anual 1996

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    This annual statistical report aims to disseminate information about the evolution of the generation, transmission and distribution systems and about the electric power market of the Parana State, Brazil, in 1996. Electric power consumption in the distribution area of the Parana Power Company (COPEL) grew by about 6.7%. Electric power production at the COPEL plants was 42.2% higher than in 1995, owing to the inflows observed in the Iguacu river and to the long period of reduced inflows that the reservoirs of the Southern region went through during the year. This report presents statistical data on the following topics: a) electric power balance of the Parana State; b) electric power balance of COPEL - own generation, interchange, electric power requirements, direct distribution and the electric system. 6 graphs, 3 maps, 61 tabs.; e-mail: splcnmr at mail.copel.br

  13. Information transport in classical statistical systems

    Science.gov (United States)

    Wetterich, C.

    2018-02-01

    For "static memory materials" the bulk properties depend on boundary conditions. Such materials can be realized by classical statistical systems which admit no unique equilibrium state. We describe the propagation of information from the boundary to the bulk by classical wave functions. The dependence of wave functions on the location of hypersurfaces in the bulk is governed by a linear evolution equation that can be viewed as a generalized Schrödinger equation. Classical wave functions obey the superposition principle, with local probabilities realized as bilinears of wave functions. For static memory materials the evolution within a subsector is unitary, as characteristic for the time evolution in quantum mechanics. The space-dependence in static memory materials can be used as an analogue representation of the time evolution in quantum mechanics - such materials are "quantum simulators". For example, an asymmetric Ising model on a Euclidean two-dimensional lattice represents the time evolution of free relativistic fermions in two-dimensional Minkowski space.

  14. Information transport in classical statistical systems

    Directory of Open Access Journals (Sweden)

    C. Wetterich

    2018-02-01

    Full Text Available For “static memory materials” the bulk properties depend on boundary conditions. Such materials can be realized by classical statistical systems which admit no unique equilibrium state. We describe the propagation of information from the boundary to the bulk by classical wave functions. The dependence of wave functions on the location of hypersurfaces in the bulk is governed by a linear evolution equation that can be viewed as a generalized Schrödinger equation. Classical wave functions obey the superposition principle, with local probabilities realized as bilinears of wave functions. For static memory materials the evolution within a subsector is unitary, as characteristic for the time evolution in quantum mechanics. The space-dependence in static memory materials can be used as an analogue representation of the time evolution in quantum mechanics – such materials are “quantum simulators”. For example, an asymmetric Ising model on a Euclidean two-dimensional lattice represents the time evolution of free relativistic fermions in two-dimensional Minkowski space.

  15. Statistical decisions under nonparametric a priori information

    International Nuclear Information System (INIS)

    Chilingaryan, A.A.

    1985-01-01

    The basic module of an applied program package for statistical analysis of the ANI experiment data is described. This module carries out the tasks of choosing the theoretical model that most adequately fits the experimental data, selecting events of a definite type, and identifying elementary particles. To solve these problems, Bayesian rules, the leave-one-out test and KNN (K Nearest Neighbour) adaptive density estimation are utilized.

  16. Safety Management Information Statistics (SAMIS) - 1995 Annual Report

    Science.gov (United States)

    1997-04-01

    The Safety Management Information Statistics 1995 Annual Report is a compilation and analysis of transit accident, casualty and crime statistics reported under the Federal Transit Administration's National Transit Database Reporting by transit system...

  17. Safety Management Information Statistics (SAMIS) - 1993 Annual Report

    Science.gov (United States)

    1995-05-01

    The 1993 Safety Management Information Statistics (SAMIS) report, now in its fourth year of publication, is a compilation and analysis of transit accident and casualty statistics uniformly collected from approximately 400 transit agencies throughout ...

  18. Safety Management Information Statistics (SAMIS) - 1991 Annual Report

    Science.gov (United States)

    1993-02-01

    The Safety Management Information Statistics 1991 Annual Report is a compilation and analysis of mass transit accident and casualty statistics reported by transit systems in the United States during 1991, under FTA's Section 15 reporting system.

  19. Safety Management Information Statistics (SAMIS) - 1994 Annual Report

    Science.gov (United States)

    1996-07-01

    The Safety Management Information Statistics 1994 Annual Report is a compilation and analysis of mass transit accident and casualty statistics reported by transit systems in the United States during 1994, reported under the Federal Transit Administra...

  20. PRIS-STATISTICS: Power Reactor Information System Statistical Reports. User's Manual

    International Nuclear Information System (INIS)

    2013-01-01

    The IAEA developed the Power Reactor Information System (PRIS)-Statistics application to assist PRIS end users with generating statistical reports from PRIS data. Statistical reports provide an overview of the status, specification and performance results of every nuclear power reactor in the world. This user's manual was prepared to facilitate the use of the PRIS-Statistics application and to provide guidelines and detailed information for each report in the application. Statistical reports support analyses of nuclear power development and strategies, and the evaluation of nuclear power plant performance. The PRIS database can be used for comprehensive trend analyses and benchmarking against best performers and industrial standards.

  1. Temporal and Statistical Information in Causal Structure Learning

    Science.gov (United States)

    McCormack, Teresa; Frosch, Caren; Patrick, Fiona; Lagnado, David

    2015-01-01

    Three experiments examined children's and adults' abilities to use statistical and temporal information to distinguish between common cause and causal chain structures. In Experiment 1, participants were provided with conditional probability information and/or temporal information and asked to infer the causal structure of a 3-variable mechanical…

  2. Information Geometry, Inference Methods and Chaotic Energy Levels Statistics

    OpenAIRE

    Cafaro, Carlo

    2008-01-01

    In this Letter, we propose a novel information-geometric characterization of chaotic (integrable) energy level statistics of a quantum antiferromagnetic Ising spin chain in a tilted (transverse) external magnetic field. Finally, we conjecture our results might find some potential physical applications in quantum energy level statistics.

  3. Elementary statistics for effective library and information service management

    CERN Document Server

    Egghe, Leo

    2001-01-01

    This title describes how best to use statistical data to produce professional reports on library activities. The authors cover data gathering, sampling, graphical representation of data and summary statistics from data, and also include a section on trend analysis. A full bibliography and a subject index make this a key title for any information professional.

  4. Information Geometric Complexity of a Trivariate Gaussian Statistical Model

    Directory of Open Access Journals (Sweden)

    Domenico Felice

    2014-05-01

    Full Text Available We evaluate the information geometric complexity of entropic motion on low-dimensional Gaussian statistical manifolds in order to quantify how difficult it is to make macroscopic predictions about systems in the presence of limited information. Specifically, we observe that the complexity of such entropic inferences not only depends on the amount of available pieces of information but also on the manner in which such pieces are correlated. Finally, we uncover that, for certain correlational structures, the impossibility of reaching the most favorable configuration from an entropic inference viewpoint seems to lead to an information geometric analog of the well-known frustration effect that occurs in statistical physics.

  5. Prototyping a Distributed Information Retrieval System That Uses Statistical Ranking.

    Science.gov (United States)

    Harman, Donna; And Others

    1991-01-01

    Built using a distributed architecture, this prototype distributed information retrieval system uses statistical ranking techniques to provide better service to the end user. Distributed architecture was shown to be a feasible alternative to centralized or CD-ROM information retrieval, and user testing of the ranking methodology showed both…

  6. Information support of real estate market statistical research

    OpenAIRE

    Shibirina, S.

    2010-01-01

    The article considers the significance of information in statistical research on the real estate market and suggests directions for its use. Special attention is paid to the interconnections between information support and the participants of the real estate market, and to the stages of market analysis that characterize the market.

  7. Concepts and recent advances in generalized information measures and statistics

    CERN Document Server

    Kowalski, Andres M

    2013-01-01

    Since the introduction of the information measure widely known as Shannon entropy, quantifiers based on information theory and concepts such as entropic forms and statistical complexities have proven to be useful in diverse scientific research fields. This book contains introductory tutorials suitable for the general reader, together with chapters dedicated to the basic concepts of the most frequently employed information measures or quantifiers and their recent applications to different areas, including physics, biology, medicine, economics, communication and social sciences. As these quantif

  8. Evaluation Statistics Computed for the Wave Information Studies (WIS)

    Science.gov (United States)

    2016-07-01

    defining it as the standard deviation of the errors (i.e., demeaned RMSE) divided by the mean of the observations (Mentaschi et al. 2013), as done by...SI. To overcome these challenges, the wave model community should make strides in standardizing statistical metrics to advance the objective...ERDC/CHL CHETN-I-91 July 2016 Approved for public release; distribution is unlimited. Evaluation Statistics Computed for the Wave Information

  9. Fisher information and statistical inference for phase-type distributions

    DEFF Research Database (Denmark)

    Bladt, Mogens; Esparza, Luz Judith R; Nielsen, Bo Friis

    2011-01-01

    This paper is concerned with statistical inference for both continuous and discrete phase-type distributions. We consider maximum likelihood estimation, where traditionally the expectation-maximization (EM) algorithm has been employed. Certain numerical aspects of this method are revised and we ... and Newton-Raphson approach. The inverse of the Fisher information matrix provides the variances and covariances of the estimated parameters.
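
For the simplest phase-type distribution, a single exponential phase with rate λ, the Fisher information is I(λ) = n/λ², and its inverse gives the asymptotic variance of the maximum likelihood estimate, as the abstract describes. A hedged sketch (the data values are invented):

```python
import math

def exp_mle_and_se(data):
    """MLE of an exponential rate and its standard error obtained from
    the inverse Fisher information, I(lam) = n / lam**2."""
    n = len(data)
    lam = n / sum(data)           # MLE: reciprocal of the sample mean
    se = lam / math.sqrt(n)       # sqrt of I(lam)^-1 = lam / sqrt(n)
    return lam, se

data = [0.5, 1.2, 0.3, 2.0, 0.8, 1.1]   # hypothetical inter-event times
lam, se = exp_mle_and_se(data)
print(round(lam, 3), round(se, 3))  # → 1.017 0.415
```

For general phase-type distributions the information matrix is multi-dimensional and must be inverted numerically, but the role it plays is the same.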

  10. Reasoning about Informal Statistical Inference: One Statistician's View

    Science.gov (United States)

    Rossman, Allan J.

    2008-01-01

    This paper identifies key concepts and issues associated with the reasoning of informal statistical inference. I focus on key ideas of inference that I think all students should learn, including at secondary level as well as tertiary. I argue that a fundamental component of inference is to go beyond the data at hand, and I propose that statistical…

  11. Statistical approach for selection of biologically informative genes.

    Science.gov (United States)

    Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N

    2018-05-20

    Selection of informative genes from high dimensional gene expression data has emerged as an important research area in genomics. Most gene selection techniques proposed so far are based on either a relevancy or a redundancy measure. Further, the performance of these techniques has been judged through post-selection classification accuracy computed with a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biological sufficiency criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique against 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g. subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes which are more biologically relevant. The proposed technique is also quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that under the multiple criteria decision-making setup, the proposed technique is best for informative gene selection over the available alternatives. Based on the proposed approach, an R package, BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR. This study will provide a practical guide to selecting statistical techniques for identifying informative genes.
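
The maximum-relevance-minimum-redundancy idea behind Boot-MRMR can be sketched with a generic greedy selector that uses absolute Pearson correlation for both relevance and redundancy. This illustrates the mRMR criterion only, not the BootMRMR package's algorithm, and the expression values are invented:

```python
def mrmr_select(X, y, n_select):
    """Greedy max-relevance-min-redundancy selection: relevance and
    redundancy are both measured by absolute Pearson correlation."""
    def corr(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
        va = sum((ai - ma) ** 2 for ai in a) ** 0.5
        vb = sum((bi - mb) ** 2 for bi in b) ** 0.5
        return abs(cov / (va * vb)) if va and vb else 0.0

    relevance = {g: corr(row, y) for g, row in enumerate(X)}
    selected = [max(relevance, key=relevance.get)]
    while len(selected) < n_select:
        rest = [g for g in range(len(X)) if g not in selected]
        best = max(rest, key=lambda g: relevance[g]
                   - sum(corr(X[g], X[s]) for s in selected) / len(selected))
        selected.append(best)
    return selected

# 3 hypothetical genes (rows) measured on 4 subjects, plus a trait y
X = [[1, 2, 3, 4], [3, 1, 4, 2], [5, 4, 3, 3]]
y = [0, 1, 1, 2]
print(mrmr_select(X, y, 2))  # → [0, 1]: gene 2 is relevant but redundant with gene 0
```

The redundancy penalty is what distinguishes mRMR from a pure relevance ranking: gene 2 correlates more strongly with the trait than gene 1 does, yet gene 1 is chosen because it adds non-redundant information.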

  12. Allele-sharing statistics using information on family history.

    Science.gov (United States)

    Callegaro, A; Meulenbelt, I; Kloppenburg, M; Slagboom, P E; Houwing-Duistermaat, J J

    2010-11-01

    When conducting genetic studies for complex traits, large samples are commonly required to detect new genetic factors. A possible strategy to decrease the sample size is to reduce heterogeneity using available information. In this paper we propose a new class of model-free linkage analysis statistics which takes into account the information given by ungenotyped affected relatives (a positive family history). This information is included in the scoring function of classical allele-sharing statistics. We studied pedigrees of affected sibling pairs with one ungenotyped affected relative. We show that, for common complex diseases with rare alleles, the proposed method increases the expected power to detect linkage. Allele-sharing methods were applied to the symptomatic osteoarthritis GARP study, where taking the family history into account considerably increased the evidence of linkage in the region of the DIO2 susceptibility locus. © 2010 The Authors Annals of Human Genetics © 2010 Blackwell Publishing Ltd/University College London.

  13. Informing Evidence Based Decisions: Usage Statistics for Online Journal Databases

    Directory of Open Access Journals (Sweden)

    Alexei Botchkarev

    2017-06-01

    Full Text Available Abstract Objective – The primary objective was to examine online journal database usage statistics for a provincial ministry of health in the context of evidence based decision-making. In addition, the study highlights implementation of the Journal Access Centre (JAC that is housed and powered by the Ontario Ministry of Health and Long-Term Care (MOHLTC to inform health systems policy-making. Methods – This was a prospective case study using descriptive analysis of the JAC usage statistics of journal articles from January 2009 to September 2013. Results – JAC enables ministry employees to access approximately 12,000 journals with full-text articles. JAC usage statistics for the 2011-2012 calendar years demonstrate a steady level of activity in terms of searches, with monthly averages of 5,129. In 2009-2013, a total of 4,759 journal titles were accessed including 1,675 journals with full-text. Usage statistics demonstrate that the actual consumption was over 12,790 full-text downloaded articles or approximately 2,700 articles annually. Conclusion – JAC’s steady level of activities, revealed by the study, reflects continuous demand for JAC services and products. It testifies that access to online journal databases has become part of routine government knowledge management processes. MOHLTC’s broad area of responsibilities with dynamically changing priorities translates into the diverse information needs of its employees and a large set of required journals. Usage statistics indicate that MOHLTC information needs cannot be mapped to a reasonably compact set of “core” journals with a subsequent subscription to those.

  14. Statistics of optimal information flow in ensembles of regulatory motifs

    Science.gov (United States)

    Crisanti, Andrea; De Martino, Andrea; Fiorentino, Jonathan

    2018-02-01

    Genetic regulatory circuits universally cope with different sources of noise that limit their ability to coordinate input and output signals. In many cases, optimal regulatory performance can be thought to correspond to configurations of variables and parameters that maximize the mutual information between inputs and outputs. Since the mid-2000s, such optima have been well characterized in several biologically relevant cases. Here we use methods of statistical field theory to calculate the statistics of the maximal mutual information (the "capacity") achievable by tuning the input variable only in an ensemble of regulatory motifs, such that a single controller regulates N targets. Assuming (i) sufficiently large N , (ii) quenched random kinetic parameters, and (iii) small noise affecting the input-output channels, we can accurately reproduce numerical simulations both for the mean capacity and for the whole distribution. Our results provide insight into the inherent variability in effectiveness occurring in regulatory systems with heterogeneous kinetic parameters.
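
The "capacity" discussed here, mutual information maximized over the input distribution, can be computed directly for a toy two-state channel by a grid search (a sketch under our own naming; the paper's field-theoretic ensemble calculation is far more general):

```python
import math

def mutual_information(p_in, channel):
    """I(X;Y) in bits for a binary input with P(X=1) = p_in and
    channel[x][y] = P(Y=y | X=x)."""
    px = [1 - p_in, p_in]
    py = [sum(px[x] * channel[x][y] for x in range(2)) for y in range(2)]
    bits = 0.0
    for x in range(2):
        for y in range(2):
            joint = px[x] * channel[x][y]
            if joint > 0:
                bits += joint * math.log2(joint / (px[x] * py[y]))
    return bits

# Binary symmetric channel with flip probability 0.1; its capacity is 1 - H(0.1)
bsc = [[0.9, 0.1], [0.1, 0.9]]
capacity = max(mutual_information(p / 1000, bsc) for p in range(1, 1000))
h = -(0.1 * math.log2(0.1) + 0.9 * math.log2(0.9))
print(abs(capacity - (1 - h)) < 1e-6)  # → True
```

For noisy regulatory channels the optimal input is generally not uniform, which is why tuning the input distribution matters in the ensemble calculation described above.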

  16. Nonparametric Estimation of Information-Based Measures of Statistical Dispersion

    Czech Academy of Sciences Publication Activity Database

    Košťál, Lubomír; Pokora, Ondřej

    2012-01-01

    Vol. 14, No. 7 (2012), pp. 1221-1233 ISSN 1099-4300 R&D Projects: GA ČR(CZ) GAP103/11/0282; GA ČR(CZ) GBP304/12/G069; GA ČR(CZ) GPP103/12/P558 Institutional support: RVO:67985823 Keywords: statistical dispersion * entropy * Fisher information * nonparametric density estimation * neuronal activity Subject RIV: FH - Neurology Impact factor: 1.347, year: 2012

  17. Statistical process control based chart for information systems security

    Science.gov (United States)

    Khan, Mansoor S.; Cui, Lirong

    2015-07-01

    Intrusion detection systems have a highly significant role in securing computer networks and information systems. To assure the reliability and quality of computer networks and information systems, it is highly desirable to develop techniques that detect intrusions into information systems. We put forward the concept of statistical process control (SPC) for intrusions into computer networks and information systems. In this article we propose an exponentially weighted moving average (EWMA) type quality monitoring scheme. Our proposed scheme has only one parameter, which differentiates it from past versions. We construct the control limits for the proposed scheme and investigate their effectiveness. We provide an industrial example for the sake of clarity for practitioners. We compare the proposed scheme with EWMA schemes and the p chart; finally, we provide some recommendations for future work.
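
A minimal sketch of the EWMA monitoring idea, with the standard time-varying control limits; the traffic numbers and the single smoothing parameter λ are invented for illustration:

```python
import math
import statistics

def ewma_chart(data, lam=0.2, L=3.0):
    """EWMA statistic z_i = lam*x_i + (1-lam)*z_{i-1}, flagged when it
    leaves mu0 +/- L*sigma*sqrt(lam/(2-lam)*(1-(1-lam)**(2*i)))."""
    mu0 = statistics.mean(data[:20])       # baseline from the in-control phase
    sigma = statistics.stdev(data[:20])
    z, alarms = mu0, []
    for i, x in enumerate(data, start=1):
        z = lam * x + (1 - lam) * z
        width = L * sigma * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
        if abs(z - mu0) > width:
            alarms.append(i)
    return alarms

# In-control traffic metric for 20 observations, then a sustained shift
data = [10, 11, 9, 10, 10, 12, 9, 11, 10, 10,
        9, 11, 10, 12, 10, 9, 11, 10, 10, 11] + [14] * 10
print(ewma_chart(data))  # alarms begin shortly after the shift at observation 21
```

Because the EWMA accumulates evidence across observations, it reacts to small sustained shifts, such as a low-rate intrusion, faster than a chart that looks at each observation in isolation.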

  18. A statistical mechanical interpretation of algorithmic information theory: Total statistical mechanical interpretation based on physical argument

    Science.gov (United States)

    Tadaki, Kohtaro

    2010-12-01

    The statistical mechanical interpretation of algorithmic information theory (AIT, for short) was introduced and developed by our former works [K. Tadaki, Local Proceedings of CiE 2008, pp. 425-434, 2008] and [K. Tadaki, Proceedings of LFCS'09, Springer's LNCS, vol. 5407, pp. 422-440, 2009], where we introduced the notion of thermodynamic quantities, such as the partition function Z(T), free energy F(T), energy E(T), statistical mechanical entropy S(T), and specific heat C(T), into AIT. We then discovered that, in the interpretation, the temperature T equals the partial randomness of the values of all these thermodynamic quantities, where the notion of partial randomness is a stronger representation of the compression rate by means of program-size complexity. Furthermore, we showed that this situation holds for the temperature T itself, which is one of the most typical thermodynamic quantities. Namely, we showed that, for each of the thermodynamic quantities Z(T), F(T), E(T), and S(T) above, the computability of its value at temperature T gives a sufficient condition for T ∈ (0,1) to satisfy the condition that the partial randomness of T equals T. In this paper, based on a physical argument on the same level of mathematical strictness as normal statistical mechanics in physics, we develop a total statistical mechanical interpretation of AIT which actualizes a perfect correspondence to normal statistical mechanics. We do this by identifying a microcanonical ensemble in the framework of AIT. As a result, we clarify the statistical mechanical meaning of the thermodynamic quantities of AIT.

  19. Incorporating Linguistic Information to Statistical Word-Level Alignment

    Science.gov (United States)

    Cendejas, Eduardo; Barceló, Grettel; Gelbukh, Alexander; Sidorov, Grigori

    Parallel texts are enriched by alignment algorithms, which establish a relationship between the structures of the implied languages. Depending on the alignment level, the enrichment can be performed on paragraphs, sentences, or words of the content expressed in the source language and its translation. There are two main approaches to word-level alignment: statistical and linguistic. Because languages differ in their grammar rules, statistical algorithms usually give lower precision; for this reason, such algorithms are generally developed for a specific language pair using linguistic techniques. A hybrid alignment system based on the combination of the two traditional approaches is presented in this paper. It provides user-friendly configuration and is adaptable to the computational environment. The system uses linguistic resources and procedures such as identification of cognates, morphological information, syntactic trees, dictionaries, and semantic domains. We show that the system outperforms existing algorithms.

  20. What Could Fuzzy Logic Bring to Statistical Information Systems?

    Directory of Open Access Journals (Sweden)

    Miroslav Hudec

    2011-03-01

    Full Text Available The aim of the paper is to present the applicability of fuzzy logic to statistical information systems in order to improve work with statistical data. The improvement lies in approximate reasoning, which solves problems in a way that more closely resembles human logic. The paper examines the fuzzy logic approach, emphasizes situations where two-valued (crisp) logic is not adequate, and offers solutions based on fuzzy logic. The first step in using data is its selection from a database. Although the Structured Query Language (SQL) is a very powerful tool, it is unable to satisfy needs for data selection based on linguistic expressions and degrees of truth. For this purpose the fuzzy generalised logical condition (GLC) was developed. Later research has shown that the GLC formula is also suitable for other processes concerning data, namely data classification and data dissemination.
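The idea of selecting data by degrees of truth can be illustrated with a membership function and a threshold filter. A sketch only: the trapezoid breakpoints, the 0.5 threshold, and the function names below are made-up illustrations, not the GLC formula from the paper:

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function for a linguistic term such as
    'high unemployment' (breakpoints a <= b <= c <= d are illustrative)."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def fuzzy_select(rows, predicate, threshold=0.5):
    """Sketch of a generalised logical condition: keep the rows whose degree
    of truth under the fuzzy predicate exceeds the threshold, returned as
    (row, degree) pairs ordered by decreasing degree."""
    hits = [(r, predicate(r)) for r in rows]
    hits = [(r, m) for r, m in hits if m > threshold]
    return sorted(hits, key=lambda rm: -rm[1])
```

Unlike a crisp SQL `WHERE` clause, each row receives a graded degree of membership, so borderline cases are ranked rather than discarded outright.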

  1. Identifying Statistical Dependence in Genomic Sequences via Mutual Information Estimates

    Directory of Open Access Journals (Sweden)

    Wojciech Szpankowski

    2007-12-01

    Full Text Available Questions of understanding and quantifying the representation and amount of information in organisms have become a central part of biological research, as they potentially hold the key to fundamental advances. In this paper, we demonstrate the use of information-theoretic tools for the task of identifying segments of biomolecules (DNA or RNA) that are statistically correlated. We develop a precise and reliable methodology, based on the notion of mutual information, for finding and extracting statistical as well as structural dependencies. A simple threshold function is defined, and its use in quantifying the level of significance of dependencies between biological segments is explored. These tools are used in two specific applications. First, they are used for the identification of correlations between different parts of the maize zmSRp32 gene. There, we find significant dependencies between the 5′ untranslated region in zmSRp32 and its alternatively spliced exons. This observation may indicate the presence of as-yet unknown alternative splicing mechanisms or structural scaffolds. Second, using data from the FBI's combined DNA index system (CODIS), we demonstrate that our approach is particularly well suited for the problem of discovering short tandem repeats—an application of importance in genetic profiling.
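The mutual-information machinery underlying this approach can be sketched with the standard plug-in (empirical) estimator; this is an illustrative simplification, and the paper's estimator and threshold function may differ:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in mutual information, in bits, between two equal-length
    symbol sequences: I(X;Y) = sum p(x,y) log2( p(x,y) / (p(x) p(y)) )."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))          # joint symbol counts
    px, py = Counter(xs), Counter(ys)   # marginal symbol counts
    mi = 0.0
    for (x, y), c in pxy.items():
        # c/n = p(x,y); px[x]/n, py[y]/n = marginals, so the ratio
        # p(x,y) / (p(x) p(y)) simplifies to c * n / (px[x] * py[y]).
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi
```

A dependence call would then compare this estimate against a significance threshold, e.g. one calibrated on shuffled sequences; identical sequences over a uniform 4-letter alphabet give the maximum of 2 bits.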

  2. Statistical techniques to extract information during SMAP soil moisture assimilation

    Science.gov (United States)

    Kolassa, J.; Reichle, R. H.; Liu, Q.; Alemohammad, S. H.; Gentine, P.

    2017-12-01

    Statistical techniques permit the retrieval of soil moisture estimates in a model climatology while retaining the spatial and temporal signatures of the satellite observations. As a consequence, the need for bias correction prior to an assimilation of these estimates is reduced, which could result in a more effective use of the independent information provided by the satellite observations. In this study, a statistical neural network (NN) retrieval algorithm is calibrated using SMAP brightness temperature observations and modeled soil moisture estimates (similar to those used to calibrate the SMAP Level 4 DA system). Daily values of surface soil moisture are estimated using the NN and then assimilated into the NASA Catchment model. The skill of the assimilation estimates is assessed based on a comprehensive comparison to in situ measurements from the SMAP core and sparse network sites as well as the International Soil Moisture Network. The NN retrieval assimilation is found to significantly improve the model skill, particularly in areas where the model does not represent processes related to agricultural practices. Additionally, the NN method is compared to assimilation experiments using traditional bias correction techniques. The NN retrieval assimilation is found to more effectively use the independent information provided by SMAP resulting in larger model skill improvements than assimilation experiments using traditional bias correction techniques.

  3. The system for statistical analysis of logistic information

    Directory of Open Access Journals (Sweden)

    Khayrullin Rustam Zinnatullovich

    2015-05-01

    Full Text Available A current problem for managers in logistics and trading companies is improving operational business performance and developing logistics support for sales. Developing logistics support for sales presupposes designing and implementing a set of works for developing the existing warehouse facilities, including both a detailed description of the work to be performed and the timing of its implementation. Logistics engineering of a warehouse complex includes such tasks as: determining the number and the types of technological zones, calculation of the required number of loading-unloading places, development of storage structures, development of pre-sales preparation zones, development of specifications of storage types, selection of loading-unloading equipment, detailed planning of the warehouse logistics system, creation of architectural-planning decisions, selection of information-processing equipment, etc. The ERP and WMS systems currently in use do not solve the full list of logistics engineering problems. In this regard, the development of specialized software products that take into account the specifics of warehouse logistics, and the subsequent integration of this software with ERP and WMS systems, is a pressing task. In this paper we propose a system for statistical analysis of logistics information, designed to meet the challenges of logistics engineering and planning. The proposed specialized software is designed to improve the efficiency of the operating business and the development of logistics support for sales.
The system is based on the methods of statistical data processing, the methods of assessment and prediction of logistics performance, the methods for the determination and calculation of the data required for registration, storage and processing of metal products, as well as the methods for planning the reconstruction and development

  4. Using statistical text classification to identify health information technology incidents

    Science.gov (United States)

    Chai, Kevin E K; Anthony, Stephen; Coiera, Enrico; Magrabi, Farah

    2013-01-01

    Objective: To examine the feasibility of using statistical text classification to automatically identify health information technology (HIT) incidents in the USA Food and Drug Administration (FDA) Manufacturer and User Facility Device Experience (MAUDE) database. Design: We used a subset of 570 272 incidents including 1534 HIT incidents reported to MAUDE between 1 January 2008 and 1 July 2010. Text classifiers using regularized logistic regression were evaluated with both ‘balanced’ (50% HIT) and ‘stratified’ (0.297% HIT) datasets for training, validation, and testing. Dataset preparation, feature extraction, feature selection, cross-validation, classification, performance evaluation, and error analysis were performed iteratively to further improve the classifiers. Feature-selection techniques such as removing short words and stop words, stemming, lemmatization, and principal component analysis were examined. Measurements: κ statistic, F1 score, precision and recall. Results: Classification performance was similar on both the stratified (0.954 F1 score) and balanced (0.995 F1 score) datasets. Stemming was the most effective technique, reducing the feature set size to 79% while maintaining comparable performance. Training with balanced datasets improved recall (0.989) but reduced precision (0.165). Conclusions: Statistical text classification appears to be a feasible method for identifying HIT reports within large databases of incidents. Automated identification should enable more HIT problems to be detected, analyzed, and addressed in a timely manner. Semi-supervised learning may be necessary when applying machine learning to big data analysis of patient safety incidents and requires further investigation. PMID:23666777
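As an illustration of the classification approach, here is a toy L2-regularized logistic regression over bag-of-words features in pure Python. This is a sketch only: the study used a full feature-engineering pipeline on the MAUDE data, and the documents, labels, and hyperparameters below are invented:

```python
import math
import re
from collections import defaultdict

def tokenize(text):
    return re.findall(r"[a-z]+", text.lower())

def train_logreg(docs, labels, epochs=200, lr=0.5, l2=1e-3):
    """L2-regularized logistic regression on bag-of-words features,
    trained with stochastic gradient descent (illustrative toy scale)."""
    w = defaultdict(float)
    b = 0.0
    for _ in range(epochs):
        for text, y in zip(docs, labels):
            toks = tokenize(text)
            z = b + sum(w[t] for t in toks)
            p = 1.0 / (1.0 + math.exp(-z))   # predicted P(label = 1)
            g = p - y                        # gradient of the log-loss
            b -= lr * g
            for t in toks:
                w[t] -= lr * (g + l2 * w[t])  # weight decay = L2 penalty
    return w, b

def predict(w, b, text):
    z = b + sum(w[t] for t in tokenize(text))
    return 1 if z > 0 else 0
```

In practice the feature pipeline (stop-word removal, stemming, etc.) and the class imbalance handling matter far more than this core update rule.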

  5. From Quality to Information Quality in Official Statistics

    Directory of Open Access Journals (Sweden)

    Kenett Ron S.

    2016-12-01

    Full Text Available The term quality of statistical data, developed and used in official statistics and international organizations such as the International Monetary Fund (IMF) and the Organisation for Economic Co-operation and Development (OECD), refers to the usefulness of summary statistics generated by producers of official statistics. Similarly, in the context of survey quality, official agencies such as Eurostat, the National Center for Science and Engineering Statistics (NCSES), and Statistics Canada have created dimensions for evaluating the quality of a survey and its ability to report ‘accurate survey data’.

  6. Combining Spatial Statistical and Ensemble Information in Probabilistic Weather Forecasts

    National Research Council Canada - National Science Library

    Berrocal, Veronica J; Raftery, Adrian E; Gneiting, Tilmann

    2006-01-01

    .... Bayesian model averaging (BMA) is a statistical postprocessing method for forecast ensembles that generates calibrated probabilistic forecast products for weather quantities at individual sites...

  7. Safety Management Information Statistics (SAMIS) - 1990 Annual Report.

    Science.gov (United States)

    1992-04-01

    The report is a compilation and analysis of mass transit accident and casualty statistics reported by transit systems in the United States during 1990, under the Federal Transit Administration's (FTA's) Section 15 reporting system.

  8. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  9. Safety Management Information Statistics (SAMIS) - 1992 Annual Report

    Science.gov (United States)

    1994-06-01

    This SAMIS 1992 annual report, now in its third year of publication, is a compilation and analysis of mass transit accident and casualty statistics reported by 600 transit systems in the United States under the FTA Section 15 reporting system. This r...

  10. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  11. Statistical physics of networks, information and complex systems

    Energy Technology Data Exchange (ETDEWEB)

    Ecke, Robert E [Los Alamos National Laboratory

    2009-01-01

    In this project we explore the mathematical methods and concepts of statistical physics that are finding abundant applications across the scientific and technological spectrum from soft condensed matter systems and bio-informatics to economic and social systems. Our approach exploits the considerable similarity of concepts between statistical physics and computer science, allowing for a powerful multi-disciplinary approach that draws its strength from cross-fertilization and multiple interactions of researchers with different backgrounds. The work on this project takes advantage of the newly appreciated connection between computer science and statistics and addresses important problems in data storage, decoding, optimization, the information processing properties of the brain, the interface between quantum and classical information science, the verification of large software programs, modeling of complex systems including disease epidemiology, resource distribution issues, and the nature of highly fluctuating complex systems. Common themes that the project has been emphasizing are (i) neural computation, (ii) network theory and its applications, and (iii) a statistical physics approach to information theory. The project's efforts focus on the general problem of optimization and variational techniques, algorithm development and information theoretic approaches to quantum systems. These efforts are responsible for fruitful collaborations and the nucleation of science efforts that span multiple divisions such as EES, CCS, D, T, ISR and P. This project supports the DOE mission in Energy Security and Nuclear Non-Proliferation by developing novel information science tools for communication, sensing, and interacting complex networks such as the internet or energy distribution system. The work also supports programs in Threat Reduction and Homeland Security.

  12. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004.) The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  13. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  14. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  15. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  16. Financial Information 2015. Australian Vocational Education and Training Statistics

    Science.gov (United States)

    National Centre for Vocational Education Research (NCVER), 2016

    2016-01-01

    This publication provides information on how government-funded vocational education and training (VET) in Australia is financed and where the money is spent. Government-funded VET in the 2015 reporting year is broadly defined as all activity delivered by government providers and government-funded activity delivered by community education providers…

  17. Financial Information 2016: Australian Vocational Education and Training Statistics

    Science.gov (United States)

    National Centre for Vocational Education Research (NCVER), 2017

    2017-01-01

    This publication provides financial information on the government-funded vocational education and training (VET) system in Australia. Reporting includes VET funds transacted through government accounts of the Australian and state and territory government departments and their controlled training organisation entities such as TAFE institutes and…

  18. Australian Vocational Education and Training Statistics: Financial Information 2007

    Science.gov (United States)

    National Centre for Vocational Education Research (NCVER), 2008

    2008-01-01

    This publication details the financial operations of Australia's public vocational education and training (VET) system for 2007. The information presented covers revenues and expenses; assets, liabilities and equities; cash flows; and trends in total revenues and expenses. The scope of the financial data collection covers all transactions that…

  19. Bayesian statistics and information fusion for GPS-denied navigation

    Science.gov (United States)

    Copp, Brian Lee

    It is well known that satellite navigation systems are vulnerable to disruption due to jamming, spoofing, or obstruction of the signal. The desire for robust navigation of aircraft in GPS-denied environments has motivated the development of feature-aided navigation systems, in which measurements of environmental features are used to complement the dead reckoning solution produced by an inertial navigation system. Examples of environmental features which can be exploited for navigation include star positions, terrain elevation, terrestrial wireless signals, and features extracted from photographic data. Feature-aided navigation represents a particularly challenging estimation problem because the measurements are often strongly nonlinear, and the quality of the navigation solution is limited by the knowledge of nuisance parameters which may be difficult to model accurately. As a result, integration approaches based on the Kalman filter and its variants may fail to give adequate performance. This project develops a framework for the integration of feature-aided navigation techniques using Bayesian statistics. In this approach, the probability density function for aircraft horizontal position (latitude and longitude) is approximated by a two-dimensional point mass function defined on a rectangular grid. Nuisance parameters are estimated using a hypothesis based approach (Multiple Model Adaptive Estimation) which continuously maintains an accurate probability density even in the presence of strong nonlinearities. The effectiveness of the proposed approach is illustrated by the simulated use of terrain referenced navigation and wireless time-of-arrival positioning to estimate a reference aircraft trajectory. Monte Carlo simulations have shown that accurate position estimates can be obtained in terrain referenced navigation even with a strongly nonlinear altitude bias. 
The integration of terrain referenced and wireless time-of-arrival measurements is described along with
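The point-mass representation reduces each Bayesian measurement update to an elementwise multiply-and-normalize over the grid. A one-dimensional sketch (the abstract describes a two-dimensional latitude/longitude grid; the quadratic terrain function used in the test below is an arbitrary stand-in for a nonlinear measurement map):

```python
import math

def point_mass_update(prior, grid, terrain, z, sigma):
    """One Bayesian measurement update on a 1-D point-mass grid.

    prior   : probabilities over grid cells (sums to 1)
    grid    : positions of the cells
    terrain : terrain(x) -> elevation, the (possibly nonlinear) measurement map
    z       : observed terrain elevation
    sigma   : measurement noise standard deviation (Gaussian assumed here)
    """
    # posterior ∝ prior × Gaussian likelihood of the observed elevation
    post = [p * math.exp(-0.5 * ((z - terrain(x)) / sigma) ** 2)
            for p, x in zip(prior, grid)]
    s = sum(post)
    return [p / s for p in post]
```

Because the density is represented pointwise rather than by a mean and covariance, strongly nonlinear or multimodal likelihoods pose no particular difficulty, which is exactly the advantage over Kalman-type filters noted in the abstract.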

  20. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

    For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products

  1. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g., Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  2. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

    For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees

  3. Vascular Extraction Using MRA Statistics and Gradient Information

    Directory of Open Access Journals (Sweden)

    Shifeng Zhao

    2018-01-01

    Full Text Available Brain vessel segmentation is a fundamental component of cerebral disease screening systems. However, detecting vessels is still a challenging task owing to their complex appearance and thinning geometry as well as the contrast decrease from the root of the vessel to its thin branches. We present a method for segmentation of the vasculature in Magnetic Resonance Angiography (MRA) images. First, we apply volume projection, 2D segmentation, and back-projection procedures for a first stage of background subtraction and vessel reservation. Voxels labeled as background or vessel are excluded from consideration in later computation. Second, the stochastic expectation maximization algorithm (SEM) is used to estimate the probability density function (PDF) of the remaining voxels, which are assumed to be a mixture of one Rayleigh and two Gaussian distributions. These voxels can then be classified into background, middle region, or vascular structure. Third, we adapt the K-means method, which is based on the gradient of the remaining voxels, to effectively detect true positives around boundaries of vessels. Experimental results on clinical cerebral data demonstrate that using gradient information as a further step improves the mixture model based segmentation of cerebral vasculature, in particular segmentation of the low contrast vasculature.
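Once the SEM fit has produced mixture parameters, classifying a voxel is a maximum-posterior decision among the three components. A sketch with invented parameter values (in the method described above, the real weights, Rayleigh scale, and Gaussian moments come from the SEM estimation, not from hand-picked numbers):

```python
import math

def rayleigh_pdf(x, s):
    """Rayleigh density with scale s, the usual model for MRA background."""
    return (x / s ** 2) * math.exp(-x ** 2 / (2 * s ** 2))

def gauss_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def classify_voxel(x, w, s_ray, g1, g2):
    """Assign a voxel intensity to background / middle / vessel by the
    maximum-posterior component of a Rayleigh + two-Gaussian mixture
    with weights w = (w_bg, w_mid, w_vessel)."""
    likes = [w[0] * rayleigh_pdf(x, s_ray),
             w[1] * gauss_pdf(x, *g1),
             w[2] * gauss_pdf(x, *g2)]
    return ("background", "middle", "vessel")[likes.index(max(likes))]
```

The gradient-based K-means step in the paper then refines this intensity-only decision near vessel boundaries.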

  4. Coordination of the National Statistical System in the Information Security Context

    Directory of Open Access Journals (Sweden)

    O. H.

    2017-12-01

    Full Text Available The need for building the national statistical system (NSS) as the framework for coordination of statistical works is substantiated. NSS is defined on the basis of a system approach. It is emphasized that the essential conditions underlying NSS are strategic planning, reliance on internationally adopted methods, and due consideration of the country-specific environment. The role of the state coordination policy in organizing statistical activities in the NSS framework is highlighted, and key objectives of the integrated national policy on coordination of statistical activities are given. Threats arising from the non-existence of NSS in a country are shown: an “irregular” pattern of statistical activities, resulting from the absence of common legal, methodological and organizational grounds; high costs of the finished information product in parallel with its low quality; impossibility of administering statistical information security in a coherent manner, i.e. complying with the rules on confidentiality of data, preventing intentional distortion of information, and observing the rules for treatment of data constituting a state secret. An extensive review of NSS functional objectives is made: to ensure the system development of official statistics; to ensure confidentiality and protection of individual data; to establish interdepartmental mechanisms for control and protection of secret statistical information; to broaden and regulate access to statistical data and their effective use. The need for creating the National Statistical Commission is grounded.

  5. Statistical Language Models and Information Retrieval: Natural Language Processing Really Meets Retrieval

    NARCIS (Netherlands)

    Hiemstra, Djoerd; de Jong, Franciska M.G.

    2001-01-01

    Traditionally, natural language processing techniques for information retrieval have always been studied outside the framework of formal models of information retrieval. In this article, we introduce a new formal model of information retrieval based on the application of statistical language models.

  6. Use of Statistical Information for Damage Assessment of Civil Engineering Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Andersen, P.

    This paper considers the problem of damage assessment of civil engineering structures using statistical information. The aim of the paper is to review how researchers recently have tried to solve the problem. It is pointed out that the problem consists of not only how to use the statistical...

  7. Financial Statistics. Higher Education General Information Survey (HEGIS) [machine-readable data file].

    Science.gov (United States)

    Center for Education Statistics (ED/OERI), Washington, DC.

    The Financial Statistics machine-readable data file (MRDF) is a subfile of the larger Higher Education General Information Survey (HEGIS). It contains basic financial statistics for over 3,000 institutions of higher education in the United States and its territories. The data are arranged sequentially by institution, with institutional…

  8. Statistical information 1971-76. From the National Institute of Radiation Protection

    International Nuclear Information System (INIS)

    1978-01-01

    This report includes statistical information about the work performed at the National Institute of Radiation Protection, Sweden, during the period 1971-1976, as well as about the different fields causing the intervention by the institute. (E.R.)

  9. Knowledge-Intensive Gathering and Integration of Statistical Information on European Fisheries

    NARCIS (Netherlands)

    Klinkert, M.; Treur, J.; Verwaart, T.; Loganantharaj, R.; Palm, G.; Ali, M.

    2000-01-01

    Gathering, maintenance, integration and presentation of statistics are major activities of the Dutch Agricultural Economics Research Institute LEI. In this paper we explore how knowledge and agent technology can be exploited to support the information gathering and integration process. In

  10. The Philippine management information system for public health programs, vital statistics, mortality and notifiable diseases.

    Science.gov (United States)

    Marte, B A; Schwefel, D

    1995-10-01

    Strengthening the information support for decision making has been identified as an important first step toward improving the efficiency, effectiveness, and equitability of the health care system in the Philippines. A Philippine-German cooperation is working in partnership toward developing a need-responsive and cost-effective Health and Management Information System (HAMIS). Four information baskets are being strengthened to address these needs in a cost-effective way: public health information systems, hospital information systems, information systems on economics and financing, and information systems on good health care management. BLACKBOX is the management information system for public health programs, vital statistics, mortality and notifiable diseases of the Philippines. It handles and retrieves all data that are routinely collected by public health workers all over the Philippines. The eventual aim of BLACKBOX is to encourage the development of an information culture in which health managers actively utilise information for rational planning and decision making, for a knowledge-based health care delivery.

  11. The influence of narrative v. statistical information on perceiving vaccination risks.

    Science.gov (United States)

    Betsch, Cornelia; Ulshöfer, Corina; Renkewitz, Frank; Betsch, Tilmann

    2011-01-01

    Health-related information found on the Internet is increasing and impacts patient decision making, e.g. regarding vaccination decisions. In addition to statistical information (e.g. incidence rates of vaccine adverse events), narrative information is also widely available, such as postings on online bulletin boards. Previous research has shown that narrative information can impact treatment decisions, even when statistical information is presented concurrently. As the determinants of this effect are largely unknown, we varied features of the narratives to identify mechanisms through which narratives impact risk judgments. An online bulletin board setting provided participants with statistical information and authentic narratives about the occurrence and nonoccurrence of adverse events. Experiment 1 followed a single factorial design with 1, 2, or 4 narratives out of 10 reporting adverse events. Experiment 2 implemented a 2 (statistical risk 20% vs. 40%) × 2 (2/10 vs. 4/10 narratives reporting adverse events) × 2 (high vs. low richness) × 2 (high vs. low emotionality) between-subjects design. Dependent variables were perceived risk of side-effects and vaccination intentions. Experiment 1 showed an inverse relation between the number of narratives reporting adverse events and vaccination intentions, which was mediated by the perceived risk of vaccinating. Experiment 2 showed a stronger influence of the number of narratives than of the statistical risk information. High (vs. low) emotional narratives had a greater impact on the perceived risk, while richness had no effect. The number of narratives influences risk judgments and can potentially override statistical information about risk.

  12. A measure of statistical complexity based on predictive information with application to finite spin systems

    Energy Technology Data Exchange (ETDEWEB)

    Abdallah, Samer A., E-mail: samer.abdallah@eecs.qmul.ac.uk [School of Electronic Engineering and Computer Science, Queen Mary University of London, London E1 4NS (United Kingdom); Plumbley, Mark D., E-mail: mark.plumbley@eecs.qmul.ac.uk [School of Electronic Engineering and Computer Science, Queen Mary University of London, London E1 4NS (United Kingdom)

    2012-01-09

    We propose the binding information as an information theoretic measure of complexity between multiple random variables, such as those found in the Ising or Potts models of interacting spins, and compare it with several previously proposed measures of statistical complexity, including excess entropy, Bialek et al.'s predictive information, and the multi-information. We discuss and prove some of the properties of binding information, particularly in relation to multi-information and entropy, and show that, in the case of binary random variables, the processes which maximise binding information are the ‘parity’ processes. The computation of binding information is demonstrated on Ising models of finite spin systems, showing that various upper and lower bounds are respected and also that there is a strong relationship between the introduction of high-order interactions and an increase in binding information. Finally we discuss some of the implications this has for the use of the binding information as a measure of complexity. -- Highlights: ► We introduce ‘binding information’ as an entropic/statistical measure of complexity. ► Binding information (BI) is related to earlier notions of predictive information. ► We derive upper and lower bounds of BI in relation to entropy and multi-information. ► Parity processes found to maximise BI in finite sets of binary random variables. ► Application to spin glasses shows highest BI obtained with high-order interactions.
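The claim that parity processes maximise binding information can be checked directly for small systems. The sketch below computes binding information in its usual form as the dual total correlation, B = H(X) − Σᵢ H(Xᵢ | X₍rest₎), by brute-force enumeration of a joint distribution over binary variables; the parity process over n bits attains B = n − 1 bits, while i.i.d. fair coins give B = 0.

```python
import itertools
import math

def entropy(p):
    # Shannon entropy in bits of a distribution given as {outcome: prob}
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def marginalize(p, keep):
    # marginal distribution over the variable indices in `keep`
    out = {}
    for x, q in p.items():
        key = tuple(x[i] for i in keep)
        out[key] = out.get(key, 0.0) + q
    return out

def binding_information(p, n):
    # dual total correlation: B = H(X) - sum_i H(X_i | X_rest),
    # using H(X_i | X_rest) = H(X) - H(X_rest)
    H = entropy(p)
    b = H
    for i in range(n):
        rest = [j for j in range(n) if j != i]
        b -= H - entropy(marginalize(p, rest))
    return b

n = 4
# parity process: uniform over even-parity bit strings
even = [x for x in itertools.product((0, 1), repeat=n) if sum(x) % 2 == 0]
p_parity = {x: 1 / len(even) for x in even}
# i.i.d. fair coins for comparison
p_iid = {x: 2.0 ** -n for x in itertools.product((0, 1), repeat=n)}
print(binding_information(p_parity, n))  # 3.0 bits = n - 1
print(binding_information(p_iid, n))     # 0.0 bits
```

Each variable of the parity process is fully determined by the others, so every conditional entropy term vanishes and B equals the joint entropy n − 1.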

  13. An identity of Chernoff bounds with an interpretation in statistical physics and applications in information theory

    OpenAIRE

    Merhav, Neri

    2007-01-01

    An identity between two versions of the Chernoff bound on the probability of a certain large deviations event is established. This identity has an interpretation in statistical physics, namely, an isothermal equilibrium of a composite system that consists of multiple subsystems of particles. Several information--theoretic application examples, where the analysis of this large deviations probability naturally arises, are then described from the viewpoint of this statistical mechanical interpreta...

  14. Health information systems in Africa: descriptive analysis of data sources, information products and health statistics.

    Science.gov (United States)

    Mbondji, Peter Ebongue; Kebede, Derege; Soumbey-Alley, Edoh William; Zielinski, Chris; Kouvividila, Wenceslas; Lusamba-Dikassa, Paul-Samson

    2014-05-01

    To identify key data sources of health information and describe their availability in countries of the World Health Organization (WHO) African Region. An analytical review on the availability and quality of health information data sources in countries; from experience, observations, literature and contributions from countries. Forty-six Member States of the WHO African Region. No participants. The state of data sources, including censuses, surveys, vital registration and health care facility-based sources. In almost all countries of the Region, there is a heavy reliance on household surveys for most indicators, with more than 121 household surveys having been conducted in the Region since 2000. Few countries have civil registration systems that permit adequate and regular tracking of mortality and causes of death. Demographic surveillance sites function in several countries, but the data generated are not integrated into the national health information system because of concerns about representativeness. Health management information systems generate considerable data, but the information is rarely used because of concerns about bias, quality and timeliness. To date, 43 countries in the Region have initiated Integrated Disease Surveillance and Response. A multitude of data sources are used to track progress towards health-related goals in the Region, with heavy reliance on household surveys for most indicators. Countries need to develop comprehensive national plans for health information that address the full range of data needs and data sources and that include provision for building national capacities for data generation, analysis, dissemination and use. © The Royal Society of Medicine.

  15. On divergence of finite measures and their applicability in statistics and information theory

    Czech Academy of Sciences Publication Activity Database

    Vajda, Igor; Stummer, W.

    2009-01-01

    Roč. 44, č. 2 (2009), s. 169-187 ISSN 0233-1888 R&D Projects: GA MŠk(CZ) 1M0572; GA ČR(CZ) GA102/07/1131 Institutional research plan: CEZ:AV0Z10750506 Keywords : Local and global divergences of finite measures * Divergences of sigma-finite measures * Statistical censoring * Pinsker's inequality, Ornstein's distance * Differential power entropies Subject RIV: BD - Theory of Information Impact factor: 0.759, year: 2009 http://library.utia.cas.cz/separaty/2009/SI/vajda-on divergence of finite measures and their applicability in statistics and information theory.pdf
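Among the relations this line of work builds on, Pinsker's inequality (listed in the record's keywords) bounds total variation distance by KL divergence: δ(P,Q) ≤ √(D(P‖Q)/2), with D in nats, equivalently D ≥ 2δ². A quick numerical check on random finite distributions:

```python
import math
import random

def kl(p, q):
    # Kullback-Leibler divergence in nats (assumes q > 0 wherever p > 0)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def total_variation(p, q):
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

random.seed(1)
for _ in range(1000):
    raw_p = [random.random() for _ in range(5)]
    raw_q = [random.random() for _ in range(5)]
    p = [v / sum(raw_p) for v in raw_p]
    q = [v / sum(raw_q) for v in raw_q]
    # Pinsker's inequality: D(P||Q) >= 2 * TV(P,Q)^2
    assert kl(p, q) >= 2 * total_variation(p, q) ** 2
print("Pinsker's inequality held on all 1000 random pairs")
```

The inequality is tight only near P = Q; for divergences of general finite (not necessarily probability) measures, as in the paper, the normalisation step above no longer applies.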

  16. Review of Statistical Learning Methods in Integrated Omics Studies (An Integrated Information Science).

    Science.gov (United States)

    Zeng, Irene Sui Lan; Lumley, Thomas

    2018-01-01

    Integrated omics is becoming a new channel for investigating the complex molecular system in modern biological science and sets a foundation for systematic learning for precision medicine. The statistical/machine learning methods that have emerged in the past decade for integrated omics are not only innovative but also multidisciplinary, with integrated knowledge in biology, medicine, statistics, machine learning, and artificial intelligence. Here, we review the nontrivial classes of learning methods from the statistical aspects and streamline these learning methods within the statistical learning framework. The intriguing findings from the review are that the methods used are generalizable to other disciplines with complex systematic structure, and that integrated omics is part of an integrated information science which has collated and integrated different types of information for inferences and decision making. We review the statistical learning methods of exploratory and supervised learning from 42 publications. We also discuss the strengths and limitations of the extended principal component analysis, cluster analysis, network analysis, and regression methods. Statistical techniques such as penalization for sparsity induction when there are fewer observations than features, and use of a Bayesian approach when there is prior knowledge to be integrated, are also included in the commentary. For completeness of the review, a table of currently available software and packages for omics, from 23 publications, is summarized in the appendix.

  17. Measures of statistical dispersion based on Shannon and Fisher information concepts

    Czech Academy of Sciences Publication Activity Database

    Košťál, Lubomír; Lánský, Petr; Pokora, Ondřej

    2013-01-01

    Roč. 235, JUN 20 (2013), s. 214-223 ISSN 0020-0255 R&D Projects: GA ČR(CZ) GAP304/12/0259; GA ČR(CZ) GAP103/11/0282; GA ČR(CZ) GPP103/12/ P558 Institutional support: RVO:67985823 Keywords : statistical dispersion * entropy * Fisher information Subject RIV: BD - Theory of Information Impact factor: 3.893, year: 2013

  18. Analyzing Statistical Mediation with Multiple Informants: A New Approach with an Application in Clinical Psychology.

    Science.gov (United States)

    Papa, Lesther A; Litson, Kaylee; Lockhart, Ginger; Chassin, Laurie; Geiser, Christian

    2015-01-01

    Testing mediation models is critical for identifying potential variables that need to be targeted to effectively change one or more outcome variables. In addition, it is now common practice for clinicians to use multiple informant (MI) data in studies of statistical mediation. By coupling the use of MI data with statistical mediation analysis, clinical researchers can combine the benefits of both techniques. Integrating the information from MIs into a statistical mediation model creates various methodological and practical challenges. The authors review prior methodological approaches to MI mediation analysis in clinical research and propose a new latent variable approach that overcomes some limitations of prior approaches. An application of the new approach to mother, father, and child reports of impulsivity, frustration tolerance, and externalizing problems (N = 454) is presented. The results showed that frustration tolerance mediated the relationship between impulsivity and externalizing problems. The new approach allows for a more comprehensive and effective use of MI data when testing mediation models.
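A minimal sketch of the product-of-coefficients logic behind statistical mediation, on synthetic data with assumed effect sizes (not the study's data, and without the latent variable machinery the authors propose): the indirect effect of X on Y through M is estimated as the product of the X→M path (a) and the M→Y path controlling for X (b).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 454  # same sample size as the study; the data here are purely synthetic
x = rng.normal(size=n)                       # "impulsivity" (standardized)
m = 0.5 * x + rng.normal(size=n)             # "frustration tolerance", true a-path 0.5
y = 0.4 * m + 0.1 * x + rng.normal(size=n)   # "externalizing problems", true b-path 0.4

def ols(design, target):
    # ordinary least squares coefficients
    return np.linalg.lstsq(design, target, rcond=None)[0]

ones = np.ones(n)
a = ols(np.column_stack([ones, x]), m)[1]               # X -> M
b, c_prime = ols(np.column_stack([ones, m, x]), y)[1:]  # M -> Y and direct effect, controlling X
indirect = a * b                                         # mediated (indirect) effect
print(round(indirect, 3))
```

In practice the indirect effect's significance is assessed with bootstrap confidence intervals, and multiple-informant data require combining reports before or within the model, which is exactly the challenge the reviewed approaches address.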

  19. 75 FR 21231 - Proposed Information Collection; Comment Request; Marine Recreational Fisheries Statistics Survey

    Science.gov (United States)

    2010-04-23

    ... Collection; Comment Request; Marine Recreational Fisheries Statistics Survey AGENCY: National Oceanic and... Andrews, (301) 713-2328, ext. 148 or [email protected] . SUPPLEMENTARY INFORMATION: I. Abstract Marine recreational anglers are surveyed for catch and effort data, fish biology data, and angler socioeconomic...

  20. Incorporating Nonparametric Statistics into Delphi Studies in Library and Information Science

    Science.gov (United States)

    Ju, Boryung; Jin, Tao

    2013-01-01

    Introduction: The Delphi technique is widely used in library and information science research. However, many researchers in the field fail to employ standard statistical tests when using this technique. This makes the technique vulnerable to criticisms of its reliability and validity. The general goal of this article is to explore how…

  1. Knowledge-Sharing Intention among Information Professionals in Nigeria: A Statistical Analysis

    Science.gov (United States)

    Tella, Adeyinka

    2016-01-01

    In this study, the researcher administered a survey and developed and tested a statistical model to examine the factors that determine the intention of information professionals in Nigeria to share knowledge with their colleagues. The result revealed correlations between the overall score for intending to share knowledge and other…

  2. Statistical Mechanics and Information-Theoretic Perspectives on Complexity in the Earth System

    Directory of Open Access Journals (Sweden)

    Konstantinos Eftaxias

    2013-11-01

    Full Text Available This review provides a summary of methods originated in (non-equilibrium) statistical mechanics and information theory, which have recently found successful applications to quantitatively studying complexity in various components of the complex system Earth. Specifically, we discuss two classes of methods: (i) entropies of different kinds (e.g., on the one hand classical Shannon and Rényi entropies, as well as non-extensive Tsallis entropy based on symbolic dynamics techniques and, on the other hand, approximate entropy, sample entropy and fuzzy entropy); and (ii) measures of statistical interdependence and causality (e.g., mutual information and generalizations thereof, transfer entropy, momentary information transfer). We review a number of applications and case studies utilizing the above-mentioned methodological approaches for studying contemporary problems in some exemplary fields of the Earth sciences, highlighting the potentials of different techniques.
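For reference, the first class of measures mentioned, the Shannon, Rényi and Tsallis entropies of a discrete distribution, can be computed directly (a generic sketch, not the review's symbolic-dynamics pipeline):

```python
import numpy as np

def shannon(p):
    # Shannon entropy in bits
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def renyi(p, alpha):
    # Rényi entropy of order alpha (alpha != 1); tends to Shannon as alpha -> 1
    return np.log2(np.sum(p ** alpha)) / (1 - alpha)

def tsallis(p, q):
    # non-extensive Tsallis entropy with entropic index q (natural units)
    return (1 - np.sum(p ** q)) / (q - 1)

p = np.array([0.5, 0.25, 0.125, 0.125])
print(shannon(p))       # 1.75 bits
print(renyi(p, 2.0))    # collision entropy, never exceeds the Shannon entropy
print(tsallis(p, 2.0))  # 0.65625
```

In the symbolic-dynamics applications the review discusses, p would be the empirical distribution of symbol words extracted from a geophysical time series.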

  3. A flexible statistics web processing service--added value for information systems for experiment data.

    Science.gov (United States)

    Heimann, Dennis; Nieschulze, Jens; König-Ries, Birgitta

    2010-04-20

    Data management in the life sciences has evolved from simple storage of data to complex information systems providing additional functionality such as analysis and visualization capabilities, demanding the integration of statistical tools. In many cases the statistical tools are hard-coded within the system, which makes integration, substitution, or extension of tools expensive because all changes have to be made in program code. Other systems use generic solutions for tool integration, but adapting those to another system usually requires extensive work. This paper shows a way to provide statistical functionality through a statistics web service, which can be easily integrated into any information system and set up using XML configuration files. The statistical functionality is extendable by simply adding the description of a new application to a configuration file. The service architecture, the data exchange process between client and service, and the addition of analysis applications to the underlying service provider are described. Furthermore, a practical example demonstrates the functionality of the service.

  4. Statistics information of rice EST mapping results - RGP estmap2001 | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Statistics information of rice EST mapping results (RGP estmap2001).

  5. Information Theory - The Bridge Connecting Bounded Rational Game Theory and Statistical Physics

    Science.gov (United States)

    Wolpert, David H.

    2005-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality of all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. This paper shows that the same information-theoretic mathematical structure, known as Product Distribution (PD) theory, addresses both issues. In this, PD theory not only provides a principled formulation of bounded rationality and a set of new types of mean field theory in statistical physics; it also shows that those topics are fundamentally one and the same.

  6. Analysis and evolution of air quality monitoring networks using combined statistical information indexes

    Directory of Open Access Journals (Sweden)

    Axel Osses

    2013-10-01

    Full Text Available In this work, we present combined statistical indexes for evaluating air quality monitoring networks based on concepts derived from information theory and the Kullback–Leibler divergence. More precisely, we introduce: (1) the standard measure of complementary mutual information, or ‘specificity’ index; (2) a new measure of information gain, or ‘representativity’ index; (3) the information gaps associated with the evolution of a network; and (4) the normalised information distance used in clustering analysis. All these information concepts are illustrated by applying them to 14 yr of data collected by the air quality monitoring network in Santiago de Chile (33.5 S, 70.5 W, 500 m a.s.l.). We find that downtown stations, located in a relatively flat area of the Santiago basin, generally show high ‘representativity’ and low ‘specificity’, whereas the contrary is found for a station located in a canyon to the east of the basin, consistent with known emission and circulation patterns of Santiago. We also show interesting applications of information gain to the analysis of the evolution of a network, where the choice of background information is also discussed, and of mutual information distance to the classification of stations. Our analyses show that indexes such as those presented here should be used in a complementary way when addressing the analysis of an air quality network for planning and evaluation purposes.
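As an illustration of the information concepts listed, the sketch below computes a histogram-based mutual information and one common form of normalised information distance, d = 1 − I(X;Y)/H(X,Y), for synthetic "station" time series; the paper's exact index definitions may differ.

```python
import numpy as np

def entropies(x, y, bins=8):
    # joint histogram of two discretised concentration series, in bits
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    h = lambda p: -np.sum(p[p > 0] * np.log2(p[p > 0]))
    return h(px), h(py), h(pxy.ravel())

def mutual_information(x, y, bins=8):
    hx, hy, hxy = entropies(x, y, bins)
    return hx + hy - hxy

def info_distance(x, y, bins=8):
    # normalised information distance: 0 for identical series, near 1 for independent ones
    hx, hy, hxy = entropies(x, y, bins)
    return 1 - (hx + hy - hxy) / hxy

rng = np.random.default_rng(0)
a = rng.normal(size=5000)
b = a + 0.3 * rng.normal(size=5000)   # strongly coupled "station"
c = rng.normal(size=5000)             # independent "station"
d_ab = info_distance(a, b)
d_ac = info_distance(a, c)
print(d_ab, d_ac)  # coupled pair is much closer than the independent pair
```

Clustering stations then amounts to feeding the pairwise distance matrix into any standard hierarchical clustering routine.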

  7. Harmonisation of the Average Earnings Information System (MoLSA with the Wage Statistics (CZSO

    Directory of Open Access Journals (Sweden)

    Kateřina Duspivová

    2012-06-01

    Full Text Available In 2011, the methodology of the Average Earnings Information System (ISPV) was harmonised with the methodology applied in the wage statistics of the Czech Statistical Office (CZSO). The benefit of the harmonisation rests in the improved quality of the published wage statistics. Within the harmonisation, the ISPV population was extended by economic subjects not monitored before. The extension of the ISPV population made it possible to calculate more accurate numbers, above all on employees, and thus since 2011 all ISPV publications have stated weighted numbers of employees. Due to the harmonisation, the gross monthly wage median for the wage sphere decreased in 2011. Despite the harmonisation, there are still differences between the two surveys due to the specifics of the ISPV survey.

  8. On the Estimation and Use of Statistical Modelling in Information Retrieval

    DEFF Research Database (Denmark)

    Petersen, Casper

    Automatic text processing often relies on assumptions about the distribution of some property (such as term frequency) in the data being processed. In information retrieval (IR) such assumptions may be attributed to (i) the absence of principled approaches for determining the correct statistical...... that assumptions regarding the distribution of dataset properties can be replaced with an effective, efficient and principled method for determining the best-fitting distribution and that using this distribution can lead to improved retrieval performance.

  9. Examples of the Application of Nonparametric Information Geometry to Statistical Physics

    Directory of Open Access Journals (Sweden)

    Giovanni Pistone

    2013-09-01

    Full Text Available We review a nonparametric version of Amari’s information geometry in which the set of positive probability densities on a given sample space is endowed with an atlas of charts to form a differentiable manifold modeled on Orlicz Banach spaces. This nonparametric setting is used to discuss the setting of typical problems in machine learning and statistical physics, such as black-box optimization, Kullback-Leibler divergence, Boltzmann-Gibbs entropy and the Boltzmann equation.

  10. RaptorX: exploiting structure information for protein alignment by statistical inference

    OpenAIRE

    Peng, Jian; Xu, Jinbo

    2011-01-01

    This paper presents RaptorX, a statistical method for template-based protein modeling that improves alignment accuracy by exploiting structural information in a single or multiple templates. RaptorX consists of three major components: single-template threading, alignment quality prediction and multiple-template threading. This paper summarizes the methods employed by RaptorX and presents its CASP9 result analysis, aiming to identify major bottlenecks with RaptorX and template-based modeling a...

  11. Multi-feature statistical nonrigid registration using high-dimensional generalized information measures.

    Science.gov (United States)

    Hamrouni, Sameh; Rougon, Nicolas; Prêteux, Françoise

    2011-01-01

    Nonrigid image registration methods based on the optimization of information-theoretic measures provide versatile solutions for robustly aligning mono-modal data with nonlinear variations and multi-modal data in radiology. Whereas mutual information and its variations arise as a first choice, generalized information measures offer relevant alternatives in specific clinical contexts. Their usual application setting is the alignment of image pairs by statistically matching scalar random variables (generally, greylevel distributions), handled via their probability densities. In this paper, we address the issue of estimating and optimizing generalized information measures over high-dimensional state spaces to derive multi-feature statistical nonrigid registration models. Specifically, we introduce novel consistent and asymptotically unbiased k-nearest neighbor estimators of alpha-informations, and study their variational optimization over finite- and infinite-dimensional smooth transform spaces. The resulting theoretical framework provides a well-posed and computationally efficient alternative to entropic graph techniques. Its performance is assessed on two cardiological applications: measuring myocardial deformations in tagged MRI, and compensating cardio-thoracic motions in perfusion MRI.

  12. Hybrid approach combining contextual and statistical information for identifying MEDLINE citation terms

    Science.gov (United States)

    Kim, In Cheol; Le, Daniel X.; Thoma, George R.

    2008-01-01

    There is a strong demand for developing automated tools for extracting pertinent information from the biomedical literature that is a rich, complex, and dramatically growing resource, and is increasingly accessed via the web. This paper presents a hybrid method based on contextual and statistical information to automatically identify two MEDLINE citation terms: NIH grant numbers and databank accession numbers from HTML-formatted online biomedical documents. Their detection is challenging due to many variations and inconsistencies in their format (although recommended formats exist), and also because of their similarity to other technical or biological terms. Our proposed method first extracts potential candidates for these terms using a rule-based method. These are scored and the final candidates are submitted to a human operator for verification. The confidence score for each term is calculated using statistical information, and morphological and contextual information. Experiments conducted on more than ten thousand HTML-formatted online biomedical documents show that most NIH grant numbers and databank accession numbers can be successfully identified by the proposed method, with recall rates of 99.8% and 99.6%, respectively. However, owing to the high false alarm rate, the proposed method yields F-measure rates of 86.6% and 87.9% for NIH grants and databanks, respectively.
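The rule-based candidate extraction plus confidence scoring described above can be sketched as follows. The regular expressions and context cues here are illustrative assumptions only; real NIH grant and databank accession formats vary widely, and the paper's actual rules and scoring are not reproduced.

```python
import re

# Hypothetical patterns, not the paper's: a grant like "R01 CA123456"
# and a GenBank-style accession like "AF231906"
GRANT_RE = re.compile(r'\b([A-Z]\d{2})\s?([A-Z]{2})\s?-?(\d{6})\b')
ACCESSION_RE = re.compile(r'\b([A-Z]{1,2}\d{5,6})\b')

CONTEXT_CUES = ('grant', 'supported', 'funded', 'NIH')

def score_candidate(text, match, window=60):
    # combine the format match with contextual cues found around the hit
    start = max(0, match.start() - window)
    context = text[start:match.end() + window].lower()
    cue_hits = sum(cue.lower() in context for cue in CONTEXT_CUES)
    return 0.5 + 0.5 * min(cue_hits, 2) / 2   # crude confidence in [0.5, 1]

text = ("This work was supported by NIH grant R01 CA123456. "
        "Sequences were deposited under accession AF231906.")
for m in GRANT_RE.finditer(text):
    print("grant:", m.group(0), round(score_candidate(text, m), 2))
for m in ACCESSION_RE.finditer(text):
    print("accession:", m.group(0), round(score_candidate(text, m), 2))
```

Note that the two candidate sets overlap (the "CA123456" fragment matches both patterns), which is precisely why a pipeline like the paper's scores candidates and submits them to a human operator for verification rather than trusting the patterns alone.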

  13. Statistics for library and information services a primer for using open source R software for accessibility and visualization

    CERN Document Server

    Friedman, Alon

    2016-01-01

    Statistics for Library and Information Services, written for non-statisticians, provides logical, user-friendly, and step-by-step instructions to make statistics more accessible for students and professionals in the field of Information Science. It emphasizes concepts of statistical theory and data collection methodologies, but also extends to the topics of visualization creation and display, so that the reader will be able to better conduct statistical analysis and communicate his/her findings. The book is tailored for information science students and professionals. It has specific examples of dataset sets, scripts, design modules, data repositories, homework assignments, and a glossary lexicon that matches the field of Information Science. The textbook provides a visual road map that is customized specifically for Information Science instructors, students, and professionals regarding statistics and visualization. Each chapter in the book includes full-color illustrations on how to use R for the statistical ...

  14. The Barrier to Informed Choice in Cancer Screening: Statistical Illiteracy in Physicians and Patients.

    Science.gov (United States)

    Wegwarth, Odette; Gigerenzer, Gerd

    2018-01-01

    other misleading statistics, motivated by conflicts of interest and defensive medicine, that do not promote informed physicians and patients. What can be done? Every medical school should teach its students how to understand evidence in general and health statistics in particular. To cultivate informed patients, elementary and high schools should start teaching the mathematics of uncertainty: statistical thinking. Guidelines about complete and transparent reporting in journals, brochures, and the media need to be better enforced, and laws need to be changed in order to protect patients and doctors alike against the practice of defensive medicine instead of encouraging it. A critical mass of informed citizens will not resolve all healthcare problems, but it can constitute a major triggering factor for better care.

  15. Toddlers favor communicatively presented information over statistical reliability in learning about artifacts.

    Science.gov (United States)

    Marno, Hanna; Csibra, Gergely

    2015-01-01

    Observed associations between events can be validated by statistical information of reliability or by testament of communicative sources. We tested whether toddlers learn from their own observation of efficiency, assessed by statistical information on reliability of interventions, or from communicatively presented demonstration, when these two potential types of evidence of validity of interventions on a novel artifact are contrasted with each other. Eighteen-month-old infants observed two adults, one operating the artifact by a method that was more efficient (2/3 probability of success) than that of the other (1/3 probability of success). Compared to the Baseline condition, in which communicative signals were not employed, infants tended to choose the less reliable method to operate the artifact when this method was demonstrated in a communicative manner in the Experimental condition. This finding demonstrates that, in certain circumstances, communicative sanctioning of reliability may override statistical evidence for young learners. Such a bias can serve fast and efficient transmission of knowledge between generations.

  16. Analyzing Statistical Mediation with Multiple Informants: A New Approach with an Application in Clinical Psychology

    Directory of Open Access Journals (Sweden)

    Lesther Papa

    2015-11-01

    Testing mediation models is critical for identifying potential variables that need to be targeted to effectively change one or more outcome variables. In addition, it is now common practice for clinicians to use multiple informant (MI) data in studies of statistical mediation. By coupling the use of MI data with statistical mediation analysis, clinical researchers can combine the benefits of both techniques. Integrating the information from MIs into a statistical mediation model creates various methodological and practical challenges. The authors review prior methodological approaches to MI mediation analysis in clinical research and propose a new latent variable approach that overcomes some limitations of prior approaches. An application of the new approach to mother, father, and child reports of impulsivity, frustration tolerance, and externalizing problems (N = 454) is presented. The results showed that frustration tolerance mediated the relationship between impulsivity and externalizing problems. Advantages and limitations of the new approach are discussed. The new approach can help clinical researchers overcome limitations of prior techniques. It allows for a more comprehensive and effective use of MI data when testing mediation models.
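    The single-informant core of such a mediation test, the product-of-coefficients (a x b) approach, can be sketched in a few lines. This is only an illustration with invented effect sizes, not the paper's latent multi-informant model:

```python
import numpy as np

# Hedged sketch of classic product-of-coefficients mediation (not the paper's
# latent multi-informant model); variable names and effect sizes are invented.
rng = np.random.default_rng(0)
n = 454                                       # sample size from the abstract
x = rng.normal(size=n)                        # impulsivity
m = 0.5 * x + rng.normal(size=n)              # frustration tolerance (mediator)
y = 0.4 * m + 0.1 * x + rng.normal(size=n)    # externalizing problems

def ols(cols, target):
    """Least-squares coefficients, intercept as the first column."""
    X = np.column_stack([np.ones(len(target))] + cols)
    return np.linalg.lstsq(X, target, rcond=None)[0]

a = ols([x], m)[1]          # path X -> M
b = ols([x, m], y)[2]       # path M -> Y, controlling for X
indirect = a * b            # mediated (indirect) effect
print(f"a={a:.2f}  b={b:.2f}  indirect={indirect:.2f}")
```

    The latent-variable approach in the paper generalizes exactly these paths to measurement models over several informants.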

  17. A statistic to estimate the variance of the histogram-based mutual information estimator based on dependent pairs of observations

    NARCIS (Netherlands)

    Moddemeijer, R

    In the case of two signals with independent pairs of observations (x(n), y(n)), a statistic to estimate the variance of the histogram-based mutual information estimator has been derived earlier. We present such a statistic for dependent pairs. To derive this statistic it is necessary to avail of a
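    The plug-in histogram estimator whose variance is at issue can itself be sketched as follows (the bin count is an arbitrary choice here; the paper's variance statistic is not reproduced):

```python
import numpy as np

# Plug-in mutual information from a 2-D histogram of observation pairs.
# The paper derives the *variance* of this estimator; that is not shown here.
def mutual_information(x, y, bins=16):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                           # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)        # marginal of x
    py = pxy.sum(axis=0, keepdims=True)        # marginal of y
    nz = pxy > 0                               # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
x = rng.normal(size=20000)
noise = rng.normal(size=20000)
mi_indep = mutual_information(x, noise)        # near zero (plus small bias)
mi_dep = mutual_information(x, x + 0.2 * noise)
print(mi_indep, mi_dep)
```

    Even for independent pairs the estimate is slightly positive; quantifying that bias and spread is what variance statistics like the paper's are for.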

  18. Correlating Fourier phase information with real-space higher order statistics in CMB data

    Science.gov (United States)

    Modest, H. I.; Räth, C.; Banday, A. J.; Górski, K. M.; Morfill, G. E.

    2014-06-01

    We present a heuristic study on the correlations between harmonic space phase information and higher-order statistics. Using the spherical full-sky maps of the cosmic microwave background as an example, we demonstrate that known phase correlations at large spatial scales can gradually be diminished when subtracting a suitable best-fit (Bianchi-)template map of a given strength. The weaker phase correlations are attended by a vanishing signature of anisotropy when measuring the Minkowski functionals and scaling indices in real space with the aid of surrogate maps that are free of phase correlations. These investigations can open a new avenue toward a better understanding of signatures of non-Gaussianities in complex spatial structures, especially by elucidating the meaning of Fourier phase correlations and their influence on higher-order statistics.
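    The surrogate-map idea is easiest to see in one dimension (a toy illustration only; the paper works with spherical harmonic coefficients of full-sky maps): keep the Fourier amplitudes, scramble the phases, and all phase correlations are destroyed while the power spectrum is preserved.

```python
import numpy as np

# Phase-randomized surrogate: keep Fourier amplitudes (the power spectrum),
# randomize the phases -- destroying exactly the phase correlations that
# higher-order statistics are sensitive to.  1-D toy, not a sky map.
def phase_surrogate(signal, rng):
    spec = np.fft.rfft(signal)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=spec.shape)
    phases[0] = phases[-1] = 0.0        # keep DC and Nyquist terms real
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=len(signal))

rng = np.random.default_rng(2)
x = np.sin(np.linspace(0, 20 * np.pi, 1024)) ** 3   # non-Gaussian signal
s = phase_surrogate(x, rng)
# same power spectrum, different higher-order structure
print(np.allclose(np.abs(np.fft.rfft(x)), np.abs(np.fft.rfft(s))))
```

    Real-space statistics such as Minkowski functionals computed on the data and on an ensemble of such surrogates then isolate the contribution of phase correlations.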

  19. Data Flow Analysis and Visualization for Spatiotemporal Statistical Data without Trajectory Information.

    Science.gov (United States)

    Kim, Seokyeon; Jeong, Seongmin; Woo, Insoo; Jang, Yun; Maciejewski, Ross; Ebert, David S

    2018-03-01

    Geographic visualization research has focused on a variety of techniques to represent and explore spatiotemporal data. The goal of those techniques is to enable users to explore events and interactions over space and time in order to facilitate the discovery of patterns, anomalies and relationships within the data. However, it is difficult to extract and visualize data flow patterns over time for non-directional statistical data without trajectory information. In this work, we develop a novel flow analysis technique to extract, represent, and analyze flow maps of non-directional spatiotemporal data unaccompanied by trajectory information. We estimate a continuous distribution of these events over space and time, and extract flow fields for spatial and temporal changes utilizing a gravity model. Then, we visualize the spatiotemporal patterns in the data by employing flow visualization techniques. The user is presented with temporal trends of geo-referenced discrete events on a map. As such, overall spatiotemporal data flow patterns help users analyze geo-referenced temporal events, such as disease outbreaks, crime patterns, etc. To validate our model, we discard the trajectory information in an origin-destination dataset, apply our technique to the data, and compare the derived trajectories with the originals. Finally, we present spatiotemporal trend analysis for statistical datasets including Twitter data, maritime search and rescue events, and syndromic surveillance.
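    The density-change idea behind such flow extraction can be sketched crudely (a toy stand-in, not the authors' gravity model): estimate a smoothed event density per time slice, and read flow from how that density changes.

```python
import numpy as np

# Toy stand-in for flow extraction from non-directional events: a smoothed
# event density per time slice; the density *change* (and its gradient) hints
# at where activity is flowing.  Not the paper's gravity model.
def density(points, size=32, sigma=2.0):
    grid, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                                bins=size, range=[[0, 1], [0, 1]])
    k = np.exp(-0.5 * (np.arange(size) - size // 2) ** 2 / sigma ** 2)
    kern = np.outer(k, k)
    kern /= kern.sum()
    # circular Gaussian smoothing via FFT convolution
    return np.real(np.fft.ifft2(np.fft.fft2(grid) *
                                np.fft.fft2(np.fft.ifftshift(kern))))

rng = np.random.default_rng(7)
day1 = rng.normal([0.3, 0.5], 0.05, size=(500, 2))   # cluster on the left
day2 = rng.normal([0.7, 0.5], 0.05, size=(500, 2))   # cluster moved right
change = density(day2) - density(day1)
gx, gy = np.gradient(change)                          # crude flow-proxy field
print(change[9, 16], change[22, 16])  # negative at old site, positive at new
```

    A gravity model refines this by weighting the attraction of each density peak by its mass and distance, rather than using the raw gradient.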

  20. Improving Prediction Skill of Imperfect Turbulent Models Through Statistical Response and Information Theory

    Science.gov (United States)

    Majda, Andrew J.; Qi, Di

    2016-02-01

    Turbulent dynamical systems with a large phase space and a high degree of instabilities are ubiquitous in climate science and engineering applications. Statistical uncertainty quantification (UQ) of the response to changes in forcing or uncertain initial data in such complex turbulent systems requires the use of imperfect models due to the lack of both physical understanding and the overwhelming computational demands of Monte Carlo simulation with a large-dimensional phase space. Thus, the systematic development of reduced low-order imperfect statistical models for UQ in turbulent dynamical systems is a grand challenge. This paper applies a recent mathematical strategy for calibrating imperfect models in a training phase and accurately predicting the response by combining information theory and linear statistical response theory in a systematic fashion. A systematic hierarchy of simple statistical imperfect closure schemes for UQ for these problems is designed and tested, built through new local and global statistical energy conservation principles combined with statistical equilibrium fidelity. The forty-mode Lorenz 96 (L-96) model, which mimics forced baroclinic turbulence, is utilized as a test bed for the calibration and prediction phases for the hierarchy of computationally cheap imperfect closure models, both in the full phase space and in a reduced three-dimensional subspace containing the most energetic modes. In both phase spaces, the nonlinear response of the true model is captured accurately for the mean and variance by the systematic closure model, while alternative methods based on the fluctuation-dissipation theorem alone are much less accurate. For the reduced-order model for UQ in the three-dimensional subspace of L-96, the systematic low-order imperfect closure models coupled with the training strategy provide the highest predictive skill over other existing methods for general forced response yet have simple design principles based on a
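    The forty-mode L-96 test bed is easy to reproduce in its standard formulation (forcing F = 8; the step size here is an arbitrary choice):

```python
import numpy as np

# Minimal forty-mode Lorenz 96 integrator (RK4):
#   du_i/dt = (u_{i+1} - u_{i-2}) u_{i-1} - u_i + F
# The paper uses this model as a test bed for imperfect closure schemes.
def l96_rhs(u, F=8.0):
    return (np.roll(u, -1) - np.roll(u, 2)) * np.roll(u, 1) - u + F

def step(u, dt=0.005):
    k1 = l96_rhs(u)
    k2 = l96_rhs(u + 0.5 * dt * k1)
    k3 = l96_rhs(u + 0.5 * dt * k2)
    k4 = l96_rhs(u + dt * k3)
    return u + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

rng = np.random.default_rng(3)
u = 8.0 + 0.01 * rng.normal(size=40)   # perturbed equilibrium state
for _ in range(4000):                  # spin up onto the chaotic attractor
    u = step(u)
print(f"snapshot mean={u.mean():.2f}, variance={u.var():.2f}")
```

    Closure models for UQ are then calibrated against long-time statistics (mean, variance, autocorrelations) of such trajectories rather than individual paths.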

  1. On Improving the Quality and Interpretation of Environmental Assessments using Statistical Analysis and Geographic Information Systems

    Science.gov (United States)

    Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.

    2014-12-01

    An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the result of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is lack of data that can affect the rigor of LCAs. We have developed an approach to estimate environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.
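    The uncertainty-versus-variability separation described above can be illustrated with a nested Monte Carlo loop (all numbers hypothetical, not the study's data):

```python
import random
import statistics

# Variability = real differences across plants; uncertainty = imprecise
# knowledge of each plant's factor.  Illustrative numbers, not LCA data.
random.seed(4)
plant_factors = [0.8, 1.0, 1.3]   # emission factors, kg CO2/kWh (made up)
measurement_sd = 0.05             # per-plant measurement uncertainty

samples = []
for _ in range(5000):
    plant = random.choice(plant_factors)                 # outer: variability
    samples.append(random.gauss(plant, measurement_sd))  # inner: uncertainty

total_sd = statistics.pstdev(samples)
between_sd = statistics.pstdev(plant_factors)
print(f"total sd {total_sd:.3f} vs between-plant sd {between_sd:.3f}")
# Here the spread is dominated by variability between plants, so better
# measurements alone would barely narrow the distribution of results.
```

    Separating the two sources matters for comparisons: variability is irreducible real-world spread, while uncertainty can in principle be reduced by better data.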

  2. Water Quality Stressor Information from Clean Water Act Statewide Statistical Surveys

    Data.gov (United States)

    U.S. Environmental Protection Agency — Stressors assessed by statewide statistical surveys and their state and national attainment categories. Statewide statistical surveys are water quality assessments...

  3. Water Quality attainment Information from Clean Water Act Statewide Statistical Surveys

    Data.gov (United States)

    U.S. Environmental Protection Agency — Designated uses assessed by statewide statistical surveys and their state and national attainment categories. Statewide statistical surveys are water quality...

  4. Information processing in bacteria: memory, computation, and statistical physics: a key issues review

    International Nuclear Information System (INIS)

    Lan, Ganhui; Tu, Yuhai

    2016-01-01

    preserving information, it does not reveal the underlying mechanism that leads to the observed input-output relationship, nor does it tell us much about which information is important for the organism and how biological systems use information to carry out specific functions. To do that, we need to develop models of the biological machineries, e.g. biochemical networks and neural networks, to understand the dynamics of biological information processes. This is a much more difficult task. It requires deep knowledge of the underlying biological network—the main players (nodes) and their interactions (links)—in sufficient detail to build a model with predictive power, as well as quantitative input-output measurements of the system under different perturbations (both genetic variations and different external conditions) to test the model predictions to guide further development of the model. Due to the recent growth of biological knowledge thanks in part to high throughput methods (sequencing, gene expression microarray, etc) and development of quantitative in vivo techniques such as various fluorescence technologies, these requirements are starting to be realized in different biological systems. The possible close interaction between quantitative experimentation and theoretical modeling has made systems biology an attractive field for physicists interested in quantitative biology. In this review, we describe some of the recent work in developing a quantitative predictive model of bacterial chemotaxis, which can be considered as the hydrogen atom of systems biology. Using statistical physics approaches, such as the Ising model and Langevin equation, we study how bacteria, such as E. coli, sense and amplify external signals, how they keep a working memory of the stimuli, and how they use these data to compute the chemical gradient. In particular, we will describe how E. coli cells avoid cross-talk in a heterogeneous receptor cluster to keep a ligand-specific memory. We will also
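    A Langevin description of such a working-memory variable can be sketched in its simplest (Ornstein-Uhlenbeck) form. This is only a toy stand-in for the review's receptor-methylation models, with invented parameters:

```python
import math
import random

# Toy Ornstein-Uhlenbeck "memory": a methylation-like variable m relaxes
# toward the current stimulus s with time constant tau, under noise.
# Illustrative parameters -- not the specific chemotaxis model of the review.
random.seed(8)
tau, dt, noise = 5.0, 0.01, 0.2
m, s = 0.0, 1.0                  # memory starts at 0; stimulus steps to 1
for _ in range(5000):            # integrate for 10 relaxation times
    m += dt * (s - m) / tau + noise * math.sqrt(dt) * random.gauss(0.0, 1.0)
print(f"memory after adaptation: {m:.2f}")   # fluctuates around s = 1
```

    The slow relaxation of m relative to fast receptor kinetics is what lets the cell compare present and past ligand concentrations, i.e. compute a temporal gradient.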

  5. Information processing in bacteria: memory, computation, and statistical physics: a key issues review

    Science.gov (United States)

    Lan, Ganhui; Tu, Yuhai

    2016-05-01

    preserving information, it does not reveal the underlying mechanism that leads to the observed input-output relationship, nor does it tell us much about which information is important for the organism and how biological systems use information to carry out specific functions. To do that, we need to develop models of the biological machineries, e.g. biochemical networks and neural networks, to understand the dynamics of biological information processes. This is a much more difficult task. It requires deep knowledge of the underlying biological network—the main players (nodes) and their interactions (links)—in sufficient detail to build a model with predictive power, as well as quantitative input-output measurements of the system under different perturbations (both genetic variations and different external conditions) to test the model predictions to guide further development of the model. Due to the recent growth of biological knowledge thanks in part to high throughput methods (sequencing, gene expression microarray, etc) and development of quantitative in vivo techniques such as various fluorescence technologies, these requirements are starting to be realized in different biological systems. The possible close interaction between quantitative experimentation and theoretical modeling has made systems biology an attractive field for physicists interested in quantitative biology. In this review, we describe some of the recent work in developing a quantitative predictive model of bacterial chemotaxis, which can be considered as the hydrogen atom of systems biology. Using statistical physics approaches, such as the Ising model and Langevin equation, we study how bacteria, such as E. coli, sense and amplify external signals, how they keep a working memory of the stimuli, and how they use these data to compute the chemical gradient. In particular, we will describe how E. coli cells avoid cross-talk in a heterogeneous receptor cluster to keep a ligand-specific memory. We will also

  6. Segmentation of human skull in MRI using statistical shape information from CT data.

    Science.gov (United States)

    Wang, Defeng; Shi, Lin; Chu, Winnie C W; Cheng, Jack C Y; Heng, Pheng Ann

    2009-09-01

    To automatically segment the skull from the MRI data using a model-based three-dimensional segmentation scheme. This study exploited the statistical anatomy extracted from the CT data of a group of subjects by means of constructing an active shape model of the skull surfaces. To construct a reliable shape model, a novel approach was proposed to optimize the automatic landmarking on the coupled surfaces (i.e., the skull vault) by minimizing the description length that incorporated local thickness information. This model was then used to locate the skull shape in MRI of a different group of patients. Compared with performing landmarking separately on the coupled surfaces, the proposed landmarking method constructed models that had better generalization ability and specificity. The segmentation accuracies were measured by the Dice coefficient and the set difference, and compared with the method based on mathematical morphology operations. The proposed approach using the active shape model based on the statistical skull anatomy presented in the head CT data contributes to more reliable segmentation of the skull from MRI data.
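    The point-distribution model underlying such an active shape approach can be sketched with PCA over aligned landmark vectors (toy ellipse shapes here, not skull CT landmarks):

```python
import numpy as np

# Minimal point-distribution (active shape) model: PCA of aligned landmark
# vectors gives a mean shape plus principal modes of variation; new shapes
# are expressed as mean + P @ b with mode weights b.  Toy ellipses, not CT.
rng = np.random.default_rng(9)
t = np.linspace(0, 2 * np.pi, 30, endpoint=False)
shapes = np.stack([np.concatenate([(1.0 + 0.1 * rng.normal()) * np.cos(t),
                                   (0.6 + 0.1 * rng.normal()) * np.sin(t)])
                   for _ in range(50)])              # 50 training shapes
mean = shapes.mean(axis=0)
U, S, Vt = np.linalg.svd(shapes - mean, full_matrices=False)
P = Vt[:2].T                                         # two dominant modes
explained = (S[:2] ** 2).sum() / (S ** 2).sum()
print(f"variance explained by 2 modes: {explained:.3f}")

b = P.T @ (shapes[0] - mean)                         # fit modes to a shape
recon = mean + P @ b                                 # model-constrained shape
```

    During segmentation, only the mode weights b (typically bounded to a few standard deviations) are adjusted, so the located contour stays within the statistically plausible shape space learned from the CT data.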

  7. Power analysis as a tool to identify statistically informative indicators for monitoring coral reef disturbances.

    Science.gov (United States)

    Van Wynsberge, Simon; Gilbert, Antoine; Guillemot, Nicolas; Heintz, Tom; Tremblay-Boyer, Laura

    2017-07-01

    Extensive biological field surveys are costly and time consuming. To optimize sampling and ensure regular monitoring on the long term, identifying informative indicators of anthropogenic disturbances is a priority. In this study, we built 1800 candidate indicators by combining metrics measured from coral, fish, and macro-invertebrate assemblages surveyed from 2006 to 2012 in the vicinity of an ongoing mining project in the Voh-Koné-Pouembout lagoon, New Caledonia. We performed a power analysis to identify a subset of indicators which would best discriminate temporal changes due to a simulated chronic anthropogenic impact. Only 4% of tested indicators were likely to detect a 10% annual decrease of values with sufficient power (>0.80). Corals generally yielded higher statistical power than macro-invertebrates and fishes because of lower natural variability and higher occurrence. For the same reasons, higher taxonomic ranks provided higher power than lower taxonomic ranks. Nevertheless, a number of families of common sedentary or sessile macro-invertebrates and fishes also performed well in detecting changes: Echinometridae, Isognomidae, Muricidae, Tridacninae, Arcidae, and Turbinidae for macro-invertebrates and Pomacentridae, Labridae, and Chaetodontidae for fishes. Interestingly, these families did not provide high power in all geomorphological strata, suggesting that the ability of indicators in detecting anthropogenic impacts was closely linked to reef geomorphology. This study provides a first operational step toward identifying statistically relevant indicators of anthropogenic disturbances in New Caledonia's coral reefs, which can be useful in similar tropical reef ecosystems where little information is available regarding the responses of ecological indicators to anthropogenic disturbances.
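    The power criterion used above, the probability of detecting a simulated 10% annual decline, can be illustrated with a small simulation (noise levels and test choice are hypothetical, not the study's method):

```python
import random
import statistics

# Illustrative power simulation (hypothetical noise levels, not the survey
# data): what fraction of 7-year series with a 10% annual decline shows a
# significant negative trend under a one-sided permutation test?
random.seed(5)

def slope(xs, ys):
    mx, my = statistics.mean(xs), statistics.mean(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

def significant_decline(ys, n_perm=200):
    xs = list(range(len(ys)))
    obs = slope(xs, ys)
    hits = sum(slope(xs, random.sample(ys, len(ys))) <= obs
               for _ in range(n_perm))
    return (hits + 1) / (n_perm + 1) < 0.05   # one-sided permutation p-value

def power(cv, n_sim=200):
    """Fraction of simulated 7-year surveys detecting the decline."""
    detected = sum(
        significant_decline([100 * 0.9 ** t * random.lognormvariate(0, cv)
                             for t in range(7)])
        for _ in range(n_sim))
    return detected / n_sim

p_low, p_high = power(0.05), power(0.5)
print(f"power, low noise: {p_low:.2f}; high noise: {p_high:.2f}")
```

    High natural variability (the high-noise case) is exactly why most of the 1800 candidate indicators failed to reach the 0.80 power threshold.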

  8. When Statistical Literacy Really Matters: Understanding Published Information about the HIV/AIDS Epidemic in South Africa

    Science.gov (United States)

    Hobden, Sally

    2014-01-01

    Information on the HIV/AIDS epidemic in Southern Africa is often interpreted through a veil of secrecy and shame and, I argue, with flawed understanding of basic statistics. This research determined the levels of statistical literacy evident in 316 future Mathematical Literacy teachers' explanations of the median in the context of HIV/AIDS…

  9. Quality of mortality statistics' information: garbage codes as causes of death in Belo Horizonte, 2011-2013.

    Science.gov (United States)

    Ishitani, Lenice Harumi; Teixeira, Renato Azeredo; Abreu, Daisy Maria Xavier; Paixão, Lucia Maria Miana Mattos; França, Elisabeth Barboza

    2017-05-01

    To assess the quality of mortality information by analyzing the frequency of garbage codes (GC) registered as underlying cause-of-death in Belo Horizonte, Minas Gerais, Brazil. Data of deaths of residents from 2011 to 2013 were selected. GC causes were classified as proposed by the Global Burden of Disease Study (GBD) 2015. They were grouped into GCs from ICD-10 Chapter XVIII and GCs excluding codes of Chapter XVIII. Proportions of GC were calculated by sex, age, and place of occurrence. In Belo Horizonte, 30.5% of the total of 44,123 deaths were GC. A higher proportion of these codes was observed in children (1 to 4 years) and in people aged over 60 years. The leading GCs observed were: other ill-defined and unspecified causes of death (code R99), unspecified pneumonia (J18.9), unspecified stroke (hemorrhagic or ischemic) (I64), and unspecified septicemia (A41.9). The proportions of GC were 28.7% and 36.9% in deaths that occurred in hospitals and at home, respectively. An important difference occurred in the GC group from Chapter XVIII of ICD-10: 1.7% occurred in hospitals and 16.9% at home. The high proportions of GC in mortality statistics in Belo Horizonte demonstrated its importance for assessing the quality of information on causes of death.

  10. Application of statistical machine translation to public health information: a feasibility study.

    Science.gov (United States)

    Kirchhoff, Katrin; Turner, Anne M; Axelrod, Amittai; Saavedra, Francisco

    2011-01-01

    Accurate, understandable public health information is important for ensuring the health of the nation. The large portion of the US population with Limited English Proficiency is best served by translations of public-health information into other languages. However, a large number of health departments and primary care clinics face significant barriers to fulfilling federal mandates to provide multilingual materials to Limited English Proficiency individuals. This article presents a pilot study on the feasibility of using freely available statistical machine translation technology to translate health promotion materials. The authors gathered health-promotion materials in English from local and national public-health websites. Spanish versions were created by translating the documents using a freely available machine-translation website. Translations were rated for adequacy and fluency, analyzed for errors, manually corrected by a human posteditor, and compared with exclusively manual translations. Machine translation plus postediting took 15-53 min per document, compared to the reported days or even weeks for the standard translation process. A blind comparison of machine-assisted and human translations of six documents revealed overall equivalency between machine-translated and manually translated materials. The analysis of translation errors indicated that the most important errors were word-sense errors. The results indicate that machine translation plus postediting may be an effective method of producing multilingual health materials with equivalent quality but lower cost compared to manual translations.

  11. Application of statistical machine translation to public health information: a feasibility study

    Science.gov (United States)

    Turner, Anne M; Axelrod, Amittai; Saavedra, Francisco

    2011-01-01

    Objective Accurate, understandable public health information is important for ensuring the health of the nation. The large portion of the US population with Limited English Proficiency is best served by translations of public-health information into other languages. However, a large number of health departments and primary care clinics face significant barriers to fulfilling federal mandates to provide multilingual materials to Limited English Proficiency individuals. This article presents a pilot study on the feasibility of using freely available statistical machine translation technology to translate health promotion materials. Design The authors gathered health-promotion materials in English from local and national public-health websites. Spanish versions were created by translating the documents using a freely available machine-translation website. Translations were rated for adequacy and fluency, analyzed for errors, manually corrected by a human posteditor, and compared with exclusively manual translations. Results Machine translation plus postediting took 15–53 min per document, compared to the reported days or even weeks for the standard translation process. A blind comparison of machine-assisted and human translations of six documents revealed overall equivalency between machine-translated and manually translated materials. The analysis of translation errors indicated that the most important errors were word-sense errors. Conclusion The results indicate that machine translation plus postediting may be an effective method of producing multilingual health materials with equivalent quality but lower cost compared to manual translations. PMID:21498805

  12. The duration of uncertain times: audiovisual information about intervals is integrated in a statistically optimal fashion.

    Directory of Open Access Journals (Sweden)

    Jess Hartcher-O'Brien

    Often multisensory information is integrated in a statistically optimal fashion where each sensory source is weighted according to its precision. This integration scheme is statistically optimal because it theoretically results in unbiased perceptual estimates with the highest precision possible. There is a current lack of consensus about how the nervous system processes multiple sensory cues to elapsed time. In order to shed light upon this, we adopt a computational approach to pinpoint the integration strategy underlying duration estimation of audio/visual stimuli. One of the assumptions of our computational approach is that the multisensory signals redundantly specify the same stimulus property. Our results clearly show that, despite claims to the contrary, perceived duration is the result of an optimal weighting process, similar to that adopted for estimates of space. That is, participants weight the audio and visual information to arrive at the most precise, single duration estimate possible. The work also disentangles how different integration strategies, i.e. considering the time of onset/offset of signals, might alter the final estimate. As such we provide the first concrete evidence of an optimal integration strategy in human duration estimates.
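    The optimal (maximum-likelihood) weighting scheme described above has a closed form: each cue is weighted by its inverse variance. The numbers below are illustrative, not the paper's data:

```python
# Reliability-weighted (maximum-likelihood) cue integration: each estimate
# is weighted by its inverse variance, giving the lowest-variance fused
# estimate.  Illustrative values, not the paper's measurements.
def integrate(est_a, var_a, est_v, var_v):
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_v)   # auditory weight
    fused = w_a * est_a + (1.0 - w_a) * est_v
    fused_var = 1.0 / (1.0 / var_a + 1.0 / var_v)       # always <= both
    return fused, fused_var

# auditory duration 500 ms (var 100 ms^2), visual 560 ms (var 400 ms^2)
dur, var = integrate(est_a=500.0, var_a=100.0, est_v=560.0, var_v=400.0)
print(dur, var)
```

    With these numbers the auditory weight is 0.8, so the fused estimate (512 ms) lies nearer the more reliable auditory cue, and the fused variance (80 ms²) is lower than either cue alone, which is the signature of optimal integration the paper tests for.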

  13. Mathematical model of statistical identification of information support of road transport

    Directory of Open Access Journals (Sweden)

    V. G. Kozlov

    2016-01-01

    In this paper, based on the statistical identification method and the theory of self-organizing systems, a multifactor model of the relationship between road transport and the training system is built. Background information for the model is represented by a number of parameters of average annual road transport operations and information provision, including training complex system parameters (inputs), road management, and output parameters. Two criteria are set: a model stability criterion and a correlation test. The program determines their minimum, which identifies the unique model of optimal complexity. For the predetermined number of parameters, a mathematical relationship of each output parameter with the others is established. To improve the accuracy and regularity of the forecast, interpolation nodes are allocated in the test data sequence; the other data form the training sequence. The model is chosen on the principle of selection: it is run with a gradually more complex mathematical description and an exhaustive search of all possible variants of the models against the specified criteria. Advantages of the proposed model: it adequately reflects the actual process; it allows any additional input parameters to be introduced and their impact on individual output parameters of road transport to be determined; it allows the values of key parameters to be changed in turn in a certain ratio and the corresponding changes in the output parameters of road transport to be determined; and it allows the output parameters of road transport operations to be predicted.

  14. Statistical shape and texture model of quadrature phase information for prostate segmentation.

    Science.gov (United States)

    Ghose, Soumya; Oliver, Arnau; Martí, Robert; Lladó, Xavier; Freixenet, Jordi; Mitra, Jhimli; Vilanova, Joan C; Comet-Batlle, Josep; Meriaudeau, Fabrice

    2012-01-01

    Prostate volume estimation from segmentation of transrectal ultrasound (TRUS) images aids in diagnosis and treatment of prostate hypertrophy and cancer. Computer-aided accurate and computationally efficient prostate segmentation in TRUS images is a challenging task, owing to low signal-to-noise ratio, speckle noise, calcifications, and heterogeneous intensity distribution in the prostate region. A multi-resolution framework using texture features in a parametric deformable statistical model of shape and appearance was developed to segment the prostate. Local phase information of log-Gabor quadrature filter extracted texture of the prostate region in TRUS images. Large bandwidth of log-Gabor filter ensures easy estimation of local orientations, and zero response for a constant signal provides invariance to gray level shift. This aids in enhanced representation of the underlying texture information of the prostate unaffected by speckle noise and imaging artifacts. The parametric model of the propagating contour is derived from principal component analysis of prior shape and texture information of the prostate from the training data. The parameters were modified using prior knowledge of the optimization space to achieve segmentation. The proposed method achieves a mean Dice similarity coefficient value of 0.95 ± 0.02 and mean absolute distance of 1.26 ± 0.51 millimeter when validated with 24 TRUS images of 6 data sets in a leave-one-patient-out validation framework. The proposed method for prostate TRUS image segmentation is computationally efficient and provides accurate prostate segmentations in the presence of intensity heterogeneities and imaging artifacts.

  15. A Catalogue of Data in the Statistical Information Centre, March 1976. (Catalogue de donnees du Centre d'information statistique, Mars 1976.)

    Science.gov (United States)

    Department of Indian Affairs and Northern Development, Ottawa (Ontario).

    Over 189 materials which cover aspects of the Administration, Parks Canada, Indian and Eskimo Affairs, and Northern Development Programs are cited in this bilingual catalogue (English and French). Information given for each entry is: reference number, statistics available, years covered, and whether the statistics are available by area, region,…

  16. A hybrid downscaling using statistical correction and high resolution regional climate model information

    Science.gov (United States)

    Wakazuki, Y.

    2017-12-01

    The author presented an outline of a statistical downscaling method using high-resolution regional climate model simulation results, called hybrid downscaling, at the AGU Fall Meeting 2016; this presentation is its extension. Statistical downscaling can be computed at low computational cost for the many patterns of future climate states that are needed to estimate the uncertainty of regional climate change. However, its accuracy is low in regions where the observation density is low. Dynamical downscaling using a regional climate model (RCM), on the other hand, incurs huge computational costs, but climatological features are well reproduced even where observations are sparse. I propose a method that compensates for the disadvantages of both statistical and dynamical downscaling within hybrid downscaling. The downscaling process is divided into horizontal interpolation (HI) and bias correction (BC). In HI, middle-resolution multi-RCM simulation results are interpolated to high-resolution data with grid sizes of 1-2 km. The HI model for climatological variables such as mean precipitation and temperature is trained using the high-resolution dynamical downscaling result. In BC, the correction ratio/difference of the high-resolution data is estimated by a generalized linear model with geographical elements as predictors. In this method, the spatial distribution is largely influenced by the high-resolution RCM result. The hybrid downscaling model has been applied to regional climate model simulations targeting the region around Japan. Multiple future climate simulations were performed with 24 and 6 km grid sizes to cover the uncertainty, but only two climate simulations were calculated with a 2 km grid size because of the huge computational costs. To estimate 2 km grid information, two kinds of hybrid downscaling, using the 24 and 6 km RCM results as the middle-resolution RCMs, were performed. The

  17. The obtaining of statistical characteristics of informative features of signals in the Autonomous information systems using neural networks

    Directory of Open Access Journals (Sweden)

    V. K. Hohlov

    2014-01-01

Full Text Available The article studies a neural network approach to obtaining the statistical characteristics of input-vector implementations of signal and noise under ill-conditioned matrices of correlation moments, in order to select and reduce the dimension of vectors of informative features for detection and recognition of signals and noise based on regression methods. The scientific novelty lies in applying neural network algorithms to the efficient selection of informative features and the determination of the parameters of regression algorithms under degenerate or ill-conditioned data with unknown expectations and covariance matrices. The article proposes to use a single-layer neural network with non-zero weights and activation functions to calculate the initial regression characteristics and the mean-square error of multiple initial regression representations, which are necessary to justify the selection of informative features, reduce the dimension of feature vectors, and implement the regression algorithms. It is shown that when direct links between the inputs and their corresponding neurons are excluded, the weight coefficients of the neuron inputs in the trained network are the coefficients of initial multiple regression, and the mean-square error of the multiple initial regression representations is calculated at the outputs of the neurons. The article considers the conditioning of the problem of inverting the matrix of correlation moments. It defines a condition number, which characterizes the relative error of the stated task. The problem of the conditioning of the matrix of correlation moments of informative signal features and noise arises when finding the multiple coefficients of initial regression (MCIR) and the residual mean-square values of the multiple regression representations.
For obtaining the MCIR and finding the residual mean-square values, the matrix of correlation moments of
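The abstract's central claim, that the trained weights of a linear single-layer network equal the multiple-regression coefficients, can be sketched on synthetic data (all values below are illustrative, not from the paper): gradient descent on a single linear neuron with squared-error loss converges to the ordinary least-squares solution.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 3
X = rng.normal(size=(n, p))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=n)

# Closed-form multiple-regression (OLS) solution
w_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Single-layer "network": one linear neuron, squared-error loss, gradient descent
w = np.zeros(p)
lr = 0.05
for _ in range(2000):
    grad = X.T @ (X @ w - y) / n
    w -= lr * grad

print(np.allclose(w, w_ols, atol=1e-3))  # the two solutions coincide
```

This equivalence is why the network outputs can also report the residual mean-square error of the regression representation directly.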

  18. System of National Accounts as an Information Base for Tax Statistics

    Directory of Open Access Journals (Sweden)

    A. E. Lyapin

    2017-01-01

Full Text Available The article is devoted to those aspects of the system of national accounts which together serve as the information base of tax statistics. In our time, the tax system is one of the main subjects of discussion about the methods and directions of its reform. Taxes are one of the main factors of regulation of the economy and act as an incentive for its development. Analysis of tax revenues to the budgets of different levels makes it possible to assess tax collection and the tax burden for various industries. From the amount of tax revenue it is possible to judge the scale of reproductive processes in the country. It should be noted that taxes are treated in a special way in the SNA. As mentioned earlier, in the SNA taxes on products are treated as income. At the same time, most economists prefer to consider them as consumption taxes, and taxes on various financial transactions (for example, taxes on the purchase or sale of securities) are treated as taxes on production, including cases where no services are provided. It would be rational to revise and amend the SNA provisions associated with the interpretation of taxes and subsidies, to ensure better understanding and compliance with user needs. Taxes are an integral part of any state and an indispensable element of the economic relations of any society. In turn, taxes and the budget are inextricably linked, as these relations have a clearly expressed, objective bilateral character. Taxes are the main group of budget revenues, which makes it possible to finance all government agencies and expenditure items, as well as to subsidize the institutional units that make up the SNA sector "non-financial corporations". The other side of the story is that taxes are a part of the money taken from producers and households. The total mass of taxes depends on the composition of taxes, tax rates, the tax base and the scope of benefits. The bulk of tax revenues also depends on possible changes in

  19. Statistical Properties of Maximum Likelihood Estimators of Power Law Spectra Information

    Science.gov (United States)

    Howell, L. W., Jr.

    2003-01-01

A simple power law model consisting of a single spectral index, sigma(sub 1), is believed to be an adequate description of the galactic cosmic-ray (GCR) proton flux at energies below 10(exp 13) eV, with a transition at the knee energy, E(sub k), to a steeper spectral index sigma(sub 2) greater than sigma(sub 1) above E(sub k). The maximum likelihood (ML) procedure was developed for estimating the single parameter sigma(sub 1) of a simple power law energy spectrum and generalized to estimate the three spectral parameters of the broken power law energy spectrum from simulated detector responses and real cosmic-ray data. The statistical properties of the ML estimator were investigated and shown to have the three desirable properties: (P1) consistency (asymptotically unbiased), (P2) efficiency (asymptotically attains the Cramer-Rao minimum variance bound), and (P3) asymptotic normality, under a wide range of potential detector response functions. Attainment of these properties necessarily implies that the ML estimation procedure provides the best unbiased estimator possible. While simulation studies can easily determine whether a given estimation procedure provides an unbiased estimate of the spectral information, and whether or not the estimator is approximately normally distributed, attainment of the Cramer-Rao bound (CRB) can only be ascertained by calculating the CRB for an assumed energy spectrum-detector response function combination, which can be quite formidable in practice. However, the effort in calculating the CRB is very worthwhile because it provides the necessary means to compare the efficiency of competing estimation techniques and, furthermore, provides a stopping rule in the search for the best unbiased estimator. Consequently, the CRB for both the simple and broken power law energy spectra are derived herein and the conditions under which they are attained in practice are investigated.
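For the simple (single-index) case, the ML estimator has a well-known closed form: for events E_i above a threshold E_min drawn from a flux proportional to E^(-sigma), the estimate is sigma_hat = 1 + n / sum(ln(E_i / E_min)), the classic Hill-type estimator. A minimal sketch (synthetic data, not the paper's detector-response simulation):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma_true, E_min, n = 2.7, 1.0, 100_000

# Inverse-CDF sampling from the power-law energy spectrum ~ E^(-sigma)
u = rng.uniform(size=n)
E = E_min * (1.0 - u) ** (-1.0 / (sigma_true - 1.0))

# Closed-form ML estimate of the spectral index
sigma_hat = 1.0 + n / np.sum(np.log(E / E_min))
print(sigma_hat)  # ~2.7 up to sampling error, illustrating consistency (P1)
```

The standard error of this estimator, (sigma - 1)/sqrt(n), attains the Cramer-Rao bound asymptotically, which is property (P2) in the record above.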

  20. Inclusion probability for DNA mixtures is a subjective one-sided match statistic unrelated to identification information

    Directory of Open Access Journals (Sweden)

    Mark William Perlin

    2015-01-01

Full Text Available Background: DNA mixtures of two or more people are a common type of forensic crime scene evidence. A match statistic that connects the evidence to a criminal defendant is usually needed for court. Jurors rely on this strength of match to help decide guilt or innocence. However, the reliability of unsophisticated match statistics for DNA mixtures has been questioned. Materials and Methods: The most prevalent match statistic for DNA mixtures is the combined probability of inclusion (CPI), used by crime labs for over 15 years. When testing 13 short tandem repeat (STR) genetic loci, the CPI^-1 value is typically around a million, regardless of DNA mixture composition. However, actual identification information, as measured by a likelihood ratio (LR), spans a much broader range. This study examined probability of inclusion (PI) mixture statistics for 517 locus experiments drawn from 16 reported cases and compared them with LR locus information calculated independently on the same data. The log(PI^-1) values were examined and compared with corresponding log(LR) values. Results: The LR and CPI methods were compared in case examples of false inclusion, false exclusion, a homicide, and criminal justice outcomes. Statistical analysis of crime laboratory STR data shows that inclusion match statistics exhibit a truncated normal distribution having zero center, with little correlation to actual identification information. By the law of large numbers (LLN), CPI^-1 increases with the number of tested genetic loci, regardless of DNA mixture composition or match information. These statistical findings explain why CPI is relatively constant, with implications for DNA policy, criminal justice, cost of crime, and crime prevention. Conclusions: Forensic crime laboratories have generated CPI statistics on hundreds of thousands of DNA mixture evidence items. 
However, this commonly used match statistic behaves like a random generator of inclusionary values, following the LLN
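The multiplicative growth of CPI^-1 with locus count can be illustrated with made-up allele frequencies (these numbers are ours, not case data): at each locus the probability of inclusion is PI = (sum of observed allele frequencies)^2, and the combined statistic is the product over loci, so CPI^-1 grows with every locus tested regardless of how informative the match actually is.

```python
# Frequencies of alleles observed in the mixture, per locus (illustrative)
loci = [
    [0.20, 0.15, 0.10],
    [0.25, 0.10],
    [0.30, 0.20, 0.05],
] * 4  # pretend 12 loci by repetition

cpi = 1.0
for freqs in loci:
    pi = sum(freqs) ** 2  # probability a random person is "included" at this locus
    cpi *= pi

print(f"CPI^-1 = {1 / cpi:.3g}")  # grows multiplicatively with locus count
```

With these three locus patterns repeated four times, CPI^-1 already exceeds 10^8, which is the LLN behaviour the study highlights.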

  1. Components of spatial information management in wildlife ecology: Software for statistical and modeling analysis [Chapter 14

    Science.gov (United States)

    Hawthorne L. Beyer; Jeff Jenness; Samuel A. Cushman

    2010-01-01

    Spatial information systems (SIS) is a term that describes a wide diversity of concepts, techniques, and technologies related to the capture, management, display and analysis of spatial information. It encompasses technologies such as geographic information systems (GIS), global positioning systems (GPS), remote sensing, and relational database management systems (...

  2. Using Information Technology in Teaching of Business Statistics in Nigeria Business School

    Science.gov (United States)

    Hamadu, Dallah; Adeleke, Ismaila; Ehie, Ike

    2011-01-01

    This paper discusses the use of Microsoft Excel software in the teaching of statistics in the Faculty of Business Administration at the University of Lagos, Nigeria. Problems associated with existing traditional methods are identified and a novel pedagogy using Excel is proposed. The advantages of using this software over other specialized…

  3. Information capacity and pattern formation in a tent map network featuring statistical periodicity

    Science.gov (United States)

    Hauptmann, C.; Touchette, H.; Mackey, M. C.

    2003-02-01

    We provide quantitative support to the observation that lattices of coupled maps are “efficient” information coding devices. It has been suggested recently that lattices of coupled maps may provide a model of information coding in the nervous system because of their ability to create structured and stimulus-dependent activity patterns which have the potential to be used for storing information. In this paper, we give an upper bound to the effective number of patterns that can be used to store information in the lattice by evaluating numerically its information capacity or information rate as a function of the coupling strength between the maps. We also estimate the time taken by the lattice to establish a limiting activity pattern.
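A toy version of the pattern-counting idea (all parameters here are our own choices, not the paper's): iterate a ring of diffusively coupled tent maps from many random initial conditions, binarize the limiting state, and count the distinct activity patterns as a crude stand-in for the lattice's effective pattern capacity.

```python
import numpy as np

def tent(x, a=2.0):
    """Tent map on [0, 1]."""
    return a * np.minimum(x, 1.0 - x)

def limiting_pattern(rng, n=16, eps=0.3, steps=500):
    """Iterate the coupled lattice and return the binarized limiting state."""
    x = rng.uniform(size=n)
    for _ in range(steps):
        fx = tent(x)
        # nearest-neighbour diffusive coupling on a ring
        x = (1 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))
    return tuple(x > 0.5)

rng = np.random.default_rng(2)
patterns = {limiting_pattern(rng) for _ in range(200)}
print(len(patterns))  # number of distinct activity patterns observed
```

The paper's information capacity is an upper bound computed over such stimulus-dependent patterns as a function of the coupling strength eps; this sketch only counts the patterns one run happens to visit.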

  4. Statistical Inference on Memory Structure of Processes and Its Applications to Information Theory

    Science.gov (United States)

    2016-05-12

these basic neighborhoods are called context blocks . The context blocks may be infinite. Statistical estimation of the context block system of a random ...strong consistency of the estimator. In the special case when all the context blocks lie in a uniform finite basic neighborhood, therefore the random ...

  5. The effect of interspike interval statistics on the information gain under the rate coding hypothesis

    Czech Academy of Sciences Publication Activity Database

    Koyama, S.; Košťál, Lubomír

    2014-01-01

    Roč. 11, č. 1 (2014), s. 63-80 ISSN 1547-1063. [International Workshop on Neural Coding (NC) /10./. Praha, 02.09.2012-07.09.2012] R&D Projects: GA ČR(CZ) GPP103/12/P558 Institutional support: RVO:67985823 Keywords : Kullback-Leibler divergence * neural spike trains * Fisher information Subject RIV: BD - Theory of Information Impact factor: 0.840, year: 2014

  6. Current practices in spatial analysis of cancer data: mapping health statistics to inform policymakers and the public.

    Science.gov (United States)

    Bell, B Sue; Hoskins, Richard E; Pickle, Linda Williams; Wartenberg, Daniel

    2006-11-08

To communicate population-based cancer statistics, cancer researchers have a long tradition of presenting data in a spatial representation, or map. Historically, health data were presented in printed atlases in which the map producer selected the content and format. The availability of geographic information systems (GIS) with comprehensive mapping and spatial analysis capability for desktop and Internet mapping has greatly expanded the number of producers and consumers of health maps, including policymakers and the public. Because health maps, particularly ones that show elevated cancer rates, historically have raised public concerns, it is essential that these maps be designed to be accurate, clear, and interpretable for the broad range of users who may view them. This article focuses on designing maps to communicate effectively. It is based on years of research into the use of health maps for communicating among public health researchers. The basics for designing maps that communicate effectively are similar to the basics for any mode of communication. Tasks include deciding on the purpose, knowing the audience and its characteristics, choosing a medium suitable for both the purpose and the audience, and finally testing the map design to ensure that it suits the purpose with the intended audience, and communicates accurately and effectively. Special considerations for health maps include ensuring confidentiality and reflecting the uncertainty of small area statistics. Statistical maps need to be based on sound practices and principles developed by the statistical and cartographic communities. The biggest challenge is to ensure that maps of health statistics inform without misinforming. Advances in the sciences of cartography, statistics, and visualization of spatial data are constantly expanding the toolkit available to mapmakers to meet this challenge. 
Asking potential users to answer questions or to talk about what they see is still the best way to evaluate the

  7. Current practices in spatial analysis of cancer data: mapping health statistics to inform policymakers and the public

    Directory of Open Access Journals (Sweden)

    Wartenberg Daniel

    2006-11-01

Full Text Available Abstract Background To communicate population-based cancer statistics, cancer researchers have a long tradition of presenting data in a spatial representation, or map. Historically, health data were presented in printed atlases in which the map producer selected the content and format. The availability of geographic information systems (GIS) with comprehensive mapping and spatial analysis capability for desktop and Internet mapping has greatly expanded the number of producers and consumers of health maps, including policymakers and the public. Because health maps, particularly ones that show elevated cancer rates, historically have raised public concerns, it is essential that these maps be designed to be accurate, clear, and interpretable for the broad range of users who may view them. This article focuses on designing maps to communicate effectively. It is based on years of research into the use of health maps for communicating among public health researchers. Results The basics for designing maps that communicate effectively are similar to the basics for any mode of communication. Tasks include deciding on the purpose, knowing the audience and its characteristics, choosing a medium suitable for both the purpose and the audience, and finally testing the map design to ensure that it suits the purpose with the intended audience, and communicates accurately and effectively. Special considerations for health maps include ensuring confidentiality and reflecting the uncertainty of small area statistics. Statistical maps need to be based on sound practices and principles developed by the statistical and cartographic communities. Conclusion The biggest challenge is to ensure that maps of health statistics inform without misinforming. Advances in the sciences of cartography, statistics, and visualization of spatial data are constantly expanding the toolkit available to mapmakers to meet this challenge. 
Asking potential users to answer questions or to talk

  8. TECHNIQUE OF CARRYING OUT THE PRACTICAL TRAINING ON MATHEMATICAL STATISTICS ABOUT USE OF INFORMATION TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    В А Бубнов

    2017-12-01

Full Text Available The article presents the content of a technique for computer-lab instruction, using as an example the statistical analysis of the dollar-to-ruble exchange rate during March 2017 with the Microsoft Excel program. Starting from the usual data describing the dynamics of the dollar price by date, this analysis identifies the days of the month on which the price is grouped around the average dollar price, and also reveals the so-called rare days on which the dollar price differs strongly from the average, both downward and upward.

  9. Improvement of Information and Methodical Provision of Macro-economic Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Tiurina Dina M.

    2014-02-01

Full Text Available The article generalises and analyses the main shortcomings of the modern system of macro-statistical analysis based on the use of the system of national accounts and the balance of the national economy. On the basis of a historical analysis of the formation of the indicators of the system of national accounts, the article shows that the problems with its practical use have both regional and global causes. To overcome the current system's inability to account for quality of life, the article offers a system of quality indicators based on the general perception of wellbeing as the population's confidence in its own solvency and a representative sampling of economic subjects.

  10. English Collocation Learning through Corpus Data: On-Line Concordance and Statistical Information

    Science.gov (United States)

    Ohtake, Hiroshi; Fujita, Nobuyuki; Kawamoto, Takeshi; Morren, Brian; Ugawa, Yoshihiro; Kaneko, Shuji

    2012-01-01

    We developed an English Collocations On Demand system offering on-line corpus and concordance information to help Japanese researchers acquire a better command of English collocation patterns. The Life Science Dictionary Corpus consists of approximately 90,000,000 words collected from life science related research papers published in academic…

  11. 76 FR 28421 - Proposed Information Collection; Comment Request; Marine Recreational Fisheries Statistics Survey

    Science.gov (United States)

    2011-05-17

    .... Studies will test the effectiveness of alternative sample frames and data collection methods for... Collection Information will be collected through telephone and mail interviews. III. Data OMB Control Number... improved data collection program for recreational fisheries. To meet the requirements of MSA, NOAA...

  12. Compendium of Statistical and Financial Information: Ontario Universities, 2001-02.

    Science.gov (United States)

    Council of Ontario Universities, Toronto.

This compendium presents data about aspects of the Ontario University System, Canada. It is a companion to the "Financial Report of Ontario Universities," the annual series of volumes prepared under the auspices of the Council of Financial Officers - Universities of Ontario (COFO-UO). The Compendium contains supplementary information on…

  13. Validation of survey information on smoking and alcohol consumption against import statistics, Greenland 1993-2010

    DEFF Research Database (Denmark)

    Bjerregaard, Peter; Becker, Ulrik

    2013-01-01

    Questionnaires are widely used to obtain information on health-related behaviour, and they are more often than not the only method that can be used to assess the distribution of behaviour in subgroups of the population. No validation studies of reported consumption of tobacco or alcohol have been...

  14. Statistical Distances and Their Applications to Biophysical Parameter Estimation: Information Measures, M-Estimates, and Minimum Contrast Methods

    Directory of Open Access Journals (Sweden)

    Peter R. J. North

    2013-03-01

Full Text Available Radiative transfer models predicting the bidirectional reflectance factor (BRF) of leaf canopies are powerful tools that relate biophysical parameters such as leaf area index (LAI), fractional vegetation cover (fV) and the fraction of photosynthetically active radiation absorbed by the green parts of the vegetation canopy (fAPAR) to remotely sensed reflectance data. One of the most successful approaches to biophysical parameter estimation is the inversion of detailed radiative transfer models through the construction of Look-Up Tables (LUTs). The solution of the inverse problem requires additional information on canopy structure, soil background and leaf properties, and the relationships between these parameters and the measured reflectance data are often nonlinear. The commonly used approach for optimization of a solution is based on minimization of the least squares estimate between model and observations (referred to as cost function or distance; here we will also use the terms “statistical distance”, “divergence” or “metric”, which are common in the statistical literature). This paper investigates how least-squares minimization and alternative distances affect the solution to the inverse problem. The paper provides a comprehensive list of different cost functions from the statistical literature, which can be divided into three major classes: information measures, M-estimates and minimum contrast methods. We found that, for the conditions investigated, Least Squares Estimation (LSE) is not an optimal statistical distance for the estimation of biophysical parameters. Our results indicate that other statistical distances, such as the two power measures, Hellinger, Pearson chi-squared measure, Arimoto and Koenker–Basset distances, result in better estimates of biophysical parameters than LSE; in some cases the parameter estimation was improved by 15%.
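The LUT-inversion workflow can be sketched with a made-up one-parameter reflectance model (the `toy_brf` function and all numbers below are ours, not a real radiative transfer code): build a table of simulated spectra, then retrieve the parameter by minimizing either the least-squares or the Hellinger-type distance to a noisy observation.

```python
import numpy as np

def toy_brf(lai, wavelengths):
    # hypothetical monotone saturating reflectance response, for illustration only
    return 0.05 + 0.4 * (1.0 - np.exp(-0.5 * lai)) * wavelengths

wl = np.linspace(0.4, 1.0, 8)           # pretend spectral bands
lut_lai = np.linspace(0.0, 6.0, 301)    # candidate parameter values
lut = np.array([toy_brf(l, wl) for l in lut_lai])

rng = np.random.default_rng(3)
obs = toy_brf(2.5, wl) + 0.005 * rng.normal(size=wl.size)  # noisy "measurement"

# Two cost functions over the LUT: least squares and a Hellinger-type distance
lse = np.sum((lut - obs) ** 2, axis=1)
hellinger = np.sum((np.sqrt(lut) - np.sqrt(np.abs(obs))) ** 2, axis=1)

lai_lse = lut_lai[np.argmin(lse)]
lai_hel = lut_lai[np.argmin(hellinger)]
print(lai_lse, lai_hel)  # both retrievals land near the true value 2.5
```

With this benign toy model the two distances agree; the paper's point is that under realistic noise and model error the choice of distance changes the quality of the retrieval.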

  15. Statistical mechanics of consciousness: Maximization of information content of network is associated with conscious awareness

    Science.gov (United States)

    Guevara Erra, R.; Mateos, D. M.; Wennberg, R.; Perez Velazquez, J. L.

    2016-11-01

    It is said that complexity lies between order and disorder. In the case of brain activity and physiology in general, complexity issues are being considered with increased emphasis. We sought to identify features of brain organization that are optimal for sensory processing, and that may guide the emergence of cognition and consciousness, by analyzing neurophysiological recordings in conscious and unconscious states. We find a surprisingly simple result: Normal wakeful states are characterized by the greatest number of possible configurations of interactions between brain networks, representing highest entropy values. Therefore, the information content is larger in the network associated to conscious states, suggesting that consciousness could be the result of an optimization of information processing. These findings help to guide in a more formal sense inquiry into how consciousness arises from the organization of matter.
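A back-of-envelope sketch of the entropy notion involved (our simplification, not the authors' analysis pipeline): with N recorded signals there are P = N(N-1)/2 possible pairwise interactions; if k of them are "connected", the number of configurations is C(P, k), and the entropy S = ln C(P, k) peaks at intermediate connectivity, between order and disorder.

```python
from math import comb, log

N = 10                    # hypothetical number of recorded brain signals
P = N * (N - 1) // 2      # possible pairwise interactions

# Entropy of the configuration count as a function of connected pairs k
entropy = [log(comb(P, k)) for k in range(P + 1)]
k_max = max(range(P + 1), key=lambda k: entropy[k])
print(k_max, P // 2)  # entropy is maximal when half the pairs are connected
```

This is the combinatorial sense in which the "greatest number of possible configurations" in the abstract corresponds to the highest entropy values.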

  16. Information-theory-based solution of the inverse problem in classical statistical mechanics.

    Science.gov (United States)

    D'Alessandro, Marco; Cilloco, Francesco

    2010-08-01

    We present a procedure for the determination of the interaction potential from the knowledge of the radial pair distribution function. The method, realized inside an inverse Monte Carlo simulation scheme, is based on the application of the maximum entropy principle of information theory and the interaction potential emerges as the asymptotic expression of the transition probability. Results obtained for high density monoatomic fluids are very satisfactory and provide an accurate extraction of the potential, despite a modest computational effort.
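As a side illustration of the maximum entropy principle the paper invokes (not its inverse Monte Carlo scheme): among all distributions over a discrete set of "energy levels" with a fixed mean energy, the entropy maximizer is the Boltzmann form p_i proportional to exp(-beta * E_i), and the multiplier beta can be solved for by bisection.

```python
import numpy as np

E = np.array([0.0, 1.0, 2.0, 3.0, 4.0])  # illustrative energy levels
target_mean = 1.2                        # the known constraint

def mean_energy(beta):
    w = np.exp(-beta * E)
    p = w / w.sum()
    return p @ E

# mean_energy is decreasing in beta, so bisect on a bracketing interval
lo, hi = -5.0, 5.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_energy(mid) > target_mean:
        lo = mid
    else:
        hi = mid

beta = 0.5 * (lo + hi)
print(mean_energy(beta))  # matches the 1.2 constraint
```

No information beyond the constraint is introduced, which is the "no spurious information" property that makes the maximum entropy inference well posed.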

  17. Nonparametric Information Geometry: From Divergence Function to Referential-Representational Biduality on Statistical Manifolds

    Directory of Open Access Journals (Sweden)

    Jun Zhang

    2013-12-01

    Full Text Available Divergence functions are the non-symmetric “distance” on the manifold, Μθ, of parametric probability density functions over a measure space, (Χ,μ. Classical information geometry prescribes, on Μθ: (i a Riemannian metric given by the Fisher information; (ii a pair of dual connections (giving rise to the family of α-connections that preserve the metric under parallel transport by their joint actions; and (iii a family of divergence functions ( α-divergence defined on Μθ x Μθ, which induce the metric and the dual connections. Here, we construct an extension of this differential geometric structure from Μθ (that of parametric probability density functions to the manifold, Μ, of non-parametric functions on X, removing the positivity and normalization constraints. The generalized Fisher information and α-connections on M are induced by an α-parameterized family of divergence functions, reflecting the fundamental convex inequality associated with any smooth and strictly convex function. The infinite-dimensional manifold, M, has zero curvature for all these α-connections; hence, the generally non-zero curvature of M can be interpreted as arising from an embedding of Μθ into Μ. Furthermore, when a parametric model (after a monotonic scaling forms an affine submanifold, its natural and expectation parameters form biorthogonal coordinates, and such a submanifold is dually flat for α = ± 1, generalizing the results of Amari’s α-embedding. The present analysis illuminates two different types of duality in information geometry, one concerning the referential status of a point (measurable function expressed in the divergence function (“referential duality” and the other concerning its representation under an arbitrary monotone scaling (“representational duality”.

  18. A resilient and efficient CFD framework: Statistical learning tools for multi-fidelity and heterogeneous information fusion

    Science.gov (United States)

    Lee, Seungjoon; Kevrekidis, Ioannis G.; Karniadakis, George Em

    2017-09-01

    Exascale-level simulations require fault-resilient algorithms that are robust against repeated and expected software and/or hardware failures during computations, which may render the simulation results unsatisfactory. If each processor can share some global information about the simulation from a coarse, limited accuracy but relatively costless auxiliary simulator we can effectively fill-in the missing spatial data at the required times by a statistical learning technique - multi-level Gaussian process regression, on the fly; this has been demonstrated in previous work [1]. Based on the previous work, we also employ another (nonlinear) statistical learning technique, Diffusion Maps, that detects computational redundancy in time and hence accelerate the simulation by projective time integration, giving the overall computation a "patch dynamics" flavor. Furthermore, we are now able to perform information fusion with multi-fidelity and heterogeneous data (including stochastic data). Finally, we set the foundations of a new framework in CFD, called patch simulation, that combines information fusion techniques from, in principle, multiple fidelity and resolution simulations (and even experiments) with a new adaptive timestep refinement technique. We present two benchmark problems (the heat equation and the Navier-Stokes equations) to demonstrate the new capability that statistical learning tools can bring to traditional scientific computing algorithms. For each problem, we rely on heterogeneous and multi-fidelity data, either from a coarse simulation of the same equation or from a stochastic, particle-based, more "microscopic" simulation. We consider, as such "auxiliary" models, a Monte Carlo random walk for the heat equation and a dissipative particle dynamics (DPD) model for the Navier-Stokes equations. More broadly, in this paper we demonstrate the symbiotic and synergistic combination of statistical learning, domain decomposition, and scientific computing in
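A schematic of the gap-filling idea (a plain single-level GP regression with an RBF kernel, not the authors' multi-level Gaussian process code; the fields and parameters are invented): the cheap coarse simulator provides the prior mean, and the sparse fine-grid samples surviving outside a "failed" region correct it.

```python
import numpy as np

def rbf(a, b, ell=0.15):
    """Squared-exponential kernel between 1-D point sets a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

x_all = np.linspace(0, 1, 101)
truth = np.sin(2 * np.pi * x_all)         # fine-scale field we lost
coarse = 0.9 * np.sin(2 * np.pi * x_all)  # cheap, limited-accuracy auxiliary run
ok = np.arange(0, 101, 10)                # sparse surviving fine samples

# GP regression on the residual (fine minus coarse), noise-free observations
K = rbf(x_all[ok], x_all[ok]) + 1e-10 * np.eye(ok.size)
resid = truth[ok] - coarse[ok]
filled = coarse + rbf(x_all, x_all[ok]) @ np.linalg.solve(K, resid)

print(np.max(np.abs(filled - truth)))  # small reconstruction error
```

Modeling the residual rather than the field itself is what lets the low-fidelity information carry most of the load, which is the essence of the multi-fidelity fusion described above.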

  19. Statistical mechanics of the mixed majority-minority game with random external information

    International Nuclear Information System (INIS)

    Martino, A de; Giardina, I; Mosetti, G

    2003-01-01

We study the asymptotic macroscopic properties of the mixed majority-minority game, modelling a population in which two types of heterogeneous adaptive agents, namely 'fundamentalists' driven by differentiation and 'trend-followers' driven by imitation, interact. The presence of a fraction f of trend-followers is shown to induce (a) a significant loss of informational efficiency with respect to a pure minority game (in particular, an efficient, unpredictable phase exists only for f < 1/2). We solve the model by means of an approximate static (replica) theory and by a direct dynamical (generating functional) technique. The two approaches coincide and match numerical results convincingly

  20. Quantum statistical gravity: time dilation due to local information in many-body quantum systems

    Science.gov (United States)

    Sels, Dries; Wouters, Michiel

    2017-08-01

    We propose a generic mechanism for the emergence of a gravitational potential that acts on all classical objects in a quantum system. Our conjecture is based on the analysis of mutual information in many-body quantum systems. Since measurements in quantum systems affect the surroundings through entanglement, a measurement at one position reduces the entropy in its neighbourhood. This reduction in entropy can be described by a local temperature, that is directly related to the gravitational potential. A crucial ingredient in our argument is that ideal classical mechanical motion occurs at constant probability. This definition is motivated by the analysis of entropic forces in classical systems.

  1. Revisiting Information Technology tools serving authorship and editorship: a case-guided tutorial to statistical analysis and plagiarism detection.

    Science.gov (United States)

    Bamidis, P D; Lithari, C; Konstantinidis, S T

    2010-12-01

With the number of scientific papers published in journals, conference proceedings, and the international literature ever increasing, authors and reviewers are not only provided with an abundance of information, but unfortunately are also continuously confronted with the risks associated with the erroneous copying of another's material. In parallel, Information Communication Technology (ICT) tools provide researchers with novel and continuously more effective ways to analyze and present their work. Software tools for statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewers' and the editor's perspective, it is now possible to ensure the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-by-step demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial on specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is utilized to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software pieces.

  2. Predicting Protein Secondary Structure Using Consensus Data Mining (CDM) Based on Empirical Statistics and Evolutionary Information.

    Science.gov (United States)

    Kandoi, Gaurav; Leelananda, Sumudu P; Jernigan, Robert L; Sen, Taner Z

    2017-01-01

Predicting the secondary structure of a protein from its sequence still remains a challenging problem. Prediction accuracies remain around 80%, even across very diverse methods. Using evolutionary information and machine learning algorithms in particular has had the most impact. In this chapter, we will first define secondary structures, then we will review the Consensus Data Mining (CDM) technique based on the robust GOR algorithm and the Fragment Database Mining (FDM) approach. GOR V is an empirical method utilizing a sliding window approach to model the secondary structural elements of a protein by making use of generalized evolutionary information. FDM uses data mining from experimental structure fragments, and is able to successfully predict the secondary structure of a protein by combining experimentally determined structural fragments based on sequence similarities of the fragments. The CDM method combines predictions from GOR V and FDM in a hierarchical manner to produce consensus predictions for secondary structure: if sequence fragments are not available, it falls back on GOR V to make the secondary structure prediction. The online server of CDM is available at http://gor.bb.iastate.edu/cdm/ .

  3. Superluminal propagation and information transfer: A statistical approach in the microwave domain

    Energy Technology Data Exchange (ETDEWEB)

    Dorrah, Ahmed H., E-mail: Ahmed.Dorrah@mail.utoronto.ca; Kayili, Levent; Mojahedi, Mo

    2014-09-12

Signal velocity is calculated in a medium with negative group delay (NGD). By accounting for the medium and the detector noise sources, the time-varying probability of error at the detector [Pe(t)] is evaluated in the NGD channel and in a normal dispersion channel. The scheme in which Pe(t) falls below a threshold at an earlier time implies faster information transfer. It is found that the signal velocity depends on the detector type and the relative noise strength of the detector with respect to the channel. Finally, it is shown that NGD channels can be useful in applications that are limited by the detector noise. - Highlights: • Information is described probabilistically by the temporal error probability Pe(t). • Pe(t) is calculated in a superluminal medium versus a positive group delay medium. • Pe(t) is studied in three different detectors with three different noise profiles. • Signal latency is reduced in superluminal media limited by the detector noise. • Signal latency cannot be reduced in certain detection schemes despite its noise.

  4. Sources identification of antibiotic pollution combining land use information and multivariate statistics.

    Science.gov (United States)

    Li, Jia; Zhang, Haibo; Chen, Yongshan; Luo, Yongming; Zhang, Hua

    2016-07-01

    To quantify the extent of antibiotic contamination and to identify the dominant pollutant sources in the Tiaoxi River Watershed, surface water samples were collected at eight locations and analyzed for four tetracyclines and three sulfonamides using ultra-performance liquid chromatography tandem mass spectrometry (UPLC-MS/MS). The observed maximum concentrations of tetracycline (623 ng L(-1)), oxytetracycline (19,810 ng L(-1)), and sulfamethoxazole (112 ng L(-1)) exceeded their corresponding Predicted No Effect Concentration (PNEC) values. In particular, high concentrations of antibiotics were observed in the wet summer season with heavy rainfall. The maximum concentrations of antibiotics appeared in the vicinity of intensive aquaculture areas. High-resolution land use data were used to identify diffuse sources of antibiotic pollution in the watershed. Significant correlations between tetracycline and developed land (r = 0.93), tetracycline and barren land (r = 0.87), oxytetracycline and barren land (r = 0.82), and sulfadiazine and agricultural facilities (r = 0.71) were observed. In addition, the density of aquaculture significantly correlated with doxycycline (r = 0.74) and oxytetracycline (r = 0.76), while the density of livestock significantly correlated with sulfadiazine (r = 0.71). Principal Component Analysis (PCA) indicated that doxycycline, tetracycline, oxytetracycline, and sulfamethoxazole were from aquaculture and domestic sources, whereas sulfadiazine and sulfamethazine were from livestock wastewater. Flood or drainage from aquaculture ponds was identified as a major source of antibiotics in the Tiaoxi watershed. A hot-spot map was created based on the results of the land use analysis and multivariate statistics, which provided an effective management tool for source identification in watersheds with multiple diffuse sources of antibiotic pollution.
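    The land-use correlations reported above are ordinary Pearson coefficients. A minimal standard-library sketch, with invented site data standing in for the paper's measurements:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient, computed with the standard library."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical illustration: aquaculture density vs. oxytetracycline level
# at eight sampling sites (values invented, not the paper's data).
density = [0.1, 0.3, 0.2, 0.8, 0.5, 0.9, 0.4, 0.7]
oxytet = [120, 800, 400, 15000, 2500, 19000, 1800, 9000]
r = pearson(density, oxytet)
```

A strongly positive r, as in the paper's aquaculture-oxytetracycline result, points to that land use as a candidate diffuse source.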

  5. The spread of scientific information: insights from the web usage statistics in PLoS article-level metrics.

    Directory of Open Access Journals (Sweden)

    Koon-Kiu Yan

    Full Text Available The presence of web-based communities is a distinctive signature of Web 2.0. The web-based feature means that information propagation within each community is highly facilitated, promoting complex collective dynamics in view of information exchange. In this work, we focus on a community of scientists and study, in particular, how the awareness of a scientific paper is spread. Our work is based on the web usage statistics obtained from the PLoS Article Level Metrics dataset compiled by PLoS. The cumulative number of HTML views was found to follow a long tail distribution which is reasonably well-fitted by a lognormal one. We modeled the diffusion of information by a random multiplicative process, and thus extracted the rates of information spread at different stages after the publication of a paper. We found that the spread of information displays two distinct decay regimes: a rapid downfall in the first month after publication, and a gradual power law decay afterwards. We identified these two regimes with two distinct driving processes: a short-term behavior driven by the fame of a paper, and a long-term behavior consistent with citation statistics. The patterns of information spread were found to be remarkably similar in data from different journals, but there are intrinsic differences for different types of web usage (HTML views and PDF downloads versus XML). These similarities and differences shed light on the theoretical understanding of different complex systems, as well as a better design of the corresponding web applications that is of high potential marketing impact.

  6. The Spread of Scientific Information: Insights from the Web Usage Statistics in PLoS Article-Level Metrics

    Science.gov (United States)

    Yan, Koon-Kiu; Gerstein, Mark

    2011-01-01

    The presence of web-based communities is a distinctive signature of Web 2.0. The web-based feature means that information propagation within each community is highly facilitated, promoting complex collective dynamics in view of information exchange. In this work, we focus on a community of scientists and study, in particular, how the awareness of a scientific paper is spread. Our work is based on the web usage statistics obtained from the PLoS Article Level Metrics dataset compiled by PLoS. The cumulative number of HTML views was found to follow a long tail distribution which is reasonably well-fitted by a lognormal one. We modeled the diffusion of information by a random multiplicative process, and thus extracted the rates of information spread at different stages after the publication of a paper. We found that the spread of information displays two distinct decay regimes: a rapid downfall in the first month after publication, and a gradual power law decay afterwards. We identified these two regimes with two distinct driving processes: a short-term behavior driven by the fame of a paper, and a long-term behavior consistent with citation statistics. The patterns of information spread were found to be remarkably similar in data from different journals, but there are intrinsic differences for different types of web usage (HTML views and PDF downloads versus XML). These similarities and differences shed light on the theoretical understanding of different complex systems, as well as a better design of the corresponding web applications that is of high potential marketing impact. PMID:21603617
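    The random multiplicative process the authors use to model view counts is easy to reproduce: multiplying by independent factors exp(r), with r ~ Normal(μ, σ), makes the log counts normal, so the counts themselves are lognormal and long-tailed. A small simulation sketch (all parameter values are arbitrary, chosen only for illustration):

```python
import math
import random
import statistics

def simulate_views(n_papers=2000, steps=30, mu=0.1, sigma=0.2, seed=0):
    """Random multiplicative growth: each period the cumulative view
    count is multiplied by exp(r), r ~ Normal(mu, sigma), so the log of
    the final counts is normal and the counts are lognormal."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n_papers):
        log_v = 0.0
        for _ in range(steps):
            log_v += rng.gauss(mu, sigma)
        finals.append(math.exp(log_v))
    return finals

views = simulate_views()
logs = [math.log(v) for v in views]
```

The simulated counts show the lognormal signature reported in the paper: the sample mean sits well above the median, reflecting the long right tail.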

  7. Advancing Empirical Approaches to the Concept of Resilience: A Critical Examination of Panarchy, Ecological Information, and Statistical Evidence

    Directory of Open Access Journals (Sweden)

    Ali Kharrazi

    2016-09-01

    Full Text Available Despite its ambiguities, the concept of resilience is of critical importance to researchers, practitioners, and policy-makers in dealing with dynamic socio-ecological systems. In this paper, we critically examine the three empirical approaches of (i) panarchy; (ii) ecological information-based network analysis; and (iii) statistical evidence of resilience against three criteria determined for achieving a comprehensive understanding and application of this concept. These criteria are the ability: (1) to reflect a system’s adaptability to shocks; (2) to integrate social and environmental dimensions; and (3) to evaluate system-level trade-offs. Our findings show that none of the three currently applied approaches are strong in handling all three criteria. Panarchy is strong in the first two criteria but has difficulty with normative trade-offs. The ecological information-based approach is strongest in evaluating trade-offs but relies on common dimensions that lead to over-simplifications in integrating the social and environmental dimensions. Statistical evidence provides suggestions that are simplest and easiest to act upon but are generally weak in all three criteria. This analysis confirms the value of these approaches in specific instances but also the need for further research in advancing empirical approaches to the concept of resilience.

  8. Practical Statistics

    CERN Document Server

    Lyons, L.

    2016-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  9. CMS Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...

  10. Image Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-08

    In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
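    One simple statistic of the kind this report describes is the Shannon entropy of an image's intensity histogram: flat, featureless images score near zero, while varied ones score higher and can be flagged as interesting. A minimal sketch (the report does not specify which statistics were actually used):

```python
import math
from collections import Counter

def shannon_entropy(pixels):
    """Shannon entropy (bits) of a pixel-intensity histogram; one simple
    statistic for ranking how much information an image carries."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

flat = [128] * 256        # featureless image: entropy 0 bits
varied = list(range(256)) # every intensity once: entropy 8 bits
```

Ranking a dataset's images by such a score lets a reviewer triage the most information-rich ones first.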

  11. Statistical methods

    CERN Document Server

    Szulc, Stefan

    1965-01-01

    Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

  12. Global trade statistics lack granularity to inform traceability and management of diverse and high-value fishes.

    Science.gov (United States)

    Cawthorn, Donna-Mareè; Mariani, Stefano

    2017-10-09

    Illegal, unreported and unregulated (IUU) fishing and seafood supply chain fraud are multifaceted problems that demand multifaceted solutions. Here, we investigate the extent to which global fisheries trade data analyses can support effective seafood traceability and promote sustainable seafood markets using one of the world's most highly prized, yet misunderstood, groups of fishes as a model: the snappers, family Lutjanidae. By collating and comparing production, import and export data from international and national statistical collections for the period 2006-2013, we show that official trade data severely lack the level of detail required to track snapper trade flows, uncover potential IUU activities and/or inform exploitation management of snappers and related species. Moreover, we contend that the lack of taxonomic granularity and use of vague generic names in trade records represent one of the most insidious impediments to seafood traceability, and suggest that widely used harmonised commodity classification systems should evolve to address these gaps.

  13. METHODS AND TOOLS TO DEVELOP INNOVATIVE STRATEGIC MANAGEMENT DECISIONS BASED ON THE APPLICATION OF ADVANCED INFORMATION AND COMMUNICATION TECHNOLOGIES IN THE STATISTICAL BRANCH OF UZBEKISTAN

    Directory of Open Access Journals (Sweden)

    Irina E. Zhukovskya

    2013-01-01

    Full Text Available This paper focuses on improving the statistical branch through the application of electronic document management and networked information technology. As a software solution, the paper proposes the use of the State Committee on Statistics of the Republic of Uzbekistan's new platform «eStat 2.0», which not only optimizes the work of statistical-sector employees but also serves as a link between all the economic entities of the national economy.

  14. Conceptual Incongruence between Prion Disease and Genetic Diversity in Ovine Species within European Union defined by Informational Statistics Terms

    Directory of Open Access Journals (Sweden)

    Gheorghe Hrinca

    2016-11-01

    Full Text Available Biodiversity and the study of spongiform encephalopathies in farm animals are highly topical concerns of the contemporary scientific world. Both themes are very interesting for the life sciences and very important for the applied field of animal breeding. The implementation of these two concepts creates an antithetical paradigm: the achievement of genetic prophylaxis goes hand in hand with a decrease in genetic diversity. The paper examines genetic diversity and its evolution in sheep livestock across the European space, in a context in which the European Community has developed very laborious and costly programs targeted both at conserving and enhancing biodiversity and at eradicating scrapie in small ruminants. This paper uses a precise method to quantify genetic biodiversity in all sheep populations in Europe through a modern concept derived from informational statistics: informational energy. In addition, the paper proposes concrete and viable solutions for achieving these two desiderata at optimal levels, relying on the perspicacity of the sheep breeder, which consists in the accuracy of the reproduction process and the correct application of the selection criteria.

  15. Practical Statistics

    OpenAIRE

    Lyons, L.

    2017-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  16. Designing experiments for maximum information from cyclic oxidation tests and their statistical analysis using half Normal plots

    International Nuclear Information System (INIS)

    Coleman, S.Y.; Nicholls, J.R.

    2006-01-01

    Cyclic oxidation testing at elevated temperatures requires careful experimental design and the adoption of standard procedures to ensure reliable data. This is a major aim of the 'COTEST' research programme. Further, as such tests are both time consuming and costly, in terms of human effort, to take measurements over a large number of cycles, it is important to gain maximum information from a minimum number of tests (trials). This search for standardisation of cyclic oxidation conditions leads to a series of tests to determine the relative effects of cyclic parameters on the oxidation process. Following a review of the available literature, databases and the experience of partners to the COTEST project, the most influential parameters, upper dwell temperature (oxidation temperature) and time (hot time), lower dwell time (cold time) and environment, were investigated in partners' laboratories. It was decided to test upper dwell temperature at 3 levels, at and equidistant from a reference temperature; to test upper dwell time at a reference, a higher and a lower time; to test lower dwell time at a reference and a higher time; and to test wet and dry environments. Thus an experiment, consisting of nine trials, was designed according to statistical criteria. The results of the trials were analysed statistically, to test the main linear and quadratic effects of upper dwell temperature and hot time and the main effects of lower dwell time (cold time) and environment. The nine trials are a quarter fraction of the 36 possible combinations of parameter levels that could have been studied. The results have been analysed by half Normal plots, as there are only 2 degrees of freedom for the experimental error variance, which is rather low for a standard analysis of variance. Half Normal plots give a visual indication of which factors are statistically significant. In this experiment each trial has 3 replications, and the data are analysed in terms of mean mass change, oxidation kinetics
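    A half-normal plot of the kind used in this analysis pairs the sorted absolute effect estimates with half-normal quantiles, Phi^{-1}((1 + p_i)/2) with plotting positions p_i = (i - 0.5)/n; effects falling well above the line through the bulk are judged statistically significant. A sketch with invented effect estimates (not COTEST data):

```python
from statistics import NormalDist

def half_normal_coords(effects):
    """Coordinates for a half-normal plot: sorted |effect| estimates
    paired with half-normal quantiles Phi^{-1}((1 + p_i)/2), where
    p_i = (i - 0.5)/n.  Points far above the line through the small
    effects flag statistically important factors."""
    n = len(effects)
    ordered = sorted(abs(e) for e in effects)
    quantiles = [NormalDist().inv_cdf((1 + (i - 0.5) / n) / 2)
                 for i in range(1, n + 1)]
    return quantiles, ordered

# Hypothetical effect estimates from a small fractional factorial
# (values invented for illustration): one factor dominates.
q, eff = half_normal_coords([5.2, 0.4, -0.3, 0.7, -0.2, 0.1])
```

Plotting `eff` against `q`, the five small effects fall on a line through the origin while the 5.2 effect stands clear of it, which is the visual significance test the abstract describes.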

  17. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  18. Pixels as ROIs (PAR): a less-biased and statistically powerful approach for gleaning functional information from image stacks.

    Science.gov (United States)

    Keller, Jacob; Keller, Jacob Pearson; Homma, Kazuaki; Dallos, Peter

    2013-01-01

    Especially in the last decade or so, there have been dramatic advances in fluorescence-based imaging methods designed to measure a multitude of functions in living cells. Despite this, many of the methods used to analyze the resulting images are limited. Perhaps the most common mode of analysis is the choice of regions of interest (ROIs), followed by quantification of the signal contained therein in comparison with another "control" ROI. While this method has several advantages, such as flexibility and capitalization on the power of human visual recognition capabilities, it has the drawbacks of potential subjectivity and lack of precisely defined criteria for ROI selection. This can lead to analyses which are less precise or accurate than the data might allow for, and generally a regrettable loss of information. Herein, we explore the possibility of abandoning the use of conventional ROIs, and instead propose treating individual pixels as ROIs, such that all information can be extracted systematically with the various statistical cutoffs we discuss. As a test case for this approach, we monitored intracellular pH in cells transfected with the chloride/bicarbonate transporter slc26a3 using the ratiometric dye SNARF-5F under various conditions. We performed a parallel analysis using two different levels of stringency in conventional ROI analysis as well as the pixels-as-ROIs (PAR) approach, and found that pH differences between control and transfected cells were accentuated by ~50-100% by using the PAR approach. We therefore consider this approach worthy of adoption, especially in cases in which higher accuracy and precision are required.
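    A stripped-down version of the pixels-as-ROIs idea: treat every pixel of the ratio image as its own ROI and keep those beyond a statistical cutoff, here mean + z·stdev over the whole image. This is one of many possible cutoffs, and the numbers are synthetic, not SNARF-5F data:

```python
import statistics

def pixels_above_cutoff(ratios, z=2.0):
    """Treat every pixel as its own ROI: return the indices of pixels
    whose ratiometric value exceeds mean + z * stdev of the whole ratio
    image.  A simplified cutoff; the paper discusses a range of
    statistical criteria."""
    mu = statistics.mean(ratios)
    sd = statistics.stdev(ratios)
    return [i for i, r in enumerate(ratios) if r > mu + z * sd]

# Synthetic ratio image: 40 background pixels near 1.0 and 5 responding
# pixels at 2.0 (invented values for illustration).
image = [0.95, 1.05] * 20 + [2.0] * 5
hot = pixels_above_cutoff(image)
```

The cutoff systematically recovers exactly the responding pixels without any hand-drawn ROI, which is the bias-reduction the authors argue for.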

  19. Pixels as ROIs (PAR): a less-biased and statistically powerful approach for gleaning functional information from image stacks.

    Directory of Open Access Journals (Sweden)

    Jacob Keller

    Full Text Available Especially in the last decade or so, there have been dramatic advances in fluorescence-based imaging methods designed to measure a multitude of functions in living cells. Despite this, many of the methods used to analyze the resulting images are limited. Perhaps the most common mode of analysis is the choice of regions of interest (ROIs), followed by quantification of the signal contained therein in comparison with another "control" ROI. While this method has several advantages, such as flexibility and capitalization on the power of human visual recognition capabilities, it has the drawbacks of potential subjectivity and lack of precisely defined criteria for ROI selection. This can lead to analyses which are less precise or accurate than the data might allow for, and generally a regrettable loss of information. Herein, we explore the possibility of abandoning the use of conventional ROIs, and instead propose treating individual pixels as ROIs, such that all information can be extracted systematically with the various statistical cutoffs we discuss. As a test case for this approach, we monitored intracellular pH in cells transfected with the chloride/bicarbonate transporter slc26a3 using the ratiometric dye SNARF-5F under various conditions. We performed a parallel analysis using two different levels of stringency in conventional ROI analysis as well as the pixels-as-ROIs (PAR) approach, and found that pH differences between control and transfected cells were accentuated by ~50-100% by using the PAR approach. We therefore consider this approach worthy of adoption, especially in cases in which higher accuracy and precision are required.

  20. Statistical approaches to the analysis of point count data: A little extra information can go a long way

    Science.gov (United States)

    Farnsworth, G.L.; Nichols, J.D.; Sauer, J.R.; Fancy, S.G.; Pollock, K.H.; Shriner, S.A.; Simons, T.R.; Ralph, C. John; Rich, Terrell D.

    2005-01-01

    Point counts are a standard sampling procedure for many bird species, but lingering concerns still exist about the quality of information produced from the method. It is well known that variation in observer ability and environmental conditions can influence the detection probability of birds in point counts, but many biologists have been reluctant to abandon point counts in favor of more intensive approaches to counting. However, over the past few years a variety of statistical and methodological developments have begun to provide practical ways of overcoming some of the problems with point counts. We describe some of these approaches, and show how they can be integrated into standard point count protocols to greatly enhance the quality of the information. Several tools now exist for estimation of detection probability of birds during counts, including distance sampling, double observer methods, time-depletion (removal) methods, and hybrid methods that combine these approaches. Many counts are conducted in habitats that make auditory detection of birds much more likely than visual detection. As a framework for understanding detection probability during such counts, we propose separating the probability a bird is detected during a count into two components: (1) the probability the bird vocalizes during the count and (2) the probability this vocalization is detected by an observer. In addition, we propose that some measure of the area sampled during a count is necessary for valid inferences about bird populations. This can be done by employing fixed-radius counts or more sophisticated distance-sampling models. We recommend that any study employing point counts be designed to estimate detection probability and to include a measure of the area sampled.
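    The proposed decomposition implies a simple count adjustment: with p = P(bird vocalizes) × P(vocalization detected), the abundance estimate is N-hat = C / p. A sketch with invented probabilities and counts, not values from the paper:

```python
def estimate_abundance(count, p_vocalize, p_detect_given_vocal):
    """Adjust a raw point count by the two components of detection
    probability proposed in the paper: the probability a bird vocalizes
    during the count, and the probability an observer detects that
    vocalization.  N_hat = C / (p_vocalize * p_detect_given_vocal)."""
    p = p_vocalize * p_detect_given_vocal
    if p <= 0:
        raise ValueError("detection probability must be positive")
    return count / p

# Illustrative numbers: 28 birds counted, a 70 % chance a bird
# vocalizes during the count, an 80 % chance a vocalization is heard.
n_hat = estimate_abundance(28, 0.7, 0.8)
```

In practice the two probabilities would themselves be estimated, e.g. by removal or double-observer methods, rather than assumed.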

  1. Medicaid Statistical Information System (MSIS): a data source for quality reporting for Medicaid and the Children's Health Insurance Program (CHIP).

    Science.gov (United States)

    MacTaggart, Patricia; Foster, Ashley; Markus, Anne

    2011-04-01

    Section 401 of the Children's Health Insurance Program Reauthorization Act of 2009 (CHIPRA) requires the Department of Health and Human Services (HHS) to identify and publish healthcare quality measures for children enrolled in the Children's Health Insurance Program (CHIP) or Medicaid. CHIPRA also requires core measures to identify disparities by race and ethnicity, among other factors. State Medicaid and CHIP programs are currently facing significant budgetary pressures that are likely to increase with eligibility expansions and programmatic changes resulting from the Patient Protection and Affordable Care Act (PPACA). To limit the burden on states and increase the likelihood of states' voluntarily reporting on core pediatric quality measures, HHS may consider utilizing existing data sources. This article examines the feasibility of utilizing Medicaid Statistical Information System (MSIS) data to identify and analyze the core children's healthcare quality measures required by CHIPRA. Five key themes related to the feasibility of using MSIS as a data source for quality measures are identified: states have significant experience with data collection, performance measurement, and quality oversight for children in Medicaid and CHIP; CHIPRA provisions related to reporting of quality measures will be implemented at a time when states are facing major fiscal constraints; MSIS provides potential opportunities as it offers a rich source of data, but the difficulties in obtaining clean data should not be underestimated; MSIS has limitations; and states, the federal government, providers, and enrollees benefit from standardization in data and quality measurement.

  2. Statistical methodology for estimating the mean difference in a meta-analysis without study-specific variance information.

    Science.gov (United States)

    Sangnawakij, Patarawan; Böhning, Dankmar; Adams, Stephen; Stanton, Michael; Holling, Heinz

    2017-04-30

    Statistical inference for analyzing the results from several independent studies on the same quantity of interest has been investigated frequently in recent decades. Typically, any meta-analytic inference requires that the quantity of interest is available from each study together with an estimate of its variability. The current work is motivated by a meta-analysis on comparing two treatments (thoracoscopic and open) of congenital lung malformations in young children. Quantities of interest include continuous end-points such as length of operation or number of chest tube days. As studies only report mean values (and no standard errors or confidence intervals), the question arises how meta-analytic inference can be developed. We suggest two methods to estimate study-specific variances in such a meta-analysis, where only sample means and sample sizes are available in the treatment arms. A general likelihood ratio test is derived for testing equality of variances in two groups. By means of simulation studies, the bias and estimated standard error of the overall mean difference from both methodologies are evaluated and compared with two existing approaches: complete study analysis only and partial variance information. The performance of the test is evaluated in terms of type I error. Additionally, we illustrate these methods in the meta-analysis on comparing thoracoscopic and open surgery for congenital lung malformations and in a meta-analysis on the change in renal function after kidney donation. Copyright © 2017 John Wiley & Sons, Ltd.
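    When studies report only group means and sample sizes, a fixed-effect pooled mean difference can still be formed under an assumed common standard deviation, which is precisely the simplification the paper's variance-estimation methods aim to improve on. A sketch with invented study data:

```python
def pooled_mean_difference(studies, assumed_sd):
    """Fixed-effect pooled mean difference when studies report only
    group means and sample sizes.  Under an assumed common standard
    deviation, each study is weighted by the inverse variance of its
    mean difference: w = 1 / (sd^2 * (1/n1 + 1/n2))."""
    num = den = 0.0
    for mean1, n1, mean2, n2 in studies:
        w = 1.0 / (assumed_sd ** 2 * (1.0 / n1 + 1.0 / n2))
        num += w * (mean1 - mean2)
        den += w
    return num / den

# Hypothetical studies: (mean_open, n_open, mean_thoracoscopic, n_thoraco),
# e.g. length of operation in minutes; values invented for illustration.
studies = [(120.0, 20, 119.0, 20), (130.0, 20, 127.0, 20)]
diff = pooled_mean_difference(studies, assumed_sd=15.0)
```

Replacing `assumed_sd` with study-specific variance estimates, as the paper proposes, changes the weights and hence the pooled estimate and its standard error.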

  3. "Making Sense of Figures": Statistics, Computing and Information Technologies in Agriculture and Biology in Britain, 1920s-1960s

    OpenAIRE

    Parolini, Giuditta

    2013-01-01

    Throughout the twentieth century statistical methods have increasingly become part of experimental research. In particular, statistics has made quantification processes meaningful in the soft sciences, which had traditionally relied on activities such as collecting and describing diversity rather than timing variation. The thesis explores this change in relation to agriculture and biology, focusing on analysis of variance and experimental design, the statistical methods developed by the ma...

  4. The use of process models to inform and improve statistical models of nitrate occurrence, Great Miami River Basin, southwestern Ohio

    Science.gov (United States)

    Walter, Donald A.; Starn, J. Jeffrey

    2013-01-01

    Statistical models of nitrate occurrence in the glacial aquifer system of the northern United States, developed by the U.S. Geological Survey, use observed relations between nitrate concentrations and sets of explanatory variables (representing well-construction, environmental, and source characteristics) to predict the probability that nitrate, as nitrogen, will exceed a threshold concentration. However, the models do not explicitly account for the processes that control the transport of nitrogen from surface sources to a pumped well and use area-weighted mean spatial variables computed from within a circular buffer around the well as a simplified source-area conceptualization. The use of models that explicitly represent physical-transport processes can inform and, potentially, improve these statistical models. Specifically, groundwater-flow models simulate advective transport (predominant in many surficial aquifers) and can contribute to the refinement of the statistical models by (1) providing for improved, physically based representations of a source area to a well, and (2) allowing for more detailed estimates of environmental variables. A source area to a well, known as a contributing recharge area, represents the area at the water table that contributes recharge to a pumped well; a well pumped at a volumetric rate equal to the amount of recharge through a circular buffer will result in a contributing recharge area that is the same size as the buffer but has a shape that is a function of the hydrologic setting. These volume-equivalent contributing recharge areas will approximate circular buffers in areas of relatively flat hydraulic gradients, such as near groundwater divides, but in areas with steep hydraulic gradients will be elongated in the upgradient direction and agree less with the corresponding circular buffers. The degree to which process-model-estimated contributing recharge areas, which simulate advective transport and therefore account for

  5. Tennessee StreamStats: A Web-Enabled Geographic Information System Application for Automating the Retrieval and Calculation of Streamflow Statistics

    Science.gov (United States)

    Ladd, David E.; Law, George S.

    2007-01-01

    The U.S. Geological Survey (USGS) provides streamflow and other stream-related information needed to protect people and property from floods, to plan and manage water resources, and to protect water quality in the streams. Streamflow statistics provided by the USGS, such as the 100-year flood and the 7-day 10-year low flow, frequently are used by engineers, land managers, biologists, and many others to help guide decisions in their everyday work. In addition to streamflow statistics, resource managers often need to know the physical and climatic characteristics (basin characteristics) of the drainage basins for locations of interest to help them understand the mechanisms that control water availability and water quality at these locations. StreamStats is a Web-enabled geographic information system (GIS) application that makes it easy for users to obtain streamflow statistics, basin characteristics, and other information for USGS data-collection stations and for ungaged sites of interest. If a user selects the location of a data-collection station, StreamStats will provide previously published information for the station from a database. If a user selects a location where no data are available (an ungaged site), StreamStats will run a GIS program to delineate a drainage basin boundary, measure basin characteristics, and estimate streamflow statistics based on USGS streamflow prediction methods. A user can download a GIS feature class of the drainage basin boundary with attributes including the measured basin characteristics and streamflow estimates.

  6. Multivariate mixed linear model analysis of longitudinal data: an information-rich statistical technique for analyzing disease resistance data

    Science.gov (United States)

    The mixed linear model (MLM) is currently among the most advanced and flexible statistical modeling techniques and its use in tackling problems in plant pathology has begun surfacing in the literature. The longitudinal MLM is a multivariate extension that handles repeatedly measured data, such as r...

  7. 14 CFR 385.19 - Authority of the Director, Office of Aviation Information, Bureau of Transportation Statistics.

    Science.gov (United States)

    2010-01-01

    ... for use of domestic and international service segment and market data in accordance with the... required by subchapter A of this chapter. (h) Determine the data necessary to complete the International...) Grant or deny requests for use of international Origin and Destination Survey statistics in accordance...

  8. Random walks along the streets and canals in compact cities: Spectral analysis, dynamical modularity, information, and statistical mechanics

    Science.gov (United States)

    Volchenkov, D.; Blanchard, Ph.

    2007-02-01

    Different models of random walks on the dual graphs of compact urban structures are considered. Analysis of access times between streets helps to detect the city's modularity. The statistical mechanics approach to ensembles of lazy random walkers is developed. The complexity of city modularity can be measured by an information-like parameter which plays the role of an individual fingerprint of the Genius loci. Global structural properties of a city can be characterized by the thermodynamic parameters calculated in the random walk problem.
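    Access times of the kind analysed here are mean first-passage times of a random walk on the dual graph, where each node is a street and edges join intersecting streets. A toy estimate by simulation on a four-street path "city" (the graph and parameters are illustrative, not from the paper):

```python
import random

def mean_first_passage(adj, start, target, walks=5000, seed=1):
    """Estimate the mean access (first-passage) time from `start` to
    `target` for a simple random walk on an adjacency-list graph: the
    walker moves to a uniformly chosen neighbor each step."""
    rng = random.Random(seed)
    total = 0
    for _ in range(walks):
        node, steps = start, 0
        while node != target:
            node = rng.choice(adj[node])
            steps += 1
        total += steps
    return total / walks

# A 4-street path "city": street i intersects only streets i-1 and i+1.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
```

For a path of n nodes the end-to-end mean first-passage time is (n - 1)^2, so the simulated estimate for `path` from street 0 to street 3 should land near 9; long access times like this are what reveal poorly connected modules in a real street network.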

  9. Cosmic Statistics of Statistics

    OpenAIRE

    Szapudi, I.; Colombi, S.; Bernardeau, F.

    1999-01-01

    The errors on statistics measured in finite galaxy catalogs are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi (1996) is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly nonlinear to weakly nonlinear scales. The final analytic formu...

  10. Injury Statistics

    Science.gov (United States)

    ... Certification Import Surveillance International Recall Guidance Civil and Criminal Penalties Federal Court Orders & Decisions Research & Statistics Research & Statistics Technical Reports Injury Statistics NEISS ...

  11. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters, which are basic conception and meaning of statistical thermodynamics, Maxwell-Boltzmann's statistics, ensemble, thermodynamics function and fluctuation, statistical dynamics with independent particle system, ideal molecular system, chemical equilibrium and chemical reaction rate in ideal gas mixture, classical statistical thermodynamics, ideal lattice model, lattice statistics and nonideal lattice model, imperfect gas theory on liquid, theory on solution, statistical thermodynamics of interface, statistical thermodynamics of a high molecule system and quantum statistics

  12. National transportation statistics 2011

    Science.gov (United States)

    2011-04-01

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...

  13. Mental Illness Statistics

    Science.gov (United States)

    ... News & Events About Us Home > Health Information Share Statistics Research shows that mental illnesses are common in ... of mental illnesses, such as suicide and disability. Statistics Topics Mental Illness Any Anxiety Disorder ...

  14. Statistical Modeling of Caregiver Burden and Distress among Informal Caregivers of Individuals with Amyotrophic Lateral Sclerosis, Alzheimer's Disease, and Cancer

    Science.gov (United States)

    Cumming, John McClure

    2011-01-01

    Caregiver burden and distress have been associated with informal caregivers. Research findings on the specific aspects of the caregiving role that influence burden are mixed. Factors such as amount of time per day giving care and specific characteristics about the disease progression have been linked to caregiver burden and distress. Other…

  15. Statistical approaches to the analysis of point count data: a little extra information can go a long way

    Science.gov (United States)

    George L. Farnsworth; James D. Nichols; John R. Sauer; Steven G. Fancy; Kenneth H. Pollock; Susan A. Shriner; Theodore R. Simons

    2005-01-01

    Point counts are a standard sampling procedure for many bird species, but lingering concerns still exist about the quality of information produced from the method. It is well known that variation in observer ability and environmental conditions can influence the detection probability of birds in point counts, but many biologists have been reluctant to abandon point...

  16. Comparison of information-theoretic to statistical methods for gene-gene interactions in the presence of genetic heterogeneity

    Directory of Open Access Journals (Sweden)

    Sucheston Lara

    2010-09-01

    Full Text Available Abstract Background Multifactorial diseases such as cancer and cardiovascular diseases are caused by the complex interplay between genes and environment. The detection of these interactions remains challenging due to computational limitations. Information-theoretic approaches use computationally efficient directed search strategies and thus provide a feasible solution to this problem. However, the power of information-theoretic methods for interaction analysis has not been systematically evaluated. In this work, we compare the power and Type I error of an information-theoretic approach to existing interaction analysis methods. Methods The k-way interaction information (KWII) metric for identifying variable combinations involved in gene-gene interactions (GGI) was assessed using several simulated data sets under models of genetic heterogeneity driven by susceptibility-increasing loci with varying allele frequencies, penetrance values, and heritability. The power and proportion of false positives of the KWII were compared to those of multifactor dimensionality reduction (MDR), the restricted partitioning method (RPM), and logistic regression. Results The power of the KWII was considerably greater than that of MDR on all six simulation models examined. For a given disease prevalence at high values of heritability, the power of both RPM and KWII was greater than 95%. For models with low heritability and/or genetic heterogeneity, the power of the KWII was consistently greater than that of RPM; the improvements in power for the KWII over RPM ranged from 4.7% to 14.2% for α = 0.001 in the three models at the lowest heritability values examined. KWII performed similarly to logistic regression. Conclusions Information-theoretic models are flexible and have excellent power to detect GGI under a variety of conditions that characterize complex diseases.
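
    The KWII is an alternating sum of joint entropies over all nonempty subsets of the variable set; for k = 2 it reduces to ordinary mutual information. A minimal sketch from empirical counts (the binary toy data are hypothetical, not from the study):

```python
import math
from itertools import combinations
from collections import Counter

def entropy(columns):
    """Empirical joint entropy (bits) of a tuple of equal-length columns."""
    n = len(columns[0])
    counts = Counter(zip(*columns))
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def kwii(*variables):
    """K-way interaction information: KWII = -sum over nonempty subsets T
    of (-1)^(k - |T|) * H(T); for k = 2 this is the mutual information."""
    k = len(variables)
    total = sum((-1) ** (k - r) * entropy(subset)
                for r in range(1, k + 1)
                for subset in combinations(variables, r))
    return -total

# Two perfectly correlated binary variables carry 1 bit of mutual information.
a = [0, 0, 1, 1] * 25
b = [0, 0, 1, 1] * 25
print(round(kwii(a, b), 6))  # 1.0
```

    The same function applied to three columns (two loci and a phenotype) gives the three-way interaction information used for GGI screening.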

  17. Valley Fever (Coccidioidomycosis) Statistics

    Science.gov (United States)

    ... mouth, throat, and esophagus Vaginal candidiasis Invasive candidiasis Definition Symptoms Risk & Prevention Sources Diagnosis Treatment Statistics Healthcare Professionals More Resources Candida auris General Information ...

  18. Cancer Statistics

    Science.gov (United States)

    ... What Is Cancer? Cancer Statistics Cancer Disparities Cancer Statistics Cancer has a major impact on society in ... success of efforts to control and manage cancer. Statistics at a Glance: The Burden of Cancer in ...

  19. Caregiving Statistics

    Science.gov (United States)

    ... Coping with Alzheimer’s COPD Caregiving Take Care! Caregiver Statistics Statistics on Family Caregivers and Family Caregiving Caregiving Population ... Health Care Caregiver Self-Awareness State by State Statistics Caregiving Population The value of the services family ...

  20. The Balance-Scale Task Revisited: A Comparison of Statistical Models for Rule-Based and Information-Integration Theories of Proportional Reasoning.

    Directory of Open Access Journals (Sweden)

    Abe D Hofman

    Full Text Available We propose and test three statistical models for the analysis of children's responses to the balance scale task, a seminal task to study proportional reasoning. We use a latent class modelling approach to formulate a rule-based latent class model (RB LCM) following from a rule-based perspective on proportional reasoning and a new statistical model, the Weighted Sum Model, following from an information-integration approach. Moreover, a hybrid LCM using item covariates is proposed, combining aspects of both a rule-based and an information-integration perspective. These models are applied to two different datasets: a standard paper-and-pencil test dataset (N = 779) and a dataset collected within an online learning environment that included direct feedback, time-pressure, and a reward system (N = 808). For the paper-and-pencil dataset the RB LCM resulted in the best fit, whereas for the online dataset the hybrid LCM provided the best fit. The standard paper-and-pencil dataset yielded more evidence for distinct solution rules than the online dataset, in which quantitative item characteristics are more prominent in determining responses. These results shed new light on the discussion of sequential rule-based and information-integration perspectives on cognitive development.

  1. Trends in OSHA Compliance Monitoring Data 1979-2011: Statistical Modeling of Ancillary Information across 77 Chemicals.

    Science.gov (United States)

    Sarazin, Philippe; Burstyn, Igor; Kincl, Laurel; Lavoué, Jérôme

    2016-05-01

    The Integrated Management Information System (IMIS) is the largest multi-industry source of exposure measurements available in North America. However, many have suspected that the criteria through which worksites are selected for inspection are related to exposure levels. We investigated associations between exposure levels and ancillary variables in IMIS in order to understand the predictors of high exposure within an enforcement context. We analyzed the association between nine variables (reason for inspection, establishment size, total amount of penalty, Occupational Safety and Health Administration (OSHA) plan, OSHA region, union status, inspection scope, year, and industry) and exposure levels in IMIS using multimodel inference for 77 agents. For each agent, we used two different types of models: (i) logistic models were used for the odds ratio (OR) of exposure being above the threshold limit value (TLV) and (ii) linear models were used for exposure concentrations restricted to detected results to estimate percent increase in exposure level, i.e. relative index of exposure (RIE). Meta-analytic methods were used to combine results for each variable across agents. A total of 511,047 exposure measurements were modeled for logistic models and 299,791 for linear models. Higher exposures were measured during follow-up inspections than planned inspections [meta-OR = 1.61, 95% confidence interval (CI): 1.44-1.81; meta-RIE = 1.06, 95% CI: 1.03-1.09]. Lower exposures were observed for measurements collected under state OSHA plans compared to measurements collected under federal OSHA (meta-OR = 0.82, 95% CI: 0.73-0.92; meta-RIE = 0.86, 95% CI: 0.81-0.91). A 'high' total historical amount of penalty relative to none was associated with higher exposures (meta-OR = 1.54, 95% CI: 1.40-1.71; meta-RIE = 1.18, 95% CI: 1.13-1.23). The relationships observed between exposure levels and ancillary variables across a vast majority of agents suggest that certain elements of OSHA
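
    The meta-analytic combination of per-agent estimates can be illustrated with a minimal fixed-effect inverse-variance sketch on the log-OR scale (the study does not specify its exact pooling model, and the odds ratios below are invented placeholders, not IMIS results):

```python
import math

def pool_odds_ratios(ors_and_cis):
    """Fixed-effect inverse-variance pooling of odds ratios reported with
    95% confidence intervals; returns the pooled OR and its 95% CI."""
    num = den = 0.0
    for or_, lo, hi in ors_and_cis:
        log_or = math.log(or_)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from CI width
        w = 1.0 / se ** 2                                # inverse-variance weight
        num += w * log_or
        den += w
    pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

# Hypothetical per-agent results: (OR, 95% CI lower, 95% CI upper).
agents = [(1.5, 1.2, 1.9), (1.7, 1.3, 2.2), (1.4, 1.0, 2.0)]
meta_or, ci_lo, ci_hi = pool_odds_ratios(agents)
print(f"meta-OR = {meta_or:.2f} (95% CI: {ci_lo:.2f}-{ci_hi:.2f})")
```

    A random-effects variant would add a between-agent variance component to each weight; the fixed-effect version shown here is only the simplest case.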

  2. Business statistics for dummies

    CERN Document Server

    Anderson, Alan

    2013-01-01

    Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w

  3. STARS: An ArcGIS Toolset Used to Calculate the Spatial Information Needed to Fit Spatial Statistical Models to Stream Network Data

    Directory of Open Access Journals (Sweden)

    Erin Peterson

    2014-01-01

    Full Text Available This paper describes the STARS ArcGIS geoprocessing toolset, which is used to calculate the spatial information needed to fit spatial statistical models to stream network data using the SSN package. The STARS toolset is designed for use with a landscape network (LSN), which is a topological data model produced by the FLoWS ArcGIS geoprocessing toolset. An overview of the FLoWS LSN structure and a few particularly useful tools is also provided so that users will have a clear understanding of the underlying data structure that the STARS toolset depends on. This document may be used as an introduction to new users. The methods used to calculate the spatial information and format the final .ssn object are also explicitly described so that users may create their own .ssn object using other data models and software.

  4. Contributions to statistics

    CERN Document Server

    Mahalanobis, P C

    1965-01-01

    Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt

  5. Baseline Statistics of Linked Statistical Data

    NARCIS (Netherlands)

    Scharnhorst, Andrea; Meroño-Peñuela, Albert; Guéret, Christophe

    2014-01-01

    We are surrounded by an ever-increasing ocean of information, everybody will agree to that. We build sophisticated strategies to govern this information: design data models, develop infrastructures for data sharing, build tools for data analysis. Statistical datasets curated by National

  6. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  7. 100 statistical tests

    CERN Document Server

    Kanji, Gopal K

    2006-01-01

    This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.

  8. Harmonic statistics

    Science.gov (United States)

    Eliazar, Iddo

    2017-05-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their 'public relations' for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford's law, and 1/f noise.
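
    A harmonic Poisson process has intensity λ(x) = c/x over the positive half-line, so it can be simulated on an interval [a, b] by time-change inversion: the integrated intensity over [a, x] is c·ln(x/a), and unit-rate Poisson arrivals mapped through x = a·exp(s/c) give the points. A sketch under that standard construction (the constants are illustrative):

```python
import math
import random

def harmonic_poisson(a, b, c, rng=random):
    """One realization of a Poisson process on [a, b] with harmonic
    intensity lambda(x) = c / x, via the time change x = a * exp(s / c)."""
    points, s = [], 0.0
    total = c * math.log(b / a)    # integrated intensity = expected count
    while True:
        s += rng.expovariate(1.0)  # next unit-rate Poisson arrival
        if s > total:
            return points
        points.append(a * math.exp(s / c))

sample = harmonic_poisson(1.0, 1000.0, c=5.0, rng=random.Random(42))
# The expected number of points on [1, 1000] is 5 * ln(1000), about 34.5.
print(len(sample), round(min(sample), 3), round(max(sample), 3))
```

    The scale invariance noted in the abstract is visible here: the expected number of points in [x, rx] is c·ln(r), independent of x.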

  9. Statistical physics

    CERN Document Server

    Sadovskii, Michael V

    2012-01-01

    This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

  10. Harmonic statistics

    Energy Technology Data Exchange (ETDEWEB)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

    2017-05-15

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  11. Statistical optics

    CERN Document Server

    Goodman, Joseph W

    2015-01-01

    This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems This book covers a variety of statistical problems in optics, including both theory and applications.  The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i

  12. CMS Statistics Reference Booklet

    Data.gov (United States)

    U.S. Department of Health & Human Services — The annual CMS Statistics reference booklet provides a quick reference for summary information about health expenditures and the Medicare and Medicaid health...

  13. EDI Performance Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — This section contains statistical information and reports related to the percentage of electronic transactions being sent to Medicare contractors in the formats...

  14. Illinois travel statistics, 2008

    Science.gov (United States)

    2009-01-01

    The 2008 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...

  15. Illinois travel statistics, 2010

    Science.gov (United States)

    2011-01-01

    The 2010 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...

  16. Illinois travel statistics, 2009

    Science.gov (United States)

    2010-01-01

    The 2009 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...

  17. Transport statistics 1996

    CSIR Research Space (South Africa)

    Shepperson, L

    1997-12-01

    Full Text Available This publication contains transport and related statistics on roads, vehicles, infrastructure, passengers, freight, rail, air, maritime and road traffic, and international comparisons. The information compiled in this publication has been gathered...

  18. Reversible Statistics

    DEFF Research Database (Denmark)

    Tryggestad, Kjell

    2004-01-01

    The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work...... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit...... by accounting for the significance of the materials and the equipment that enters into the production of statistics. Key words: Reversible statistics, diverse materials, constructivism, economics, science, and technology....

  19. Statistics Clinic

    Science.gov (United States)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  20. Statistical Physics

    CERN Document Server

    Wannier, Gregory Hugh

    1966-01-01

    Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for

  1. Semiconductor statistics

    CERN Document Server

    Blakemore, J S

    1962-01-01

    Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co

  2. Multiparametric statistics

    CERN Document Server

    Serdobolskii, Vadim Ivanovich

    2007-01-01

    This monograph presents the mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size but possibly also much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way for the solution of central problems of multivariate statistics, which up until now have not been solved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems, and, depending on the data, can be inefficient, unstable and even not applicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...

  3. Accident Statistics

    Data.gov (United States)

    Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...

  4. Reversible Statistics

    DEFF Research Database (Denmark)

    Tryggestad, Kjell

    2004-01-01

    The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work...... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit...... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...

  5. Vital statistics

    CERN Document Server

    MacKenzie, Dana

    2004-01-01

    The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue with the expansion of knowledge about the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, physicists need to identify the brand of statistics that will be suitable for them, and make a choice between the Bayesian and the frequentist approach. (Edited abstract).

  6. Estimation of low-flow statistics at ungaged sites on streams in the Lower Hudson River Basin, New York, from data in geographic information systems

    Science.gov (United States)

    Randall, Allan D.; Freehafer, Douglas A.

    2017-08-02

    A variety of watershed properties available in 2015 from geographic information systems were tested in regression equations to estimate two commonly used statistical indices of the low flow of streams, namely the lowest flows averaged over 7 consecutive days that have a 1 in 10 and a 1 in 2 chance of not being exceeded in any given year (7-day, 10-year and 7-day, 2-year low flows). The equations were based on streamflow measurements in 51 watersheds in the Lower Hudson River Basin of New York during the years 1958–1978, when the number of streamflow measurement sites on unregulated streams was substantially greater than in subsequent years. These low-flow indices are chiefly a function of the area of surficial sand and gravel in the watershed; more precisely, 7-day, 10-year and 7-day, 2-year low flows both increase in proportion to the area of sand and gravel deposited by glacial meltwater, whereas 7-day, 2-year low flows also increase in proportion to the area of postglacial alluvium. Both low-flow statistics are also functions of mean annual runoff (a measure of net water input to the watershed from precipitation) and area of swamps and poorly drained soils in or adjacent to surficial sand and gravel (where groundwater recharge is unlikely and riparian water loss to evapotranspiration is substantial). Small but significant refinements in estimation accuracy resulted from the inclusion of two indices of stream geometry, channel slope and length, in the regression equations. Most of the regression analysis was undertaken with the ordinary least squares method, but four equations were replicated by using weighted least squares to provide a more realistic appraisal of the precision of low-flow estimates. The most accurate estimation equations tested in this study explain nearly 84 and 87 percent of the variation in 7-day, 10-year and 7-day, 2-year low flows, respectively, with standard errors of 0.032 and 0.050 cubic feet per second per square mile. The equations
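
    The ordinary-least-squares approach described above can be sketched on synthetic data (every predictor, coefficient, and value below is invented for illustration; these are not the study's equations):

```python
import numpy as np

# Hypothetical illustration: regress a low-flow statistic (e.g. log of the
# 7-day, 10-year low flow) on watershed properties such as area of glacial
# sand and gravel and mean annual runoff. All numbers are made up.
rng = np.random.default_rng(0)
n = 51                                   # number of gaged watersheds
sand_gravel = rng.uniform(0, 20, n)      # mi^2 of surficial sand and gravel
runoff = rng.uniform(15, 30, n)          # mean annual runoff, inches
log_q7_10 = 0.08 * sand_gravel + 0.05 * runoff - 2.0 + rng.normal(0, 0.1, n)

# Ordinary least squares via the design matrix [1, sand_gravel, runoff].
X = np.column_stack([np.ones(n), sand_gravel, runoff])
coef, *_ = np.linalg.lstsq(X, log_q7_10, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((log_q7_10 - pred) ** 2) / np.sum(
    (log_q7_10 - log_q7_10.mean()) ** 2)
print(coef.round(3), round(r2, 3))
```

    A weighted least squares variant, as used in the study for appraising estimation precision, would replace the plain `lstsq` call with a solve against weight-scaled rows of `X` and the response.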

  7. Spina Bifida Data and Statistics

    Science.gov (United States)

    ... Us Information For… Media Policy Makers Data and Statistics Recommend on Facebook Tweet Share Compartir Spina bifida ... the spine. Read below for the latest national statistics on spina bifida in the United States. In ...

  8. HPV-Associated Cancers Statistics

    Science.gov (United States)

    ... What CDC Is Doing Related Links Stay Informed Statistics for Other Kinds of Cancer Breast Cervical Colorectal ( ... Vaginal and Vulvar Cancer Home HPV-Associated Cancer Statistics Language: English (US) Español (Spanish) Recommend on Facebook ...

  9. Birth Defects Data and Statistics

    Science.gov (United States)

    ... Submit" /> Information For… Media Policy Makers Data & Statistics Recommend on Facebook Tweet Share Compartir On This ... and critical. Read below for the latest national statistics on the occurrence of birth defects in the ...

  10. Transportation statistics annual report 2010

    Science.gov (United States)

    2011-01-01

    The Transportation Statistics Annual Report (TSAR) presents data and information compiled by the Bureau of Transportation Statistics (BTS), a component of the U.S. Department of Transportation's (USDOT's) Research and Innovative Technology Admini...

  11. Statistical mechanics

    CERN Document Server

    Schwabl, Franz

    2006-01-01

    The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...

  12. Statistical mechanics

    CERN Document Server

    Jana, Madhusudan

    2015-01-01

    Statistical mechanics is self sufficient, written in a lucid manner, keeping in mind the exam system of the universities. Need of study this subject and its relation to Thermodynamics is discussed in detail. Starting from Liouville theorem gradually, the Statistical Mechanics is developed thoroughly. All three types of Statistical distribution functions are derived separately with their periphery of applications and limitations. Non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of Liquid He-II and the corresponding models have been depicted. White dwarfs and condensed matter physics, transport phenomenon - thermal and electrical conductivity, Hall effect, Magneto resistance, viscosity, diffusion, etc. are discussed. Basic understanding of Ising model is given to explain the phase transition. The book ends with a detailed coverage to the method of ensembles (namely Microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...

  13. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  14. Statistical mechanics

    CERN Document Server

    Davidson, Norman

    2003-01-01

    Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses. Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody

  15. AP statistics

    CERN Document Server

    Levine-Wissing, Robin

    2012-01-01

    All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep

  16. Statistical Physics

    CERN Document Server

    Mandl, Franz

    1988-01-01

    The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient

  17. Statistical Computing

    Indian Academy of Sciences (India)

    Statistical Computing - Understanding Randomness and Random Numbers. Sudhakar Kunte. Series Article, Resonance – Journal of Science Education, Volume 4, Issue 10, October 1999, pp 16-21.

  18. Statistical thermodynamics

    CERN Document Server

    Schrödinger, Erwin

    1952-01-01

    Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics - classical, quantum, Bose-Einstein, Fermi-Dirac, and more. The work also includes discussions of the Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, the problem of radiation, and much more.

  19. Business statistics I essentials

    CERN Document Server

    Clark, Louise

    2014-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Business Statistics I includes descriptive statistics, introduction to probability, probability distributions, sampling and sampling distributions, interval estimation, and hypothesis t

  20. Energy Statistics

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources

  1. Statistical Optics

    Science.gov (United States)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  2. Statistical Pattern Recognition

    CERN Document Server

    Webb, Andrew R

    2011-01-01

    Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions.  It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,

  3. AP statistics crash course

    CERN Document Server

    D'Alessio, Michael

    2012-01-01

    AP Statistics Crash Course - Gets You a Higher Advanced Placement Score in Less Time Crash Course is perfect for the time-crunched student, the last-minute studier, or anyone who wants a refresher on the subject. AP Statistics Crash Course gives you: Targeted, Focused Review - Study Only What You Need to Know Crash Course is based on an in-depth analysis of the AP Statistics course description outline and actual Advanced Placement test questions. It covers only the information tested on the exam, so you can make the most of your valuable study time. Our easy-to-read format covers: exploring da

  4. Information

    International Nuclear Information System (INIS)

    Boyard, Pierre.

    1981-01-01

    Fear of nuclear energy, and more particularly of radioactive wastes, is analyzed in its sociological context. Everyone agrees on the need for information; information is available, but its diffusion poses a problem. Reactions of the public are analyzed, and journalists, scientists and teachers have a role to play [fr]

  5. Energy statistics yearbook 2000

    International Nuclear Information System (INIS)

    2002-01-01

    The Energy Statistics Yearbook 2000 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-third in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  6. Energy statistics yearbook 2001

    International Nuclear Information System (INIS)

    2004-01-01

    The Energy Statistics Yearbook 2001 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-fifth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  7. Energy statistics yearbook 2002

    International Nuclear Information System (INIS)

    2005-01-01

    The Energy Statistics Yearbook 2002 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-sixth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  8. Energy statistics

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    World data from the United Nation's latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption, world natural gas plant liquids production, world LP-gas production, imports, exports, and consumption, world residual fuel oil production, imports, exports, and consumption, world lignite production, imports, exports, and consumption, world peat production and consumption, world electricity production, imports, exports, and consumption (Table 80), and world nuclear electric power production

  9. Statistical mechanics

    CERN Document Server

    Sheffield, Scott

    2009-01-01

    In recent years, statistical mechanics has been increasingly recognized as a central domain of mathematics. Major developments include the Schramm-Loewner evolution, which describes two-dimensional phase transitions, random matrix theory, renormalization group theory and the fluctuations of random surfaces described by dimers. The lectures contained in this volume present an introduction to recent mathematical progress in these fields. They are designed for graduate students in mathematics with a strong background in analysis and probability. This book will be of particular interest to graduate students and researchers interested in modern aspects of probability, conformal field theory, percolation, random matrices and stochastic differential equations.

  10. Official Statistics and Statistics Education: Bridging the Gap

    Directory of Open Access Journals (Sweden)

    Gal Iddo

    2017-03-01

    This article aims to challenge official statistics providers and statistics educators to ponder how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens’ access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding of official statistics.

  11. Florida Postsecondary Education Security Information Act. Annual Report of Campus Crime Statistics 1991-93 and Annual Assessment of Physical Plant Safety 1994.

    Science.gov (United States)

    Florida State Dept. of Education, Tallahassee. Office of Postsecondary Education Coordination.

    This state-mandated report presents crime statistics at higher education institutions in Florida and an assessment of physical plant security. The crime data list numbers of homicides, forcible sex offenses, robberies, aggravated assaults, burglaries/breaking and entering, larcenies and thefts, and motor vehicle thefts for each state university,…

  12. Helping Students to Climb the Mountain: A Study to Inform the Development of a Resource to Improve the Learning of Statistics in Psychology

    Science.gov (United States)

    Davies, Emma L.; Morys-Carter, Wakefield L.; Paltoglou, Aspasia E.

    2015-01-01

    Students often struggle with learning about statistics, which encompass a large proportion of a psychology degree. This pilot study explored how first- and final-year students reflected on their experiences of being taught this topic, in order to identify needs that could be addressed in a project to improve their learning. First-year students…

  13. Education in OECD Countries: A Compendium of Statistical Information, 1988-89 and 1989-90 = l'Enseignement dans les pays de l'OCDE: Recueil d'informations statistiques, 1988-89 et 1989-90.

    Science.gov (United States)

    Organisation for Economic Cooperation and Development, Paris (France).

    This report is an annual update on education statistics for member countries in the Organisation for Economic Cooperation and Development (OECD), with the cooperation of UNESCO and the Statistical Office of the European Union. Some changes from earlier versions of "Education in OECD Countries" are introduced in this edition. Data comes from…

  14. Reports on internet traffic statistics

    NARCIS (Netherlands)

    Hoogesteger, Martijn; de Oliveira Schmidt, R.; Sperotto, Anna; Pras, Aiko

    2013-01-01

    Internet traffic statistics can provide valuable information to network analysts and researchers about the way nowadays networks are used. In the past, such information was provided by Internet2 in a public website called Internet2 NetFlow: Weekly Reports. The website reported traffic statistics

  15. The Concise Encyclopedia of Statistics

    CERN Document Server

    Dodge, Yadolah

    2008-01-01

    The Concise Encyclopedia of Statistics presents the essential information about statistical tests, concepts, and analytical methods in language that is accessible to practitioners and students of the vast community using statistics in medicine, engineering, physical science, life science, social science, and business/economics. The reference is alphabetically arranged to provide quick access to the fundamental tools of statistical methodology and biographies of famous statisticians. The more than 500 entries include definitions, history, mathematical details, limitations, examples, references,

  16. Statistical learning and prejudice.

    Science.gov (United States)

    Madison, Guy; Ullén, Fredrik

    2012-12-01

    Human behavior is guided by evolutionarily shaped brain mechanisms that make statistical predictions based on limited information. Such mechanisms are important for facilitating interpersonal relationships, avoiding dangers, and seizing opportunities in social interaction. We thus suggest that it is essential for analyses of prejudice and prejudice reduction to take the predictive accuracy and adaptivity of the studied prejudices into account.

  17. Industrial commodity statistics yearbook 2001. Production statistics (1992-2001)

    International Nuclear Information System (INIS)

    2003-01-01

    This is the thirty-fifth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1992-2001 for about 200 countries and areas

  18. Industrial commodity statistics yearbook 2000. Production statistics (1991-2000)

    International Nuclear Information System (INIS)

    2002-01-01

    This is the thirty-third in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. Most of the statistics refer to the ten-year period 1991-2000 for about 200 countries and areas

  19. Industrial commodity statistics yearbook 2002. Production statistics (1993-2002)

    International Nuclear Information System (INIS)

    2004-01-01

    This is the thirty-sixth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title 'The Growth of World industry' and the next eight editions under the title 'Yearbook of Industrial Statistics'. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1993-2002 for about 200 countries and areas

  20. Stupid statistics!

    Science.gov (United States)

    Tellinghuisen, Joel

    2008-01-01

    The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions-Gaussian, chi-square, and t-is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
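
    The abstract's central tools - linear least squares in matrix notation and Monte Carlo illustration of parameter distributions - can be sketched briefly. The sketch below is illustrative only (the true line, noise level, and repetition count are invented, and NumPy stands in for whatever software the article used): the Monte Carlo spread of the fitted parameters should reproduce the analytic standard errors obtained from the variance-covariance matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical true line y = a + b*x with Gaussian noise of known sigma.
a_true, b_true, sigma = 1.0, 2.0, 0.5
x = np.linspace(0, 10, 20)
X = np.column_stack([np.ones_like(x), x])   # design matrix for intercept + slope

# Analytic parameter covariance for linear least squares: sigma^2 (X^T X)^{-1}
cov_analytic = sigma**2 * np.linalg.inv(X.T @ X)

# Monte Carlo: refit many synthetic data sets and examine the spread of estimates.
estimates = []
for _ in range(5000):
    y = a_true + b_true * x + rng.normal(0.0, sigma, size=x.size)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    estimates.append(beta)
estimates = np.asarray(estimates)

print("analytic std errors:   ", np.sqrt(np.diag(cov_analytic)))
print("Monte Carlo std errors:", estimates.std(axis=0))
```

    The two printed rows should agree to within Monte Carlo noise, which is the sense in which the key probability distributions can be "illustrated with Monte Carlo calculations".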

  1. Data and Statistics: Women and Heart Disease

    Science.gov (United States)


  2. Breakthroughs in statistics

    CERN Document Server

    Johnson, Norman

    This is the third volume of a collection of seminal papers in the statistical sciences written during the past 110 years. These papers have each had an outstanding influence on the development of statistical theory and practice over the last century. Each paper is preceded by an introduction written by an authority in the field providing background information and assessing its influence. Volume III concentrates on articles from the 1980s while including some earlier articles not included in Volumes I and II. Samuel Kotz is Professor of Statistics in the College of Business and Management at the University of Maryland. Norman L. Johnson is Professor Emeritus of Statistics at the University of North Carolina. Also available: Breakthroughs in Statistics Volume I: Foundations and Basic Theory Samuel Kotz and Norman L. Johnson, Editors 1993. 631 pp. Softcover. ISBN 0-387-94037-5 Breakthroughs in Statistics Volume II: Methodology and Distribution Samuel Kotz and Norman L. Johnson, Edi...

  3. Statistics for business

    CERN Document Server

    Waller, Derek L

    2008-01-01

    Statistical analysis is essential to business decision-making and management, but the underlying theory of data collection, organization and analysis is one of the most challenging topics for business students and practitioners. This user-friendly text and CD-ROM package will help you to develop strong skills in presenting and interpreting statistical information in a business or management environment. Based entirely on using Microsoft Excel rather than more complicated applications, it includes a clear guide to using Excel with the key functions employed in the book, a glossary of terms and

  4. Statistics II essentials

    CERN Document Server

    Milewski, Emil G

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Statistics II discusses sampling theory, statistical inference, independent and dependent variables, correlation theory, experimental design, count data, chi-square test, and time se

  5. Statistical secrecy and multibit commitments

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Pedersen, Torben P.; Pfitzmann, Birgit

    1998-01-01

    We present and compare definitions of "statistically hiding" protocols, and we propose a novel statistically hiding commitment scheme. Informally, a protocol statistically hides a secret if a computationally unlimited adversary who conducts the protocol with the owner of the secret learns almost nothing about it. One definition is based on the L1-norm distance between probability distributions, the other on information theory. We prove that the two definitions are essentially equivalent. We also show that statistical counterparts of definitions of computational secrecy are essentially equivalent...
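
    The L1-norm distance underlying the first definition is straightforward to compute for discrete distributions. The numbers below are invented for illustration and are not the paper's construction; they merely show the quantity being bounded: when the distance between the transcript distributions induced by committing to 0 versus 1 is negligible, even an unbounded adversary learns almost nothing.

```python
import numpy as np

def l1_distance(p, q):
    """L1 (variational) distance between two discrete probability distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.abs(p - q).sum()

# Toy transcript distributions for a commitment to bit 0 vs bit 1 (hypothetical).
p_commit0 = np.array([0.26, 0.24, 0.25, 0.25])
p_commit1 = np.array([0.25, 0.25, 0.24, 0.26])

print(l1_distance(p_commit0, p_commit1))  # small distance => transcripts nearly indistinguishable
```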

  6. USING STATISTICAL SURVEY IN ECONOMICS

    Directory of Open Access Journals (Sweden)

    Delia TESELIOS

    2012-01-01

    The statistical survey is an effective method of statistical investigation that involves gathering quantitative data; it is often preferred in statistical reports because observing only part of a population yields information about the population as a whole. Because of the information they provide, surveys are used in many research areas. In economics, they support decision making, the choice of competitive strategies, the analysis of particular economic phenomena, and the formulation of forecasts. The economic study presented in this paper illustrates how simple random sampling can be used to analyze the parking-space situation in a given locality.
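
    The method named in the abstract, simple random sampling, can be sketched as follows. The population size, occupancy rate, and sample size are hypothetical, not the paper's data; the sketch just shows how a sample proportion and a normal-approximation confidence interval estimate a population quantity by observing only part of it.

```python
import random

random.seed(42)

# Hypothetical population: 2000 parking spaces, each occupied (1) or free (0).
population = [1] * 1300 + [0] * 700          # true occupancy rate = 0.65
random.shuffle(population)

# Simple random sample without replacement.
n = 200
sample = random.sample(population, n)
p_hat = sum(sample) / n                       # estimated occupancy rate

# Approximate 95% confidence interval for the proportion (normal approximation).
se = (p_hat * (1 - p_hat) / n) ** 0.5
print(p_hat, p_hat - 1.96 * se, p_hat + 1.96 * se)
```

    With a sample of 200 out of 2000 spaces, the estimate typically lands within a few percentage points of the true rate, which is the efficiency argument the abstract makes for surveys.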

  7. Use of The International Classification of Functioning, Disability and Health (ICF) as a conceptual framework and common language for disability statistics and health information systems

    Directory of Open Access Journals (Sweden)

    Kostanjsek Nenad

    2011-05-01

    A common framework for describing functional status information is needed in order to make this information comparable and of value. The World Health Organization’s International Classification of Functioning, Disability and Health (ICF), which has been approved by all its member states, provides this common language and framework. The article provides an overview of the ICF taxonomy, introduces the conceptual model which underpins the ICF and elaborates on how the ICF is used at population and clinical level. Furthermore, the article presents key features of the ICF tooling environment and outlines current and future developments of the classification.

  8. 1992 Energy statistics Yearbook

    International Nuclear Information System (INIS)

    1994-01-01

    The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from annual questionnaires distributed by the United Nations Statistical Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistical Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  9. Energy statistics manual

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-07-01

    Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at a country level as well as at an international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market of oil -- which is the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed and by whom. In view of the role and importance of energy in world development, one would expect basic energy information to be readily available and reliable. This is not always the case, and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.

  10. Conformity and statistical tolerancing

    Science.gov (United States)

    Leblond, Laurent; Pillet, Maurice

    2018-02-01

    Statistical tolerancing was first proposed by Shewhart (Economic Control of Quality of Manufactured Product, 1931; reprinted 1980 by ASQC). In spite of this long history, its use remains moderate. One of the probable reasons for this low utilization is undoubtedly the difficulty for designers to anticipate the risks of this approach. Arithmetic (worst-case) tolerancing allows a simple interpretation: conformity is defined by the presence of the characteristic in an interval. Statistical tolerancing is more complex in its definition: an interval is not sufficient to define conformity. To justify the statistical tolerancing formula used by designers, a tolerance interval should be interpreted as the interval where most of the parts produced should probably be located. This tolerance is justified by considering a conformity criterion for the parts that guarantees low offsets on the latter characteristics. Unlike traditional arithmetic tolerancing, statistical tolerancing requires a sustained exchange of information between design and manufacture to be used safely. This paper proposes a formal definition of conformity, which we apply successively to quadratic and arithmetic tolerancing. We introduce a concept of concavity, which helps us to demonstrate the link between the tolerancing approach and conformity. We use this concept to demonstrate the various acceptable propositions of statistical tolerancing (in the decentring-dispersion space).
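
    The contrast between the two approaches can be sketched numerically; the component tolerances below are invented for illustration. Arithmetic (worst-case) tolerancing sums the individual tolerances, while the quadratic (statistical) formula adds them in quadrature, yielding a tighter stack when deviations are independent.

```python
import math

# Hypothetical symmetric tolerances (+/- t_i) of four stacked components.
tolerances = [0.10, 0.05, 0.08, 0.02]

# Arithmetic (worst-case) stack: every part at its extreme simultaneously.
worst_case = sum(tolerances)

# Quadratic (statistical, root-sum-square) stack: independent deviations
# add in quadrature.
rss = math.sqrt(sum(t ** 2 for t in tolerances))

print(worst_case, rss)
```

    The statistical stack is markedly smaller than the worst case, which is precisely why its conformity interpretation requires the more careful definition the paper develops.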

  11. Methodology of the individual detection of cerebral activations by positrons emission tomography: statistical characterization of noise images and introduction of anatomical information

    International Nuclear Information System (INIS)

    Antoine, M.J.

    1996-01-01

    The work presented here was done in the context of the non-invasive study of the human brain, using metabolic imaging techniques (positron emission tomography, or PET) and anatomical imaging techniques (magnetic resonance imaging, or MRI). The objective of this thesis was to use jointly the information given by these two modalities, with the aim of improving the individual detection of cerebral activations. (N.C.)

  12. Concurrent Phenomena at the Reaction Path of the SN2 Reaction CH3Cl + F−. Information Planes and Statistical Complexity Analysis

    Directory of Open Access Journals (Sweden)

    Moyocoyani Molina-Espíritu

    2013-09-01

    An information-theoretical complexity analysis of the SN2 exchange reaction for CH3Cl + F− is performed in both position and momentum spaces by means of the following composite functionals of the one-particle density: D-L and I-J planes and the Fisher-Shannon (FS) and López-Ruiz-Mancini-Calbet (LMC) shape complexities. It was found that all the chemical concepts traditionally assigned to elementary reactions, such as the breaking/forming regions (B-B/F), the charge transfer/reorganization and the charge repulsion, can be unraveled from the phenomenological analysis performed in this study through aspects of localizability, uniformity and disorder associated with the information-theoretical functionals. In contrast, no energy-based functionals can reveal the above-mentioned chemical concepts. In addition, it is found that the TS critical point for this reaction does not show any chemical meaning (other than the barrier height) as compared with the concurrent processes revealed by the information-theoretical analysis. Instead, it is apparent from this study that a maximally delocalized state can be identified in the transition region, associated with the charge transfer process; a new concurrent phenomenon associated with the charge transfer region (CT) for the ion-complex is thereby identified. Finally, it is discussed why most of the chemical features of interest (e.g., CT, B-B/F) are only revealed when some information-theoretic properties, such as localizability, uniformity and disorder, are taken into account.

  13. Statistics I essentials

    CERN Document Server

    Milewski, Emil G

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Statistics I covers frequency distributions, numerical methods of describing data, measures of variability, parameters of distributions, probability theory, and distributions.

  14. On quantum statistical inference

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Gill, Richard D.; Jupp, Peter E.

    Recent developments in the mathematical foundations of quantum mechanics have brought the theory closer to that of classical probability and statistics. On the other hand, the unique character of quantum physics sets many of the questions addressed apart from those met classically in stochastics. Furthermore, concurrent advances in experimental techniques and in the theory of quantum computation have led to a strong interest in questions of quantum information, in particular in the sense of the amount of information about unknown parameters in given observational data or accessible through various ...

  15. Application of information statistical theory to the description of the effect of heat conduction on the chemical reaction rate in gases

    International Nuclear Information System (INIS)

    Fort, J.; Cukrowski, A.S.

    1998-01-01

    The effect of the heat flux on the rate of chemical reaction in dilute gases is shown to be important for reactions characterized by high activation energies and in the presence of very large temperature gradients. This effect, obtained from the second-order terms in the distribution function (similar to those obtained in the Burnett approximation to the solution of the Boltzmann equation), is derived on the basis of information theory. It is shown that the analytical results describing the effect are simpler if the kinetic definition for the nonequilibrium temperature is introduced than if the thermodynamic definition is introduced. The numerical results are nearly the same for both definitions. (author)

  16. Statistical methods in nonlinear dynamics

    Indian Academy of Sciences (India)

    Sensitivity to initial conditions in nonlinear dynamical systems leads to exponential divergence of trajectories that are initially arbitrarily close, and hence to unpredictability. Statistical methods have been found to be helpful in extracting useful information about such systems. In this paper, we review briefly some statistical ...

  17. CLIA Statistical Tables and Graphs

    Data.gov (United States)

    U.S. Department of Health & Human Services — The information in the Downloads section below for the Overview of CLIA Statistics since 1993 and the CLIA Top Ten Deficiencies in the Nation was obtained from the...

  18. Statistical Profile of Older Hispanics

    Science.gov (United States)

    A Statistical Profile of Older Hispanic Americans INTRODUCTION In 2014, there were 46.2 million Americans aged 65 and over and ... were Hispanic. Principal sources of data for this Profile are the most current information available from the ...

  19. Perception in statistical graphics

    Science.gov (United States)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  20. READING STATISTICS AND RESEARCH

    Directory of Open Access Journals (Sweden)

    Reviewed by Yavuz Akbulut

    2008-10-01

    The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of the previous editions might check the interpretation of skewness and kurtosis indices in the third edition (p. 34) and in the fifth edition (p. 29) to see how the author revisits every single detail. Theory and practice always go hand in hand in all editions of the book. Re-reading previous editions (e.g. the third edition) before reading the fifth edition gives the impression that the author never stops ameliorating his instructional text writing methods. In brief, “Reading Statistics and Research” is among the best sources showing research consumers how to understand and critically assess the statistical information and research results contained in technical research reports. In this respect, the review written by Mirko Savić in Panoeconomicus (2008, 2, pp. 249-252) will help readers to get a more detailed overview of each chapter. I cordially urge beginning researchers to pick up a highlighter and conduct a detailed reading of the book. A thorough reading of the source will make researchers quite selective in appreciating the harmony between the data analysis, results and discussion sections of typical journal articles. If interested, beginning researchers might begin with this book to grasp the basics of research statistics, and prop up their critical research reading skills with some statistics package applications through the help of Dr. Andy Field’s book, Discovering Statistics Using SPSS (second edition, published by Sage in 2005).

  1. Use of remote sensing, geographic information systems, and spatial statistics to assess spatio-temporal population dynamics of Heterodera glycines and soybean yield quantity and quality

    Science.gov (United States)

    Moreira, Antonio Jose De Araujo

    Soybean, Glycine max (L.) Merr., is an important source of oil and protein worldwide, and soybean cyst nematode (SCN), Heterodera glycines, is among the most important yield-limiting factors in soybean production worldwide. Early detection of SCN is difficult because soybean plants infected by SCN often do not exhibit visible symptoms. It was hypothesized, however, that reflectance data obtained by remote sensing from soybean canopies may be used to detect plant stress caused by SCN infection. Moreover, reflectance measurements may be related to soybean growth and yield. Two field experiments were conducted from 2000 to 2002 to study the relationships among reflectance data, quantity and quality of soybean yield, and SCN population densities. The best relationships between reflectance and the quantity of soybean grain yield occurred when reflectance data were obtained late August to early September. Similarly, reflectance was best related to seed oil and seed protein content and seed size when measured during late August/early September. Grain quality-reflectance relationships varied spatially and temporally. Reflectance measured early or late in the season had the best relationships with SCN population densities measured at planting. Soil properties likely affected reflectance measurements obtained at the beginning of the season and somehow may have been related to SCN population densities at planting. Reflectance data obtained at the end of the growing season likely was affected by early senescence of SCN-infected soybeans. Spatio-temporal aspects of SCN population densities in both experiments were assessed using spatial statistics and regression analyses. In the 2000 and 2001 growing seasons, spring-to-fall changes in SCN population densities were best related to SCN population densities at planting for both experiments. However, within-season changes in SCN population densities were best related to SCN population densities at harvest for both experiments in

  2. Identification of environmental parameters and risk mapping of visceral leishmaniasis in Ethiopia by using geographical information systems and a statistical approach

    Directory of Open Access Journals (Sweden)

    Teshome Tsegaw

    2013-05-01

    Visceral leishmaniasis (VL), a vector-borne disease strongly influenced by environmental factors, has (re-)emerged in Ethiopia during the last two decades and is currently of increasing public health concern. Based on VL incidence in each locality (kebele) documented from federal or regional health bureaus and/or hospital records in the country, geographical information systems (GIS), coupled with binary and multivariate logistic regression methods, were employed to develop a risk map for Ethiopia with respect to VL based on soil type, altitude, rainfall, slope and temperature. The risk model was subsequently validated in selected sites. This environmental VL risk model provided an overall prediction accuracy of 86%, with mean land surface temperature and soil type found to be the best predictors of VL. The total population at risk was estimated at 3.2 million according to the national population census in 2007. The approach presented here should facilitate the identification of priority areas for intervention and the monitoring of trends, as well as providing input for further epidemiological and applied research with regard to this disease in Ethiopia.
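
    The record's risk model is a logistic regression on environmental predictors. A minimal sketch of how such a fitted model turns predictor values into a risk probability follows; the coefficients and predictor names are invented for illustration, not taken from the study.

```python
import math

# Hypothetical fitted coefficients of a binary logistic regression of VL
# presence on environmental predictors (values invented for illustration).
INTERCEPT = -4.0
COEF = {"mean_lst_celsius": 0.12, "black_cotton_soil": 1.8}

def vl_risk(mean_lst_celsius: float, black_cotton_soil: int) -> float:
    """Predicted probability of VL presence for one locality (logistic link)."""
    z = (INTERCEPT
         + COEF["mean_lst_celsius"] * mean_lst_celsius
         + COEF["black_cotton_soil"] * black_cotton_soil)
    return 1.0 / (1.0 + math.exp(-z))

# A hot locality on black cotton soil versus a cooler one on other soil.
print(vl_risk(30.0, 1))
print(vl_risk(20.0, 0))
```

    Evaluating the fitted model over a grid of localities in a GIS is what produces the continuous risk surface described in the abstract.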

  3. Application of Fragment Ion Information as Further Evidence in Probabilistic Compound Screening Using Bayesian Statistics and Machine Learning: A Leap Toward Automation.

    Science.gov (United States)

    Woldegebriel, Michael; Zomer, Paul; Mol, Hans G J; Vivó-Truyols, Gabriel

    2016-08-02

    In this work, we introduce an automated, efficient, and elegant model to combine all pieces of evidence (e.g., expected retention times, peak shapes, isotope distributions, fragment-to-parent ratio) obtained from liquid chromatography-tandem mass spectrometry (LC-MS/MS) data for screening purposes. Combining all these pieces of evidence requires a careful assessment of the uncertainties in the analytical system as well as of all possible outcomes. To date, the majority of the existing algorithms are highly dependent on user input parameters. Additionally, the screening process is tackled as a deterministic problem. In this work we present a Bayesian framework to deal with the combination of all these pieces of evidence. Contrary to conventional algorithms, the information is treated in a probabilistic way, and a final probability assessment of the presence/absence of a compound feature is computed. Additionally, all the necessary parameters except the chromatographic band broadening are learned from the data in the training and learning phase of the algorithm, avoiding the introduction of a large number of user-defined parameters. The proposed method was validated with a large data set and has shown improved sensitivity and specificity in comparison to a threshold-based commercial software package.
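
    The core idea of combining independent pieces of evidence probabilistically can be sketched with Bayes' rule in odds form; the likelihood-ratio values below are invented, not the paper's. Each piece of evidence multiplies the prior odds of the compound being present by its likelihood ratio.

```python
def posterior_probability(prior: float, likelihood_ratios: list) -> float:
    """Combine independent evidence via Bayes' rule in odds form.

    Each likelihood ratio is P(evidence | present) / P(evidence | absent).
    """
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Hypothetical evidence for one candidate compound: retention-time match,
# plausible peak shape, isotope-pattern match, fragment-to-parent ratio.
prior = 0.5  # no prior preference for presence versus absence
lrs = [4.0, 1.5, 6.0, 0.8]  # invented likelihood ratios

print(posterior_probability(prior, lrs))
```

    A ratio above 1 pushes the posterior toward "present", a ratio below 1 toward "absent"; this is how weak counter-evidence (here the 0.8) tempers strong supporting evidence without a hard threshold.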

  4. Methods of statistical model estimation

    CERN Document Server

    Hilbe, Joseph

    2013-01-01

    Methods of Statistical Model Estimation examines the most important and popular methods used to estimate parameters for statistical models and provide informative model summary statistics. Designed for R users, the book is also ideal for anyone wanting to better understand the algorithms used for statistical model fitting. The text presents algorithms for the estimation of a variety of regression procedures using maximum likelihood estimation, iteratively reweighted least squares regression, the EM algorithm, and MCMC sampling. Fully developed, working R code is constructed for each method. Th

  5. Statistical modeling for degradation data

    CERN Document Server

    Lio, Yuhlong; Ng, Hon; Tsai, Tzong-Ru

    2017-01-01

    This book focuses on the statistical aspects of the analysis of degradation data. In recent years, degradation data analysis has come to play an increasingly important role in different disciplines such as reliability, public health sciences, and finance. For example, information on products’ reliability can be obtained by analyzing degradation data. In addition, statistical modeling and inference techniques have been developed on the basis of different degradation measures. The book brings together experts engaged in statistical modeling and inference, presenting and discussing important recent advances in degradation data analysis and related applications. The topics covered are timely and have considerable potential to impact both statistics and reliability engineering.

  6. Performance assessment and beamline diagnostics based on evaluation of temporal information from infrared spectral datasets by means of R Environment for statistical analysis.

    Science.gov (United States)

    Banas, Krzysztof; Banas, Agnieszka; Gajda, Mariusz; Kwiatek, Wojciech M; Pawlicki, Bohdan; Breese, Mark B H

    2014-07-15

    Assessment of the performance and up-to-date diagnostics of scientific equipment is one of the key components in contemporary laboratories. The most reliable checks are performed by real test experiments while varying the experimental conditions (typically, in the case of infrared spectroscopic measurements, the size of the beam aperture, the duration of the experiment, the spectral range, the scanner velocity, etc.). On the other hand, the stability of the instrument response over time is another element of great value. Source stability (or easily predictable temporal changes, similar to those observed in the case of synchrotron radiation-based sources working in non-top-up mode) and detector stability (especially in the case of liquid nitrogen- or liquid helium-cooled detectors) should be monitored. In these cases, the recorded datasets (spectra) include additional variables such as the time stamp at which a particular spectrum was recorded (in the case of time trial experiments). A favorable approach to evaluating these data is building a hyperspectral object that consists of all spectra and all additional parameters at which these spectra were recorded. Taking into account that these datasets can be considerably large in size, there is a need for tools for semiautomatic data evaluation and information extraction. The Comprehensive R Archive Network and the open-source R Environment, with their flexibility and growing potential, fit these requirements nicely. In this paper, examples of practical implementation of methods available in R for real-life Fourier transform infrared (FTIR) spectroscopic data problems are presented. This approach could easily be adapted to many laboratory scenarios with other spectroscopic techniques.

  7. Computer intensive statistical methods

    Science.gov (United States)

    Yakowitz, S.

    The special session “Computer-Intensive Statistical Methods” was held in morning and afternoon parts at the 1985 AGU Fall Meeting in San Francisco, Calif. Its mission was to provide a forum for hydrologists and statisticians who are active in bringing unconventional, algorithmic-oriented statistical techniques to bear on problems of hydrology. Statistician Emanuel Parzen (Texas A&M University, College Station, Tex.) opened the session by relating recent developments in quantile estimation methods and showing how properties of such methods can be used to advantage to categorize runoff data previously analyzed by I. Rodriguez-Iturbe (Universidad Simon Bolivar, Caracas, Venezuela). Statistician Eugene Schuster (University of Texas, El Paso) discussed recent developments in nonparametric density estimation which enlarge the framework for convenient incorporation of prior and ancillary information. These extensions were motivated by peak annual flow analysis. Mathematician D. Myers (University of Arizona, Tucson) gave a brief overview of “kriging” and outlined some recently developed methodology.

  8. Statistical Yearbook of Norway 2012

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

    The Statistical Yearbook of Norway 2012 contains statistics on Norway and main figures for the Nordic countries and other countries selected from international statistics. The international overviews are integrated with the other tables and figures. The selection of tables in this edition is mostly the same as in the 2011 edition. The yearbook's 480 tables and figures present the main trends in official statistics in most areas of society. The list of tables and figures and an index at the back of the book provide easy access to relevant information. In addition, source information and Internet addresses below the tables make the yearbook a good starting point for those who are looking for more detailed statistics. The statistics are based on data gathered in statistical surveys and from administrative data, which, in cooperation with other public institutions, have been made available for statistical purposes. Some tables have been prepared in their entirety by other public institutions. The statistics follow approved principles, standards and classifications that are in line with international recommendations and guidelines. Content: 00. General subjects; 01. Environment; 02. Population; 03. Health and social conditions; 04. Education; 05. Personal economy and housing conditions; 06. Labour market; 07. Recreational, cultural and sporting activities; 08. Prices and indices; 09. National Economy and external trade; 10. Industrial activities; 11. Financial markets; 12. Public finances; Geographical survey.(eb)

  9. Measurement and statistics for teachers

    CERN Document Server

    Van Blerkom, Malcolm

    2008-01-01

    Written in a student-friendly style, Measurement and Statistics for Teachers shows teachers how to use measurement and statistics wisely in their classes. Although there is some discussion of theory, emphasis is given to the practical, everyday uses of measurement and statistics. The second part of the text provides more complete coverage of basic descriptive statistics and their use in the classroom than in any text now available.Comprehensive and accessible, Measurement and Statistics for Teachers includes:Short vignettes showing concepts in action Numerous classroom examples Highlighted vocabulary Boxes summarizing related concepts End-of-chapter exercises and problems Six full chapters devoted to the essential topic of Classroom Tests Instruction on how to carry out informal assessments, performance assessments, and portfolio assessments, and how to use and interpret standardized tests A five-chapter section on Descriptive Statistics, giving instructors the option of more thoroughly teaching basic measur...

  10. MQSA National Statistics

    Science.gov (United States)

    ... Standards Act and Program: MQSA National Statistics ... but should level off with time. Archived Scorecard Statistics: 2018, 2017, 2016 ...

  11. Functional and Operatorial Statistics

    CERN Document Server

    Dabo-niang, Sophie

    2008-01-01

    An increasing number of statistical problems and methods involve infinite-dimensional aspects. This is due to the progress of technologies which allow us to store more and more information while modern instruments are able to collect data much more effectively due to their increasingly sophisticated design. This evolution directly concerns statisticians, who have to propose new methodologies while taking into account such high-dimensional data (e.g. continuous processes, functional data, etc.). The numerous applications (micro-arrays, paleo- ecological data, radar waveforms, spectrometric curv

  12. Sampling, Probability Models and Statistical Reasoning: Statistical Inference

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 5. Sampling, Probability Models and Statistical Reasoning Statistical Inference. Mohan Delampady V R Padmawar. General Article Volume 1 Issue 5 May 1996 pp 49-58 ...

  14. Statistics for dental researchers: descriptive statistics

    OpenAIRE

    Mohammad Reza Baneshi PhD; Amir Reza Ghassemi DDS; Arash Shahravan DDS, MS

    2012-01-01

    Descriptive statistics is the process of summarizing raw data gathered from a research study and creating useful statistics, which help the better understanding of the data. According to the types of variables, which consist of qualitative and quantitative variables, some descriptive statistics have been introduced. Frequency percentage is used for qualitative data, and mean, median, mode, standard deviation, standard error, variance, and range are some of the statistics which are used for quantitative data ...
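
    The quantitative summaries listed in this record can be computed directly with Python's standard library; the data values are invented for illustration.

```python
import statistics

# Hypothetical sample: scores of ten patients.
data = [2, 3, 3, 5, 6, 6, 6, 8, 9, 12]

n = len(data)
mean = statistics.mean(data)          # arithmetic mean
median = statistics.median(data)      # middle value of the sorted data
mode = statistics.mode(data)          # most frequent value
sd = statistics.stdev(data)           # sample standard deviation
variance = statistics.variance(data)  # sample variance
std_error = sd / n ** 0.5             # standard error of the mean
data_range = max(data) - min(data)    # range

print(mean, median, mode, sd, std_error, data_range)
```

    Frequency percentages for a qualitative variable could be obtained analogously with `collections.Counter`.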

  15. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  16. National Statistical Commission and Indian Official Statistics*

    Indian Academy of Sciences (India)

    IAS Admin

    existence in July 2006, is mandated, among its functions, to exercise statistical co-ordination between Ministries, Departments and other agencies of the Central government; ... tween the Directorate General of Commercial Intelligence and Statistics ... in some states do not play a nodal role in the coordination of statistical ...

  17. National Statistical Commission and Indian Official Statistics*

    Indian Academy of Sciences (India)

    IAS Admin

Advanced Institute of Maths, Stats and Computer Science, UoH Campus, Hyderabad. His research interests include theory and practice of sample surveys .... other agencies of the Central government; and to exercise statistical audit over the statistical activities to ensure quality and integrity of the statistical products.

  18. Medical facility statistics in Japan.

    Science.gov (United States)

    Hamajima, Nobuyuki; Sugimoto, Takuya; Hasebe, Ryo; Myat Cho, Su; Khaing, Moe; Kariya, Tetsuyoshi; Mon Saw, Yu; Yamamoto, Eiko

    2017-11-01

Medical facility statistics provide essential information to policymakers, administrators, academics, and practitioners in the field of health services. In Japan, the Health Statistics Office of the Director-General for Statistics and Information Policy at the Ministry of Health, Labour and Welfare generates these statistics. Although the statistics are widely available in both Japanese and English, the methodology described in the technical reports is primarily in Japanese and is not fully described in English. This article aimed to describe these processes for readers in the English-speaking world. The Health Statistics Office routinely conducts two surveys, the Hospital Report and the Survey of Medical Institutions. The subjects of the former are all the hospitals and clinics with long-term care beds in Japan. It comprises a Patient Questionnaire, focusing on the numbers of inpatients, admissions, discharges, and outpatients in one month, and an Employee Questionnaire, which asks about the number of employees as of October 1. The Survey of Medical Institutions consists of the Dynamic Survey, which focuses on the opening and closing of facilities every month, and the Static Survey, which focuses on staff, facilities, and services as of October 1, as well as the number of inpatients as of September 30 and the total number of outpatients during September. All hospitals, clinics, and dental clinics are requested to submit the Static Survey questionnaire every three years. These surveys are useful tools for collecting essential information, as well as providing occasions to implicitly inform facilities of the movements of government policy.

  19. Statistics for dental researchers: descriptive statistics

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Baneshi PhD

    2012-09-01

Full Text Available Descriptive statistics is the process of summarizing the raw data gathered in a study and creating useful statistics, which help the better understanding of the data. According to the types of variables, which consist of qualitative and quantitative variables, some descriptive statistics have been introduced. Frequency percentage is used for qualitative data, and mean, median, mode, standard deviation, standard error, variance, and range are some of the statistics used for quantitative data. In health sciences, the majority of continuous variables follow a normal distribution. Skewness and kurtosis are two statistics which help to compare a given distribution with the normal distribution.
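As a hedged illustration (the sample values below are invented), the descriptive statistics this record lists can be computed with nothing more than the standard library; skewness and excess kurtosis are the moment-based versions, both roughly zero for normally distributed data, which is the comparison the record describes:

```python
# Invented sample values; computes the statistics named in the record.
import math
import statistics as st

data = [2.1, 2.4, 2.5, 2.7, 3.0, 3.1, 3.4, 4.0]
n = len(data)

mean = st.mean(data)                  # 2.9
median = st.median(data)              # 2.85
sample_sd = st.stdev(data)            # sample standard deviation
std_error = sample_sd / math.sqrt(n)  # standard error of the mean
value_range = max(data) - min(data)   # max - min

# Moment-based skewness and excess kurtosis (zero for a perfect normal).
m2 = sum((x - mean) ** 2 for x in data) / n
m3 = sum((x - mean) ** 3 for x in data) / n
m4 = sum((x - mean) ** 4 for x in data) / n
skewness = m3 / m2 ** 1.5
excess_kurtosis = m4 / m2 ** 2 - 3

print(mean, median, sample_sd, std_error, value_range, skewness, excess_kurtosis)
```

Here the single large value 4.0 pulls the mean above the median and gives a positive skewness, the usual signature of a right-skewed sample.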

  20. Algebraic statistics computational commutative algebra in statistics

    CERN Document Server

    Pistone, Giovanni; Wynn, Henry P

    2000-01-01

Written by pioneers in this exciting new field, Algebraic Statistics introduces the application of polynomial algebra to experimental design, discrete probability, and statistics. It begins with an introduction to Gröbner bases and a thorough description of their applications to experimental design. A special chapter covers the binary case with new application to coherent systems in reliability and two level factorial designs. The work paves the way, in the last two chapters, for the application of computer algebra to discrete probability and statistical modelling through the important concept of an algebraic statistical model. As the first book on the subject, Algebraic Statistics presents many opportunities for spin-off research and applications and should become a landmark work welcomed by both the statistical community and its relatives in mathematics and computer science.

  1. Adrenal Gland Tumors: Statistics

    Science.gov (United States)

Adrenal Gland Tumor: Statistics. Approved by the Cancer.Net Editorial Board, 03/ ... A primary adrenal gland tumor is very uncommon. Exact statistics are not available for this type of tumor ...

  2. Neuroendocrine Tumor: Statistics

    Science.gov (United States)

Neuroendocrine Tumor: Statistics. Approved by the Cancer.Net Editorial Board, 11/ ... the body. It is important to remember that statistics on the survival rates for people with a ...

  3. State transportation statistics 2009

    Science.gov (United States)

    2009-01-01

The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2009, a statistical profile of transportation in the 50 states and the District of Col...

  4. BTS statistical standards manual

    Science.gov (United States)

    2005-10-01

    The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...

  5. The foundations of statistics

    CERN Document Server

    Savage, Leonard J

    1972-01-01

    Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.

  6. Childhood Cancer Statistics

    Science.gov (United States)

Childhood Cancer Statistics – Graphs and Infographics: Number of Diagnoses, Incidence Rates Over Time, Cancer Deaths Per Year, 5-Year Survival Rate. Childhood Cancer Statistics – Important Facts: Each year, the ...

  7. Swiss energy statistics 2003

    International Nuclear Information System (INIS)

    2004-01-01

    This comprehensive report by the Swiss Federal Office of Energy (SFOE) presents statistics on energy production and consumption in Switzerland in 2003. Facts and figures are presented in tables and diagrams. First of all, a general overview of Swiss energy consumption is presented that includes details on the shares taken by the various energy carriers involved and their development during the period reviewed. The report also includes graphical representations of energy usage in various sectors such as households, trade and industry, transport and the services sector. Also, economic data on energy consumption is presented. A second chapter takes a look at energy flows from producers to consumers and presents an energy balance for Switzerland in the form of tables and an energy-flow diagram. The individual energy sources and the import, export and storage of energy carriers are discussed as is the conversion between various forms and categories of energy. Details on the consumption of energy, its growth over the years up to 2003 and energy use in various sectors are presented. Also, the Swiss energy balance with reference to the use of renewable forms of energy such as solar energy, biomass, wastes and ambient heat is discussed and figures are presented on the contribution of renewables to heating and the generation of electrical power. The third chapter provides data on the individual energy carriers and the final chapter looks at economical and ecological aspects. An appendix provides information on the methodology used in collecting the statistics and on data available in the Swiss cantons

  8. Swiss energy statistics 2002

    International Nuclear Information System (INIS)

    Swiss Federal Office of Energy, Berne

    2003-01-01

    This comprehensive report by the Swiss Federal Office of Energy (SFOE) presents statistics on energy production and consumption in Switzerland in 2002. Facts and figures are presented in tables and diagrams. First of all, a general overview of Swiss energy consumption is presented that includes details on the shares taken by the various energy carriers involved and their development during the period reviewed. The report also includes graphical representations of energy usage in various sectors such as households, trade and industry, transport and the services sector. Also, economic data on energy consumption is presented. A second chapter takes a look at energy flows from producers to consumers and presents an energy balance for Switzerland in the form of tables and an energy-flow diagram. The individual energy sources and the import, export and storage of energy carriers are discussed as is the conversion between various forms and categories of energy. Details on the consumption of energy, its growth over the years up to 2002 and energy use in various sectors are presented. Also, the Swiss energy balance with reference to the use of renewable forms of energy such as solar energy, biomass, wastes and ambient heat is discussed and figures are presented on the contribution of renewables to heating and the generation of electrical power. The third chapter provides data on the individual energy carriers and the final chapter looks at economical and ecological aspects. An appendix provides information on the methodology used in collecting the statistics and on data available in the Swiss cantons

  9. International petroleum statistics report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-05-01

    The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. Publication of this report is in keeping with responsibilities given the Energy Information Administration in Public Law 95-91. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.

  10. Statistical physics of vaccination

    Science.gov (United States)

    Wang, Zhen; Bauch, Chris T.; Bhattacharyya, Samit; d'Onofrio, Alberto; Manfredi, Piero; Perc, Matjaž; Perra, Nicola; Salathé, Marcel; Zhao, Dawei

    2016-12-01

Historically, infectious diseases caused considerable damage to human societies, and they continue to do so today. To help reduce their impact, mathematical models of disease transmission have been studied to help understand disease dynamics and inform prevention strategies. Vaccination, one of the most important preventive measures of modern times, is of great interest both theoretically and empirically. And in contrast to traditional approaches, recent research increasingly explores the pivotal implications of individual behavior and heterogeneous contact patterns in populations. Our report reviews the developmental arc of theoretical epidemiology with emphasis on vaccination, as it led from classical models assuming homogeneously mixing (mean-field) populations and ignoring human behavior, to recent models that account for behavioral feedback and/or population spatial/social structure. Many of the methods used originated in statistical physics, such as lattice and network models, and their associated analytical frameworks. Similarly, the feedback loop between vaccinating behavior and disease propagation forms a coupled nonlinear system with analogs in physics. We also review the new paradigm of digital epidemiology, wherein sources of digital data such as online social media are mined for high-resolution information on epidemiologically relevant individual behavior. Armed with the tools and concepts of statistical physics, and further assisted by new sources of digital data, models that capture nonlinear interactions between behavior and disease dynamics offer a novel way of modeling real-world phenomena, and can help improve health outcomes. We conclude the review by discussing open problems in the field and promising directions for future research.
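The classical mean-field baseline that the review takes as its starting point can be sketched in a few lines: an SIR model with a constant vaccination rate, integrated with explicit Euler steps. All parameter values below are illustrative, not taken from the paper:

```python
# Mean-field SIR with vaccination at rate v; illustrative parameters only.
beta, gamma, v = 0.3, 0.1, 0.01   # transmission, recovery, vaccination rates
S, I, R = 0.99, 0.01, 0.0         # initial susceptible/infected/removed fractions
dt, steps = 0.1, 2000

for _ in range(steps):
    new_inf = beta * S * I        # force of infection under homogeneous mixing
    dS = -new_inf - v * S         # susceptibles lost to infection or vaccination
    dI = new_inf - gamma * I      # new cases minus recoveries
    dR = gamma * I + v * S        # recovered plus vaccinated
    S, I, R = S + dS * dt, I + dI * dt, R + dR * dt

print(S, I, R)                    # the three fractions still sum to 1
```

The behavioral-feedback models the review surveys replace the constant rate v with a function of perceived risk, turning this linear vaccination term into the coupled nonlinear system described above.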

  11. Generalized quantum statistics

    International Nuclear Information System (INIS)

    Chou, C.

    1992-01-01

    In the paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics
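As a point of reference (a standard construction known as Gentile statistics, in the same spirit as the paper's Fock-space formulation but not necessarily identical to it), capping the occupation of a single mode at p interpolates between the two familiar cases:

```latex
% Single-mode grand partition function with occupation n = 0, 1, ..., p
Z_p = \sum_{n=0}^{p} e^{-n\beta(\epsilon-\mu)}, \qquad
\bar{n}_p = \frac{1}{\beta}\,\frac{\partial \ln Z_p}{\partial \mu}
% p = 1 recovers Fermi-Dirac; p -> infinity recovers Bose-Einstein:
\bar{n}_1 = \frac{1}{e^{\beta(\epsilon-\mu)} + 1}, \qquad
\lim_{p \to \infty} \bar{n}_p = \frac{1}{e^{\beta(\epsilon-\mu)} - 1}
```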

  12. Statistical Literacy in the Data Science Workplace

    Science.gov (United States)

    Grant, Robert

    2017-01-01

Statistical literacy, the ability to understand and make use of statistical information including methods, has particular relevance in the age of data science, when complex analyses are undertaken by teams from diverse backgrounds. It is essential not only for communicating with the consumers of information but also within the team. Writing from the…

  13. Arizona Public Library Statistics, 2000-2001.

    Science.gov (United States)

    Elliott, Jan, Comp.

    These statistics were compiled from information supplied by Arizona's public libraries. The document is divided according to the following county groups: Apache, Cochise; Coconino, Gila; Graham, Greenlee, La Paz; Maricopa; Mohave, Navajo; Pima, Pinal; Santa Cruz, Yavapai; and Yuma. Statistics are presented on the following: general information;…

  14. Arizona Public Library Statistics, 1999-2000.

    Science.gov (United States)

    Arizona State Dept. of Library, Archives and Public Records, Phoenix.

    These statistics were compiled from information supplied by Arizona's public libraries. The document is divided according to the following county groups: Apache, Cochise; Coconino, Gila; Graham, Greenlee, La Paz; Maricopa; Mohave, Navajo; Pima, Pinal; Santa Cruz, Yavapai; Yuma. Statistics are presented on the following: general information;…

  15. 2011 statistical abstract of the United States

    Science.gov (United States)

    Krisanda, Joseph M.

    2011-01-01

The Statistical Abstract of the United States, published since 1878, is the authoritative and comprehensive summary of statistics on the social, political, and economic organization of the United States. Use the Abstract as a convenient volume for statistical reference, and as a guide to sources of more information both in print and on the Web. Sources of data include the Census Bureau, Bureau of Labor Statistics, Bureau of Economic Analysis, and many other Federal agencies and private organizations.

  16. Design research in statistics education : on symbolizing and computer tools

    NARCIS (Netherlands)

    Bakker, A.

    2004-01-01

    The present knowledge society requires statistical literacy-the ability to interpret, critically evaluate, and communicate about statistical information and messages (Gal, 2002). However, research shows that students generally do not gain satisfactory statistical understanding. The research

  17. Industrial statistics with Minitab

    CERN Document Server

    Cintas, Pere Grima; Llabres, Xavier Tort-Martorell

    2012-01-01

Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB provides comprehensive, user-friendly practical guidance on the essential statistical methods applied in industry. Explores

  18. Statistics For Dummies

    CERN Document Server

    Rumsey, Deborah

    2011-01-01

The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou

  19. Swiss electricity statistics 2003

    International Nuclear Information System (INIS)

    2004-01-01

This publication by the Swiss Federal Office of Energy (SFOE) provides statistical information on electricity supply, production, trading and consumption in Switzerland in 2003. Apart from a general overview of the Swiss electricity supply that includes details on power generation, energy transfer with neighbouring countries and data on prices, average consumption and capital investment, the publication also includes graphical representations of electrical energy flows in and out of Switzerland. Tables of data give information on electricity production, import and export for the years 1950 to 2003, the data being supplied for each hydrological year and the summer and winter seasons respectively. The structure of power production in Switzerland is examined in detail and compared with that of foreign countries. Details are given on the development of production capacities and the various means of production together with their respective shares of total production. Further tables and diagrams provide information on power production in various geographical regions and on the management of pumped storage hydro-electricity schemes. A further chapter deals in detail with the consumption of electricity, its growth between 1984 and 2003 and its use in various sectors. A fifth chapter examines electricity consumption, generation, import and export on single, typical days, presenting data in tables and diagrams. The next chapter examines energy transfer with foreign countries and the trading structures involved. The next two chapters cover the future developments in energy exchange and trading with foreign countries and the possibilities of augmenting power generation capacities up to 2010. The final chapter looks at economic considerations involved in the supply of electricity. An annex provides detailed tables of data

  20. Swiss electricity statistics 2002

    International Nuclear Information System (INIS)

    2003-01-01

This publication by the Swiss Federal Office of Energy (SFOE) provides statistical information on electricity supply, production, trading and consumption in Switzerland in 2002. Apart from a general overview of the Swiss electricity supply that includes details on power generation, energy transfer with neighbouring countries and data on prices, average consumption and capital investment, the publication also includes graphical representations of electrical energy flows in and out of Switzerland. Tables of data give information on electricity production, import and export for the years 1950 to 2002, the data being supplied for each hydrological year and the summer and winter seasons respectively. The structure of power production in Switzerland is examined in detail and compared with that of foreign countries. Details are given on the development of production capacities and the various means of production together with their respective shares of total production. Further tables and diagrams provide information on power production in various geographical regions and on the management of pumped storage hydro-electricity schemes. A further chapter deals in detail with the consumption of electricity, its growth between 1984 and 2002 and its use in various sectors. A fifth chapter examines electricity consumption, generation, import and export on single, typical days, presenting data in tables and diagrams. The next chapter examines energy transfer with foreign countries and the trading structures involved. The next two chapters cover the future developments in energy exchange and trading with foreign countries and the possibilities of augmenting power generation capacities up to 2009. The final chapter looks at economic considerations involved in the supply of electricity. An annex provides detailed tables of data

  1. Equivalent statistics and data interpretation.

    Science.gov (United States)

    Francis, Gregory

    2017-08-01

    Recent reform efforts in psychological science have led to a plethora of choices for scientists to analyze their data. A scientist making an inference about their data must now decide whether to report a p value, summarize the data with a standardized effect size and its confidence interval, report a Bayes Factor, or use other model comparison methods. To make good choices among these options, it is necessary for researchers to understand the characteristics of the various statistics used by the different analysis frameworks. Toward that end, this paper makes two contributions. First, it shows that for the case of a two-sample t test with known sample sizes, many different summary statistics are mathematically equivalent in the sense that they are based on the very same information in the data set. When the sample sizes are known, the p value provides as much information about a data set as the confidence interval of Cohen's d or a JZS Bayes factor. Second, this equivalence means that different analysis methods differ only in their interpretation of the empirical data. At first glance, it might seem that mathematical equivalence of the statistics suggests that it does not matter much which statistic is reported, but the opposite is true because the appropriateness of a reported statistic is relative to the inference it promotes. Accordingly, scientists should choose an analysis method appropriate for their scientific investigation. A direct comparison of the different inferential frameworks provides some guidance for scientists to make good choices and improve scientific practice.
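The equivalence claimed for the two-sample t test can be checked directly. The data below are invented for illustration: with the group sizes known, the pooled Cohen's d and the t statistic are deterministic transforms of one another, so neither summary carries more information than the other:

```python
# Invented two-group data; shows d and t determine each other given n1, n2.
import math

g1 = [5.1, 4.9, 5.6, 5.8, 5.2, 5.5]
g2 = [4.4, 4.7, 4.1, 4.9, 4.3, 4.6]
n1, n2 = len(g1), len(g2)

def mean(xs):
    return sum(xs) / len(xs)

def ss(xs):  # sum of squared deviations from the mean
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs)

sp = math.sqrt((ss(g1) + ss(g2)) / (n1 + n2 - 2))          # pooled SD
d = (mean(g1) - mean(g2)) / sp                             # Cohen's d
t = (mean(g1) - mean(g2)) / (sp * math.sqrt(1/n1 + 1/n2))  # t statistic

d_from_t = t * math.sqrt(1/n1 + 1/n2)  # d recovered from t and the n's alone
print(d, t, d_from_t)
```

The same invertible mapping (with the sample sizes) links t to the p value and to a JZS Bayes factor, which is the sense in which the paper calls these statistics equivalent.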

  2. Statistical Methods in Translational Medicine

    Directory of Open Access Journals (Sweden)

    Shein-Chung Chow

    2008-12-01

Full Text Available This study focuses on strategies and statistical considerations for assessment of translation in language (e.g. translation of case report forms in multinational clinical trials), information (e.g. translation of basic discoveries to the clinic) and technology (e.g. translation of Chinese diagnostic techniques to well-established clinical study endpoints) in pharmaceutical/clinical research and development. However, most of our efforts will be directed to statistical considerations for translation in information. Translational medicine has been defined as bench-to-bedside research, where a basic laboratory discovery becomes applicable to the diagnosis, treatment or prevention of a specific disease, and is brought forth by either a physician-scientist who works at the interface between the research laboratory and patient care, or by a team of basic and clinical science investigators. Statistics plays an important role in translational medicine to ensure that the translational process is accurate and reliable with certain statistical assurance. Statistical inference for the applicability of an animal model to a human model is also discussed. Strategies for selection of clinical study endpoints (e.g. absolute changes, relative changes, or responder-defined, based on either absolute or relative change) are reviewed.
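The endpoint options named at the end of the abstract can be made concrete with a small hypothetical example (the measurements and the 20% relative-improvement responder cutoff are invented for illustration):

```python
# Hypothetical baseline and post-treatment measurements for five subjects.
baseline = [140, 155, 160, 148, 170]
post     = [120, 150, 118, 140, 132]

# Three endpoint definitions for the same data:
absolute = [b - p for b, p in zip(baseline, post)]         # absolute change
relative = [(b - p) / b for b, p in zip(baseline, post)]   # relative change
responders = [r >= 0.20 for r in relative]                 # responder-defined
responder_rate = sum(responders) / len(responders)

print(absolute, [round(r, 3) for r in relative], responder_rate)
```

The same raw data yield quite different summaries under the three definitions, which is why the choice of endpoint is itself a statistical decision.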

  3. Procedimiento para determinar las Tendencias Estadísticas del Desarrollo de la Competencia Investigativa del Ingeniero en Ciencias Informáticas (Procedure to Determinate the Statistical Trends of Development of Research Competence in Comp. Sci. Engineers

    Directory of Open Access Journals (Sweden)

    Odiel Estrada Molina

    2014-05-01

Full Text Available At the University of Informatics Sciences, Cuba, some students in their fourth academic year of the computer science engineering degree join the Software Development Centers and become part of the development team of a real production project. The tasks assigned to them are set and assessed through the same Project Management System (GESPRO) used by the software project, but this system has didactic limitations: its task-assignment module lacks elements that would allow the tutor or specialists (the professionals supervising the student) to evaluate students against the indicators that make up research competence, and it provides no view of the statistical trends of student learning over a given time interval. Because of these limitations, the authors set out to develop an application that guides tutors in evaluating research competence and that can be integrated into the university's Project Management System (GESPRO), determining the current statistical trends of student learning around the development of research competence so that timely decisions can be made. To build the software, a procedure based on time series was devised to determine the statistical trends of student learning as a function of the development of the research competence associated with software development; that procedure is the result presented in this article.

  4. National transportation statistics 2010

    Science.gov (United States)

    2010-01-01

    National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...

  5. Blood Facts and Statistics

    Science.gov (United States)

Blood Facts and Statistics: facts about blood needs and facts about American Red Cross Blood Services. Every two seconds someone in the U.S. ...

  6. CMS Program Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...

  7. Statistics for Finance

    DEFF Research Database (Denmark)

    Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard

    Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...

  8. Developments in Statistical Education.

    Science.gov (United States)

    Kapadia, Ramesh

    1980-01-01

    The current status of statistics education at the secondary level is reviewed, with particular attention focused on the various instructional programs in England. A description and preliminary evaluation of the Schools Council Project on Statistical Education is included. (MP)

  9. Principles of applied statistics

    National Research Council Canada - National Science Library

    Cox, D. R; Donnelly, Christl A

    2011-01-01

    .... David Cox and Christl Donnelly distil decades of scientific experience into usable principles for the successful application of statistics, showing how good statistical strategy shapes every stage of an investigation...

  10. Statistical data analysis handbook

    National Research Council Canada - National Science Library

    Wall, Francis J

    1986-01-01

It must be emphasized that this is not a textbook on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...

  11. Applying contemporary statistical techniques

    CERN Document Server

    Wilcox, Rand R

    2003-01-01

Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible. * Assumes no previous training in statistics. * Explains how and why modern statistical methods provide more accurate results than conventional methods. * Covers the latest developments on multiple comparisons. * Includes recent advanc

  12. Conversion factors and oil statistics

    International Nuclear Information System (INIS)

    Karbuz, Sohbet

    2004-01-01

World oil statistics, in scope and accuracy, are often far from perfect. They can easily lead to misguided conclusions regarding the state of market fundamentals. Without proper attention directed at statistical caveats, the ensuing interpretation of oil market data opens the door to unnecessary volatility, and can distort perception of market fundamentals. Among the numerous caveats associated with the compilation of oil statistics, conversion factors, used to produce aggregated data, play a significant role. Interestingly enough, little attention is paid to conversion factors, i.e. to the relation between different units of measurement for oil. Additionally, the underlying information regarding the choice of a specific factor when trying to produce measurements of aggregated data remains scant. The aim of this paper is to shed some light on the impact of conversion factors for two commonly encountered issues: mass-to-volume equivalencies (barrels to tonnes) and broad energy measures encountered in world oil statistics. This paper will seek to demonstrate how inappropriate and misused conversion factors can yield wildly varying results and ultimately distort oil statistics. Examples will show that while discrepancies in commonly used conversion factors may seem trivial, their impact on the assessment of a world oil balance is far from negligible. A unified and harmonised convention for conversion factors is necessary to achieve accurate comparisons and aggregate oil statistics for the benefit of both end-users and policy makers.
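A hedged numerical illustration of the paper's point: the volume factor 0.158987 m³ per barrel is standard, while the two densities below are merely plausible values for a lighter and a heavier crude, so the tonnages are illustrative rather than taken from the paper.

```python
# Same volume of oil, two plausible density assumptions, different tonnages.
barrels = 1_000_000

for density in (0.82, 0.90):                 # tonnes per cubic metre
    tonnes = barrels * 0.158987 * density    # bbl -> m3 -> tonnes
    print(f"density {density}: {round(tonnes):,} tonnes")
```

The roughly ten-percent spread between the two results comes entirely from the assumed conversion factor, which is exactly the kind of discrepancy the paper argues is far from negligible once aggregated into a world oil balance.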

  13. Ethics in Statistics

    Science.gov (United States)

    Lenard, Christopher; McCarthy, Sally; Mills, Terence

    2014-01-01

    There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…

  14. Fisher's Contributions to Statistics

    Indian Academy of Sciences (India)

    T Krishnan received his Ph.D. from the Indian Statistical Institute. He joined the faculty of ISI in 1965 and has been with the Institute ever since. He is at present a professor in the Applied Statistics, Surveys and Computing Division of the Institute. Krishnan's research interests are in Statistical Pattern Recognition ...

  15. Fermi–Dirac Statistics

    Indian Academy of Sciences (India)

    IAS Admin

    Dirac statistics, identical and indistinguishable particles, Fermi gas. ... They obey Fermi–Dirac statistics. In contrast, those with integer spin such as photons, mesons, 7Li atoms are called bosons and they obey Bose–Einstein statistics. ... hypothesis (which later was extended as the third law of thermodynamics) was ...
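As a hedged illustration (not part of the record itself), the mean occupation numbers behind the two statistics named in the excerpt can be written down directly:

```python
import math

def fermi_dirac(E, mu, kT):
    """Mean occupation of a single-particle state at energy E for fermions."""
    return 1.0 / (math.exp((E - mu) / kT) + 1.0)

def bose_einstein(E, mu, kT):
    """Mean occupation for bosons; defined for E > mu."""
    return 1.0 / (math.exp((E - mu) / kT) - 1.0)

# At E == mu the Fermi-Dirac occupation is exactly 1/2 at any temperature.
print(fermi_dirac(1.0, 1.0, 0.1))   # 0.5
```

The Pauli exclusion principle shows up in the `+ 1` of the fermion formula, which caps the occupation at one particle per state; the boson formula's `- 1` allows unbounded occupation as E approaches mu.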

  16. Research and development statistics 2001

    CERN Document Server

    2002-01-01

    This publication provides recent basic statistics on the resources devoted to R&D in OECD countries. The statistical series are presented for the last seven years for which data are available and cover expenditure by source of funds and type of costs; personnel by occupation and/or level of qualification; both at the national level by performance sector, for enterprises by industry, and for higher education by field of science. The publication also provides information on the output of science and technology (S&T) activities relating to the technology balance of payments.

  17. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2010-01-01

    Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente

  18. Statistics & probability for dummies

    CERN Document Server

    Rumsey, Deborah J

    2013-01-01

    Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition  Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra

  19. Statistics for Research

    CERN Document Server

    Dowdy, Shirley; Chilko, Daniel

    2011-01-01

    Praise for the Second Edition "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."-The UMAP Journal Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages has made it possible f

  20. Statistics in a nutshell

    CERN Document Server

    Boslaugh, Sarah

    2013-01-01

    Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.

  1. Head First Statistics

    CERN Document Server

    Griffiths, Dawn

    2009-01-01

    Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics

  2. Statistics for economics

    CERN Document Server

    Naghshpour, Shahdad

    2012-01-01

    Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...

  3. Statistical inference based on divergence measures

    CERN Document Server

    Pardo, Leandro

    2005-01-01

    The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach. Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data, with emphasis on problems in contingency tables and loglinear models, using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, prese...

  4. Permutation statistical methods an integrated approach

    CERN Document Server

    Berry, Kenneth J; Johnston, Janis E

    2016-01-01

    This research monograph provides a synthesis of a number of statistical tests and measures, which, at first consideration, appear disjoint and unrelated. Numerous comparisons of permutation and classical statistical methods are presented, and the two methods are compared via probability values and, where appropriate, measures of effect size. Permutation statistical methods, compared to classical statistical methods, do not rely on theoretical distributions, avoid the usual assumptions of normality and homogeneity of variance, and depend only on the data at hand. This text takes a unique approach to explaining statistics by integrating a large variety of statistical methods, and establishing the rigor of a topic that to many may seem to be a nascent field in statistics. This topic is new in that it took modern computing power to make permutation methods available to people working in the mainstream of research. This research monograph addresses a statistically-informed audience, and can also easily serve as a ...
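A hedged sketch of the idea the monograph describes: a permutation test depends only on the data at hand and makes no normality or homogeneity assumption. The data below are hypothetical, and the function is a minimal illustration, not the book's own implementation.

```python
import random

def permutation_test(x, y, n_perm=10_000, seed=0):
    """Two-sample permutation test on the difference of means.

    Returns a two-sided probability value computed purely by
    reshuffling group labels -- no theoretical distribution assumed."""
    rng = random.Random(seed)
    observed = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = list(x) + list(y)
    n = len(x)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                       # random relabelling
        diff = abs(sum(pooled[:n]) / n - sum(pooled[n:]) / len(y))
        if diff >= observed:
            count += 1
    return count / n_perm

group_a = [12.1, 14.3, 13.5, 15.0, 12.8]          # hypothetical measurements
group_b = [10.2, 11.1, 10.8, 11.9, 10.5]
p_value = permutation_test(group_a, group_b)
print(p_value)
```

Because the two groups do not overlap, the observed difference is extreme among relabellings and the resulting probability value is small.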

  5. National Statistical Commission and Indian Official Statistics

    Indian Academy of Sciences (India)

    T J Rao, C. R. Rao Advanced Institute of Mathematics, Statistics and Computer Science (AIMSCS), University of Hyderabad Campus, Central University Post Office, Prof. C. R. Rao Road, Hyderabad 500 046, AP, India. Resonance – Journal of Science Education.

  6. Waste statistics 2001

    International Nuclear Information System (INIS)

    2004-01-01

    Reports to the ISAG (Information System for Waste and Recycling) for 2001 cover 402 Danish waste treatment plants owned by 295 enterprises. The total waste generation in 2001 amounted to 12,768,000 tonnes, which is 2% less than in 2000. Reductions are primarily due to the fact that sludge for mineralization is included with a dry matter content of 20%, compared to 1.5% in previous statistics. This means that sludge amounts have been reduced by 808,886 tonnes. The overall rate of recycling amounted to 63%, which is 1% less than the overall recycling target of 64% for 2004. Since sludge has a high recycling rate, the reduction in sludge amounts of 808,886 tonnes has also caused the total recycling rate to fall. Waste amounts incinerated accounted for 25%, which is 1% more than the overall target of 24% for incineration in 2004. Waste going to landfill amounted to 10%, which is better than the overall landfill target for 2004 of a maximum of 12%. Targets for treatment of waste from the different sectors, however, are still not complied with, since too little waste from households and the service sector is recycled, and too much waste from industry is sent to landfill. (BA)

  7. Energy statistics manual

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    The Manual is written in a question-and-answer format. The points developed are introduced with a basic question, such as: What do people mean by 'fuels' and 'energy'? What units are used to express oil? How are energy data presented? Answers are given in simple terms and illustrated by graphs, charts and tables. More technical explanations are found in the annexes. The Manual contains seven chapters. The first one presents the fundamentals of energy statistics, five chapters deal with the five different fuels (electricity and heat; natural gas; oil; solid fuels and manufactured gases; renewables and waste) and the last chapter explains the energy balance. Three technical annexes and a glossary are also included. For the five chapters dedicated to the fuels, there are three levels of reading: the first one contains general information on the subject, the second one reviews issues which are specific to the joint IEA/OECD-Eurostat-UNECE questionnaires and the third one focuses on the essential elements of the subject. 43 figs., 22 tabs., 3 annexes.

  8. Waste statistics 2001

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    Reports to the ISAG (Information System for Waste and Recycling) for 2001 cover 402 Danish waste treatment plants owned by 295 enterprises. The total waste generation in 2001 amounted to 12,768,000 tonnes, which is 2% less than in 2000. Reductions are primarily due to the fact that sludge for mineralization is included with a dry matter content of 20%, compared to 1.5% in previous statistics. This means that sludge amounts have been reduced by 808,886 tonnes. The overall rate of recycling amounted to 63%, which is 1% less than the overall recycling target of 64% for 2004. Since sludge has a high recycling rate, the reduction in sludge amounts of 808,886 tonnes has also caused the total recycling rate to fall. Waste amounts incinerated accounted for 25%, which is 1% more than the overall target of 24% for incineration in 2004. Waste going to landfill amounted to 10%, which is better than the overall landfill target for 2004 of a maximum of 12%. Targets for treatment of waste from the different sectors, however, are still not complied with, since too little waste from households and the service sector is recycled, and too much waste from industry is sent to landfill. (BA)

  9. Energy Statistics Manual; Handbuch Energiestatistik

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at a country level as well as at an international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market of oil -- which is the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed and by whom. In view of the role and importance of energy in world development, one would expect basic energy information to be readily available and reliable. This is not always the case, and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.

  10. Statistics in Social Context: Using Issues of Lesbians, Gays, Bisexuals and Transsexuals in Teaching Statistics.

    Science.gov (United States)

    Kellermeier, John

    2002-01-01

    Describes a "criticalmathematics" approach to teaching statistics that uses gay, lesbian, bisexual, and transsexual issues and statistics about them as teaching examples. The inclusion of these social issues gives the analysis of statistical data more relevance and encourages students to question, understand, and confront information more…

  11. Statistical Physics An Introduction

    CERN Document Server

    Yoshioka, Daijiro

    2007-01-01

    This book provides a comprehensive presentation of the basics of statistical physics. The first part explains the essence of statistical physics and how it provides a bridge between microscopic and macroscopic phenomena, allowing one to derive quantities such as entropy. Here the author avoids going into details such as Liouville’s theorem or the ergodic theorem, which are difficult for beginners and unnecessary for the actual application of statistical mechanics. In the second part, statistical mechanics is applied to various systems which, although they look different, share the same mathematical structure. In this way readers can deepen their understanding of statistical physics. The book also features applications to quantum dynamics, thermodynamics, the Ising model and the statistical dynamics of free spins.

  12. The statistical stability phenomenon

    CERN Document Server

    Gorban, Igor I

    2017-01-01

    This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...

  13. Applied statistics for economists

    CERN Document Server

    Lewis, Margaret

    2012-01-01

    This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.

  14. Introduction to Statistics

    Directory of Open Access Journals (Sweden)

    Mirjam Nielen

    2017-01-01

    Always wondered why research papers often present rather complicated statistical analyses? Or wondered how to properly analyse the results of a pragmatic trial from your own practice? This talk gives an overview of basic statistical principles and focuses on the why of statistics, rather than on the how. This is a podcast of Mirjam's talk at the Veterinary Evidence Today conference, Edinburgh, November 2, 2016.

  15. Equilibrium statistical mechanics

    CERN Document Server

    Jackson, E Atlee

    2000-01-01

    Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t

  16. Optimization techniques in statistics

    CERN Document Server

    Rustagi, Jagdish S

    1994-01-01

    Statistics help guide us to optimal decisions under uncertainty. A large variety of statistical problems are essentially solutions to optimization problems. The mathematical techniques of optimization are fundamental to statistical theory and practice. In this book, Jagdish Rustagi provides full-spectrum coverage of these methods, ranging from classical optimization and Lagrange multipliers, to numerical techniques using gradients or direct search, to linear, nonlinear, and dynamic programming using the Kuhn-Tucker conditions or the Pontryagin maximal principle. Variational methods and optimiza
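A minimal sketch (an assumed example, not taken from the book) of the theme that statistical estimators arise as solutions to optimization problems: minimizing a sum of squared deviations by gradient descent recovers the sample mean.

```python
# Estimating a location parameter by minimizing sum((x - theta)^2)
# with plain gradient descent. The minimizer is the sample mean.

data = [2.0, 4.0, 6.0, 8.0]   # hypothetical observations

def gradient(theta):
    # d/dtheta of sum((x - theta)^2) is -2 * sum(x - theta)
    return -2.0 * sum(x - theta for x in data)

theta = 0.0                    # arbitrary starting point
lr = 0.01                      # step size
for _ in range(1000):
    theta -= lr * gradient(theta)

print(round(theta, 6))         # converges to the sample mean, 5.0
```

The same pattern -- write down an objective, then descend its gradient -- underlies numerical maximum-likelihood and least-squares estimation when no closed form is available.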

  17. Equilibrium statistical mechanics

    CERN Document Server

    Mayer, J E

    1968-01-01

    The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanical is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t

  18. Fundamentals of statistical signal processing

    CERN Document Server

    Kay, Steven M

    1993-01-01

    A unified presentation of parameter estimation for those involved in the design and implementation of statistical signal processing algorithms. Covers important approaches to obtaining an optimal estimator and analyzing its performance; and includes numerous examples as well as applications to real- world problems. MARKETS: For practicing engineers and scientists who design and analyze signal processing systems, i.e., to extract information from noisy signals — radar engineer, sonar engineer, geophysicist, oceanographer, biomedical engineer, communications engineer, economist, statistician, physicist, etc.

  19. Annual Statistical Supplement, 2001

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2001 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  20. Annual Statistical Supplement, 2011

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2011 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  1. Annual Statistical Supplement, 2003

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2003 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  2. Annual Statistical Supplement, 2015

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2015 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  3. Annual Statistical Supplement, 2000

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2000 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  4. Annual Statistical Supplement, 2005

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2005 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  5. Annual Statistical Supplement, 2014

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2014 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  6. Annual Statistical Supplement, 2009

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2009 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  7. Annual Statistical Supplement, 2017

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2017 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  8. Annual Statistical Supplement, 2008

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2008 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  9. Annual Statistical Supplement, 2010

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2010 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  10. Annual Statistical Supplement, 2016

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2016 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  11. Annual Statistical Supplement, 2004

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2004 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  12. Annual Statistical Supplement, 2002

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2002 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  13. Annual Statistical Supplement, 2007

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2007 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  14. Annual Statistical Supplement, 2006

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2006 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  15. Principles of statistics

    CERN Document Server

    Bulmer, M G

    1979-01-01

    There are many textbooks which describe current methods of statistical analysis, while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again fo

  16. Fundamental statistical theories

    International Nuclear Information System (INIS)

    Demopoulos, W.

    1976-01-01

    Einstein argued that since quantum mechanics is not a fundamental theory it cannot be regarded as in any sense final. The pure statistical states of the quantum theory are not dispersion-free. In this sense, the theory is significantly statistical. The problem investigated in this paper is to determine under what conditions a significantly statistical theory is correctly regarded as fundamental. The solution developed in this paper is that a statistical theory is fundamental only if it is complete; moreover, the quantum theory is complete. (B.R.H.)

  17. Plague Maps and Statistics

    Science.gov (United States)

    Plague in the United States ...

  18. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic

  19. Statistics in a Nutshell

    CERN Document Server

    Boslaugh, Sarah

    2008-01-01

    Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat

  20. Statistics is Easy

    CERN Document Server

    Shasha, Dennis

    2010-01-01

    Statistics is the activity of inferring results about a population given a sample. Historically, statistics books assume an underlying distribution to the data (typically, the normal distribution) and derive results under that assumption. Unfortunately, in real life, one cannot normally be sure of the underlying distribution. For that reason, this book presents a distribution-independent approach to statistics based on a simple computational counting idea called resampling. This book explains the basic concepts of resampling, then systematically presents the standard statistical measures along
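A hedged sketch of the resampling idea the book is built on: a percentile bootstrap confidence interval computed purely by counting, with no assumed distribution. The data and function are illustrative, not the book's own code.

```python
import random

def bootstrap_ci(sample, n_boot=10_000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for the mean.

    Repeatedly resamples the data with replacement and reads the
    interval off the sorted resampled means -- no normality assumed."""
    rng = random.Random(seed)
    n = len(sample)
    means = sorted(
        sum(rng.choice(sample) for _ in range(n)) / n
        for _ in range(n_boot)
    )
    lo = means[int((alpha / 2) * n_boot)]
    hi = means[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

data = [3.2, 4.1, 5.6, 2.9, 4.8, 5.1, 3.7, 4.4]   # hypothetical sample
low, high = bootstrap_ci(data)
print(f"95% bootstrap CI for the mean: ({low:.2f}, {high:.2f})")
```

The only machinery needed is resampling and counting, which is exactly why the approach works without knowing the underlying distribution.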

  1. Auditory Perception of Statistically Blurred Sound Textures

    DEFF Research Database (Denmark)

    McWalter, Richard Ian; MacDonald, Ewen; Dau, Torsten

    Sound textures have been identified as a category of sounds which are processed by the peripheral auditory system and captured with running time-averaged statistics. Although sound textures are temporally homogeneous, they offer a listener enough information to identify and differentiate...... sources. This experiment investigated the ability of the auditory system to identify statistically blurred sound textures and the perceptual relationship between sound textures. Identification performance of statistically blurred sound textures presented at a fixed blur increased over those presented...

  2. STATISTICAL ANALYSIS OF MONETARY POLICY INDICATORS VARIABILITY

    Directory of Open Access Journals (Sweden)

    ANAMARIA POPESCU

    2016-10-01

    This paper attempts to characterize, through statistical indicators, the statistical data available. Its purpose is to present statistical indicators, primary and secondary, simple and synthetic, that are frequently used for the statistical characterization of statistical series. We can thus analyse central tendency, data variability, and the form and concentration of distributions, using analytical tools in Microsoft Excel that enable automatic calculation of descriptive statistics via the Data Analysis option on the Tools menu. The links between statistical variables can be studied using two techniques, correlation and regression. From the analysis of monetary policy in the period 2003 - 2014 and information provided by the website of the National Bank of Romania (BNR), there seems to be a certain tendency towards eccentricity and asymmetry in the financial data series.

  3. Thiele. Pioneer in statistics

    DEFF Research Database (Denmark)

    Lauritzen, Steffen Lilholt

    This book studies the brilliant Danish 19th Century astronomer, T.N. Thiele who made important contributions to statistics, actuarial science, astronomy and mathematics. The most important of these contributions in statistics are translated into English for the first time, and the text includes c...

  4. Applied Statistics with SPSS

    Science.gov (United States)

    Huizingh, Eelko K. R. E.

    2007-01-01

    Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…

  5. Cancer Statistics Animator

    Science.gov (United States)

    This tool allows users to animate cancer trends over time by cancer site and cause of death, race, and sex. Provides access to incidence, mortality, and survival. Select the type of statistic, variables, format, and then extract the statistics in a delimited format for further analyses.

  6. Principles of medical statistics

    National Research Council Canada - National Science Library

    Feinstein, Alvan R

    2002-01-01

    ... or limited attention. They are then offered a simple, superficial account of the most common doctrines and applications of statistical theory. The "get-it-over-with-quickly" approach has been encouraged and often necessitated by the short time given to statistics in modern biomedical education. The curriculum is supposed to provide fundament...

  7. Statistics 101 for Radiologists.

    Science.gov (United States)

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
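
    Several of the diagnostic-test measures the review covers follow directly from a 2x2 confusion matrix. A minimal sketch with hypothetical counts (the numbers are invented, not taken from the article):

```python
# Hypothetical confusion-matrix counts for a diagnostic test
# (invented for illustration, not from the article)
tp, fp, fn, tn = 90, 15, 10, 185

sensitivity = tp / (tp + fn)                   # P(test+ | disease present)
specificity = tn / (tn + fp)                   # P(test- | disease absent)
accuracy    = (tp + tn) / (tp + fp + fn + tn)  # overall fraction correct
lr_positive = sensitivity / (1 - specificity)  # positive likelihood ratio
lr_negative = (1 - sensitivity) / specificity  # negative likelihood ratio

print(sensitivity, specificity, round(accuracy, 3), round(lr_positive, 1))
```

    Sweeping the test's decision threshold and plotting sensitivity against (1 - specificity) at each point yields the receiver operating characteristic curve discussed in the review.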

  8. Statistical Engine Knock Control

    DEFF Research Database (Denmark)

    Stotsky, Alexander A.

    2008-01-01

    A new statistical concept of the knock control of a spark ignition automotive engine is proposed. The control aim is associated with the statistical hypothesis test which compares the threshold value to the average value of the maximal amplitude of the knock sensor signal at a given frequency...

  9. Practical statistics simply explained

    CERN Document Server

    Langley, Dr Russell A

    1971-01-01

    For those who need to know statistics but shy away from math, this book teaches how to extract truth and draw valid conclusions from numerical data using logic and the philosophy of statistics rather than complex formulae. Lucid discussion of averages and scatter, investigation design, more. Problems with solutions.

  10. Thiele. Pioneer in statistics

    DEFF Research Database (Denmark)

    Lauritzen, Steffen Lilholt

    This book studies the brilliant Danish 19th Century astronomer, T.N. Thiele who made important contributions to statistics, actuarial science, astronomy and mathematics. The most important of these contributions in statistics are translated into English for the first time, and the text includes...

  11. Enhancing statistical literacy

    NARCIS (Netherlands)

    Droogers, M.J.S.|info:eu-repo/dai/nl/413392252; Drijvers, P.H.M.|info:eu-repo/dai/nl/074302922

    2017-01-01

    Current secondary school statistics curricula focus on procedural knowledge and pay too little attention to statistical reasoning. As a result, students are not able to apply their knowledge to practice. In addition, education often targets the average student, which may lead to gifted students

  12. Water Quality Statistics

    Science.gov (United States)

    Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain

    2004-01-01

    Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses…

  13. Bayesian statistical inference

    Directory of Open Access Journals (Sweden)

    Bruno De Finetti

    2017-04-01

    Full Text Available This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993.Bayesian statistical Inference is one of the last fundamental philosophical papers in which we can find the essential De Finetti's approach to the statistical inference.

  14. Practical statistics for educators

    CERN Document Server

    Ravid, Ruth

    2014-01-01

    Practical Statistics for Educators, Fifth Edition, is a clear and easy-to-follow text written specifically for education students in introductory statistics courses and in action research courses. It is also a valuable resource and guidebook for educational practitioners who wish to study their own settings.

  15. Using Microsoft Excel to Generate Usage Statistics

    Science.gov (United States)

    Spellman, Rosemary

    2011-01-01

    At the Libraries Service Center, statistics are generated on a monthly, quarterly, and yearly basis by using four Microsoft Excel workbooks. These statistics provide information about what materials are being requested and by whom. They also give details about why certain requests may not have been filled. Utilizing Excel allows for a shallower…

  16. Statistical Power in Meta-Analysis

    Science.gov (United States)

    Liu, Jin

    2015-01-01

    Statistical power is important in a meta-analysis study, although few studies have examined the performance of simulated power in meta-analysis. The purpose of this study is to inform researchers about statistical power estimation on two sample mean difference test under different situations: (1) the discrepancy between the analytical power and…

  17. Improving Fuel Statistics for Danish Aviation

    DEFF Research Database (Denmark)

    Winther, M.

    Islands, obtained with the NERI model. In addition a complete overview of the aviation fuel use from the two latter areas is given, based on fuel sale information from Statistics Greenland and Statistics Faroe Islands, and fuel use data from airline companies. The fuel use figures are presented on a level...

  18. Clinical Decision Support: Statistical Hopes and Challenges

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan; Zvárová, Jana

    2016-01-01

    Roč. 4, č. 1 (2016), s. 30-34 ISSN 1805-8698 Grant - others: Nadační fond na podporu vědy (CZ) Neuron Institutional support: RVO:67985807 Keywords: decision support * data mining * multivariate statistics * psychiatry * information based medicine Subject RIV: BB - Applied Statistics, Operational Research

  19. Tourette Syndrome (TS): Data and Statistics

    Science.gov (United States)

    ... The data ... Behavioral or conduct problems, 26%; Anxiety problems, 49%; Depression, 25%; Autism spectrum disorder, 35%; Learning disability, 47%; ...

  20. SEER Statistics | DCCPS/NCI/NIH

    Science.gov (United States)

    The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.

  1. Introduction to Bayesian statistics

    CERN Document Server

    Bolstad, William M

    2017-01-01

    There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for Multivariate Normal mean vector; Bayesian inference for Multiple Linear Regression Model; and Computati...

  2. Understanding advanced statistical methods

    CERN Document Server

    Westfall, Peter

    2013-01-01

    Introduction: Probability, Statistics, and Science; Reality, Nature, Science, and Models; Statistical Processes: Nature, Design and Measurement, and Data; Models; Deterministic Models; Variability; Parameters; Purely Probabilistic Statistical Models; Statistical Models with Both Deterministic and Probabilistic Components; Statistical Inference; Good and Bad Models; Uses of Probability Models; Random Variables and Their Probability Distributions; Introduction; Types of Random Variables: Nominal, Ordinal, and Continuous; Discrete Probability Distribution Functions; Continuous Probability Distribution Functions; Some Calculus: Derivatives and Least Squares; More Calculus: Integrals and Cumulative Distribution Functions; Probability Calculation and Simulation; Introduction; Analytic Calculations, Discrete and Continuous Cases; Simulation-Based Approximation; Generating Random Numbers; Identifying Distributions; Introduction; Identifying Distributions from Theory Alone; Using Data: Estimating Distributions via the Histogram; Quantiles: Theoretical and Data-Based Estimate...

  3. Statistical Plasma Physics

    CERN Document Server

    Ichimaru, Setsuo

    2004-01-01

    Plasma physics is an integral part of statistical physics, complete with its own basic theories. Designed as a two-volume set, Statistical Plasma Physics is intended for advanced undergraduate and beginning graduate courses on plasma and statistical physics, and as such, its presentation is self-contained and should be read without difficulty by those with backgrounds in classical mechanics, electricity and magnetism, quantum mechanics, and statistics. Major topics include: plasma phenomena in nature, kinetic equations, plasmas and dielectric media, electromagnetic properties of Vlasov plasmas in thermodynamic equilibria, transient processes, and instabilities. Statistical Plasma Physics, Volume II, treats subjects in the field of condensed plasma physics, with applications to condensed matter physics, atomic physics, nuclear physics, and astrophysics. The aim of this book is to elucidate a number of basic topics in physics of dense plasmas that interface with condensed matter physics, atomic physics, nuclear...

  4. Powerful Statistical Inference for Nested Data Using Sufficient Summary Statistics.

    Science.gov (United States)

    Dowding, Irene; Haufe, Stefan

    2018-01-01

    Hierarchically-organized data arise naturally in many psychology and neuroscience studies. As the standard assumption of independent and identically distributed samples does not hold for such data, two important problems are to accurately estimate group-level effect sizes, and to obtain powerful statistical tests against group-level null hypotheses. A common approach is to summarize subject-level data by a single quantity per subject, which is often the mean or the difference between class means, and treat these as samples in a group-level t -test. This "naive" approach is, however, suboptimal in terms of statistical power, as it ignores information about the intra-subject variance. To address this issue, we review several approaches to deal with nested data, with a focus on methods that are easy to implement. With what we call the sufficient-summary-statistic approach, we highlight a computationally efficient technique that can improve statistical power by taking into account within-subject variances, and we provide step-by-step instructions on how to apply this approach to a number of frequently-used measures of effect size. The properties of the reviewed approaches and the potential benefits over a group-level t -test are quantitatively assessed on simulated data and demonstrated on EEG data from a simulated-driving experiment.
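
    The contrast between the "naive" subject-means t-test and a variance-aware summary can be sketched as follows. The inverse-variance weighting shown is a simplifying assumption for illustration, not the paper's exact derivation, and all data are simulated:

```python
import math
import random
import statistics

random.seed(0)

# Simulate nested data: 8 subjects, each with a different number of trials.
# The true subject-level effect is 0.5; within-subject noise varies by subject.
subjects = []
for _ in range(8):
    n_trials = random.choice([20, 40, 80])
    noise = random.uniform(0.5, 2.0)
    subjects.append([0.5 + random.gauss(0, noise) for _ in range(n_trials)])

# "Naive" approach: one summary value (the mean) per subject,
# then a one-sample t-test against zero at the group level.
means = [statistics.fmean(trials) for trials in subjects]
m, sd, n = statistics.fmean(means), statistics.stdev(means), len(means)
t_naive = m / (sd / math.sqrt(n))

# Variance-aware summary (illustrative assumption): weight each subject's
# mean by the precision of its estimate, n_trials / within-subject variance,
# so noisy or short recordings contribute less to the group-level estimate.
weights = [len(tr) / statistics.variance(tr) for tr in subjects]
pooled = sum(w * mu for w, mu in zip(weights, means)) / sum(weights)

print(round(t_naive, 2), round(pooled, 3))
```

    The naive t-statistic treats every subject mean as equally reliable; the weighted estimate uses the intra-subject variance information that the abstract notes the naive approach ignores.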

  5. Powerful Statistical Inference for Nested Data Using Sufficient Summary Statistics

    Science.gov (United States)

    Dowding, Irene; Haufe, Stefan

    2018-01-01

    Hierarchically-organized data arise naturally in many psychology and neuroscience studies. As the standard assumption of independent and identically distributed samples does not hold for such data, two important problems are to accurately estimate group-level effect sizes, and to obtain powerful statistical tests against group-level null hypotheses. A common approach is to summarize subject-level data by a single quantity per subject, which is often the mean or the difference between class means, and treat these as samples in a group-level t-test. This “naive” approach is, however, suboptimal in terms of statistical power, as it ignores information about the intra-subject variance. To address this issue, we review several approaches to deal with nested data, with a focus on methods that are easy to implement. With what we call the sufficient-summary-statistic approach, we highlight a computationally efficient technique that can improve statistical power by taking into account within-subject variances, and we provide step-by-step instructions on how to apply this approach to a number of frequently-used measures of effect size. The properties of the reviewed approaches and the potential benefits over a group-level t-test are quantitatively assessed on simulated data and demonstrated on EEG data from a simulated-driving experiment. PMID:29615885

  6. Statistical mechanics in JINR

    International Nuclear Information System (INIS)

    Tonchev, N.; Shumovskij, A.S.

    1986-01-01

    The history of investigations, conducted at the JINR in the field of statistical mechanics, beginning with the fundamental works by Bogolyubov N.N. on superconductivity microscopic theory is presented. Ideas, introduced in these works and methods developed in them, have largely determined the ways for developing statistical mechanics in the JINR and Hartree-Fock-Bogolyubov variational principle has become an important method of the modern nucleus theory. A brief review of the main achievements, connected with the development of statistical mechanics methods and their application in different fields of physical science is given

  7. Statistics a complete introduction

    CERN Document Server

    Graham, Alan

    2013-01-01

    Statistics: A Complete Introduction is the most comprehensive yet easy-to-use introduction to using Statistics. Written by a leading expert, this book will help you if you are studying for an important exam or essay, or if you simply want to improve your knowledge. The book covers all the key areas of Statistics including graphs, data interpretation, spreadsheets, regression, correlation and probability. Everything you will need is here in this one book. Each chapter includes not only an explanation of the knowledge and skills you need, but also worked examples and test questions.

  8. Evolutionary Statistical Procedures

    CERN Document Server

    Baragona, Roberto; Poli, Irene

    2011-01-01

    This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a

  9. Statistical physics; Physique statistique

    Energy Technology Data Exchange (ETDEWEB)

    Couture, L.; Zitoun, R. [Universite Pierre et Marie Curie, 75 - Paris (France)

    1992-12-31

    The basis of statistical physics is exposed. The statistical models of Maxwell-Boltzmann, of Bose-Einstein and of Fermi-Dirac and their particular application fields are presented. The statistical theory is applied in different ranges of physics: gas characteristics, paramagnetism, crystal thermal properties and solid electronic properties. A whole chapter is dedicated to helium and its characteristics such as superfluidity, another deals with superconductivity. Superconductivity is presented both experimentally and theoretically. Meissner effect and Josephson effect are described and the framework of BCS theory is drawn. (A.C.)

  10. The nature of statistics

    CERN Document Server

    Wallis, W Allen

    2014-01-01

    Focusing on everyday applications as well as those of scientific research, this classic of modern statistical methods requires little to no mathematical background. Readers develop basic skills for evaluating and using statistical data. Lively, relevant examples include applications to business, government, social and physical sciences, genetics, medicine, and public health. "W. Allen Wallis and Harry V. Roberts have made statistics fascinating." - The New York Times "The authors have set out with considerable success, to write a text which would be of interest and value to the student who,

  11. Statistical deception at work

    CERN Document Server

    Mauro, John

    2013-01-01

    Written to reveal statistical deceptions often thrust upon unsuspecting journalists, this book views the use of numbers from a public perspective. Illustrating how the statistical naivete of journalists often nourishes quantitative misinformation, the author's intent is to make journalists more critical appraisers of numerical data so that in reporting them they do not deceive the public. The book frequently uses actual reported examples of misused statistical data reported by mass media and describes how journalists can avoid being taken in by them. Because reports of survey findings seldom g

  12. Methods of statistical physics

    CERN Document Server

    Akhiezer, Aleksandr I

    1981-01-01

    Methods of Statistical Physics is an exposition of the tools of statistical mechanics, which evaluates the kinetic equations of classical and quantized systems. The book also analyzes the equations of macroscopic physics, such as the equations of hydrodynamics for normal and superfluid liquids and macroscopic electrodynamics. The text gives particular attention to the study of quantum systems. This study begins with a discussion of problems of quantum statistics with a detailed description of the basics of quantum mechanics along with the theory of measurement. An analysis of the asymptotic be

  13. Statistical Group Comparison

    CERN Document Server

    Liao, Tim Futing

    2011-01-01

    An incomparably useful examination of statistical methods for comparisonThe nature of doing science, be it natural or social, inevitably calls for comparison. Statistical methods are at the heart of such comparison, for they not only help us gain understanding of the world around us but often define how our research is to be carried out. The need to compare between groups is best exemplified by experiments, which have clearly defined statistical methods. However, true experiments are not always possible. What complicates the matter more is a great deal of diversity in factors that are not inde

  14. Statistical downscaling of river flows

    Science.gov (United States)

    Tisseuil, Clement; Vrac, Mathieu; Lek, Sovan; Wade, Andrew J.

    2010-05-01

    An extensive statistical 'downscaling' study is done to relate large-scale climate information from a general circulation model (GCM) to local-scale river flows in SW France for 51 gauging stations ranging from nival (snow-dominated) to pluvial (rainfall-dominated) river-systems. This study helps to select the appropriate statistical method at a given spatial and temporal scale to downscale hydrology for future climate change impact assessment of hydrological resources. The four proposed statistical downscaling models use large-scale predictors (derived from climate model outputs or reanalysis data) that characterize precipitation and evaporation processes in the hydrological cycle to estimate summary flow statistics. The four statistical models used are generalized linear (GLM) and additive (GAM) models, aggregated boosted trees (ABT) and multi-layer perceptron neural networks (ANN). These four models were each applied at two different spatial scales, namely at that of a single flow-gauging station (local downscaling) and that of a group of flow-gauging stations having the same hydrological behaviour (regional downscaling). For each statistical model and each spatial resolution, three temporal resolutions were considered, namely the daily mean flows, the summary statistics of fortnightly flows and a daily 'integrated approach'. The results show that flow sensitivity to atmospheric factors is significantly different between nival and pluvial hydrological systems which are mainly influenced, respectively, by shortwave solar radiations and atmospheric temperature. The non-linear models (i.e. GAM, ABT and ANN) performed better than the linear GLM when simulating fortnightly flow percentiles. The aggregated boosted trees method showed higher and less variable R2 values to downscale the hydrological variability in both nival and pluvial regimes. Based on GCM cnrm-cm3 and scenarios A2 and A1B, future relative changes of fortnightly median flows were projected

  15. Infant Statistical Learning

    Science.gov (United States)

    Saffran, Jenny R.; Kirkham, Natasha Z.

    2017-01-01

    Perception involves making sense of a dynamic, multimodal environment. In the absence of mechanisms capable of exploiting the statistical patterns in the natural world, infants would face an insurmountable computational problem. Infant statistical learning mechanisms facilitate the detection of structure. These abilities allow the infant to compute across elements in their environmental input, extracting patterns for further processing and subsequent learning. In this selective review, we summarize findings that show that statistical learning is both a broad and flexible mechanism (supporting learning from different modalities across many different content areas) and input specific (shifting computations depending on the type of input and goal of learning). We suggest that statistical learning not only provides a framework for studying language development and object knowledge in constrained laboratory settings, but also allows researchers to tackle real-world problems, such as multilingualism, the role of ever-changing learning environments, and differential developmental trajectories. PMID:28793812

  16. School Violence: Data & Statistics

    Science.gov (United States)

    ... The first step in preventing school violence is to understand the extent and nature ...

  17. Medicaid Drug Claims Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Drug Claims Statistics CD is a useful tool that conveniently breaks up Medicaid claim counts and separates them by quarter and includes an annual count.

  18. Basics of statistical physics

    CERN Document Server

    Müller-Kirsten, Harald J W

    2013-01-01

    Statistics links microscopic and macroscopic phenomena, and requires for this reason a large number of microscopic elements like atoms. The results are values of maximum probability or of averaging. This introduction to statistical physics concentrates on the basic principles, and attempts to explain these in simple terms supplemented by numerous examples. These basic principles include the difference between classical and quantum statistics, a priori probabilities as related to degeneracies, the vital aspect of indistinguishability as compared with distinguishability in classical physics, the differences between conserved and non-conserved elements, the different ways of counting arrangements in the three statistics (Maxwell-Boltzmann, Fermi-Dirac, Bose-Einstein), the difference between maximization of the number of arrangements of elements, and averaging in the Darwin-Fowler method. Significant applications to solids, radiation and electrons in metals are treated in separate chapters, as well as Bose-Eins...

  19. Statistics: a Bayesian perspective

    National Research Council Canada - National Science Library

    Berry, Donald A

    1996-01-01

    ...: it is the only introductory textbook based on Bayesian ideas, it combines concepts and methods, it presents statistics as a means of integrating data into the significant process, it develops ideas...

  20. Statistical Measures of Marksmanship

    National Research Council Canada - National Science Library

    Johnson, Richard

    2001-01-01

    .... This report describes objective statistical procedures to measure both rifle marksmanship accuracy, the proximity of an array of shots to the center of mass of a target, and marksmanship precision...

  1. Statistical mechanics of superconductivity

    CERN Document Server

    Kita, Takafumi

    2015-01-01

    This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensable for understanding it. They include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau’s Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...

  2. CDC WONDER: Cancer Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The United States Cancer Statistics (USCS) online databases in WONDER provide cancer incidence and mortality data for the United States for the years since 1999, by...

  3. Statistical theory of heat

    CERN Document Server

    Scheck, Florian

    2016-01-01

    Scheck’s textbook starts with a concise introduction to classical thermodynamics, including geometrical aspects. Then a short introduction to probabilities and statistics lays the basis for the statistical interpretation of thermodynamics. Phase transitions, discrete models and the stability of matter are explained in great detail. Thermodynamics has a special role in theoretical physics. Due to the general approach of thermodynamics the field has a bridging function between several areas like the theory of condensed matter, elementary particle physics, astrophysics and cosmology. The classical thermodynamics describes predominantly averaged properties of matter, reaching from few particle systems and state of matter to stellar objects. Statistical Thermodynamics covers the same fields, but explores them in greater depth and unifies classical statistical mechanics with quantum theory of multiple particle systems. The content is presented as two tracks: the fast track for master students, providing the essen...

  4. Statistics For Neuroscientists

    Directory of Open Access Journals (Sweden)

    Subbakrishna D.K

    2000-01-01

    Full Text Available The role statistical methods play in medicine in the interpretation of empirical data is well recognized by researchers. With modern computing facilities and software packages there is little need for familiarity with the computational details of statistical calculations. However, for the researcher to understand whether these calculations are valid and appropriate it is necessary that the user is aware of the rudiments of the statistical methodology. Also, it needs to be emphasized that no amount of advanced analysis can be a substitute for a properly planned and executed study. An attempt is made in this communication to discuss some of the theoretical issues that are important for the valid analysis and interpretation of precious data that are gathered. The article summarises some of the basic statistical concepts followed by illustrations from live data generated from various research projects from the department of Neurology of this Institute.

  5. Elements of statistical thermodynamics

    CERN Document Server

    Nash, Leonard K

    2006-01-01

    Encompassing essentially all aspects of statistical mechanics that appear in undergraduate texts, this concise, elementary treatment shows how an atomic-molecular perspective yields new insights into macroscopic thermodynamics. 1974 edition.

  6. Ehrlichiosis: Statistics and Epidemiology

    Science.gov (United States)

    ... Holman RC, McQuiston JH, Krebs JW, Swerdlow DL. Epidemiology of human ehrlichiosis and anaplasmosis in the United ...

  7. Anaplasmosis: Statistics and Epidemiology

    Science.gov (United States)

    ... Holman RC, McQuiston JH, Krebs JW, Swerdlow DL. Epidemiology of human ehrlichiosis and anaplasmosis in the United ...

  8. Boating Accident Statistics

    Data.gov (United States)

    Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...

  9. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  10. Statistical Engine Knock Control

    DEFF Research Database (Denmark)

    Stotsky, Alexander A.

    2008-01-01

    A new statistical concept of the knock control of a spark ignition automotive engine is proposed. The control aim is associated with the statistical hypothesis test which compares the threshold value to the average value of the maximal amplitude of the knock sensor signal at a given frequency...... which includes generation of the amplitude signals, a threshold value determination and a knock sound model is developed for evaluation of the control concept....
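
    The hypothesis test described, comparing an average maximal amplitude against a threshold, might be sketched as a one-sided one-sample t-test; the amplitudes, threshold, and critical value below are invented for illustration and are not from the paper:

```python
import math
import statistics

# Hypothetical maximal knock-sensor amplitudes over a window of engine cycles
amplitudes = [0.82, 0.91, 1.10, 0.77, 0.95, 1.02, 0.88, 0.99, 1.05, 0.93]
threshold = 1.0   # assumed calibration value, for illustration only

# Test H0: mean amplitude <= threshold against H1: mean amplitude > threshold
n = len(amplitudes)
mean = statistics.fmean(amplitudes)
se = statistics.stdev(amplitudes) / math.sqrt(n)
t_stat = (mean - threshold) / se

# Declare knock only if the test statistic exceeds the critical value
# (t ~ 1.833 for a one-sided test at alpha = 0.05 with df = 9)
knock_detected = t_stat > 1.833

print(round(t_stat, 2), knock_detected)
```

    Testing against a threshold, rather than reacting to single amplitude spikes, makes the detection decision robust to cycle-to-cycle noise in the sensor signal.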

  11. Statistical mechanics rigorous results

    CERN Document Server

    Ruelle, David

    1999-01-01

    This classic book marks the beginning of an era of vigorous mathematical progress in equilibrium statistical mechanics. Its treatment of the infinite system limit has not been superseded, and the discussion of thermodynamic functions and states remains basic for more recent work. The conceptual foundation provided by the Rigorous Results remains invaluable for the study of the spectacular developments of statistical mechanics in the second half of the 20th century.

  12. Introductory statistical inference

    CERN Document Server

    Mukhopadhyay, Nitis

    2014-01-01

    This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist

  13. Multilevel statistical models

    CERN Document Server

    Goldstein, Harvey

    2011-01-01

    This book provides a clear introduction to this important area of statistics. The author provides wide coverage of different kinds of multilevel models and of how to interpret different statistical methodologies and algorithms applied to such models. This 4th edition reflects the growth of interest in this area and is updated to include new chapters on multilevel models with mixed response types, smoothing and multilevel data, models with correlated random effects and modeling with variance.

  14. European environmental statistics handbook

    Energy Technology Data Exchange (ETDEWEB)

    Newman, O.; Foster, A. [comps.] [Manchester Business School, Manchester (United Kingdom). Library and Information Service

    1993-12-31

    This book is a compilation of statistical materials on environmental pollution drawn from governmental and private sources. It is divided into ten chapters: air, water and land - monitoring statistics; cities, regions and nations; costs, budgets and expenditures - costs of pollution and its control, including air pollution; effects; general industry and government data; laws and regulations; politics and opinion - including media coverage; pollutants and wastes; pollution control industry; and tools, methods and solutions. 750 tabs.

  15. Applied statistics for social and management sciences

    CERN Document Server

    Miah, Abdul Quader

    2016-01-01

    This book addresses the application of statistical techniques and methods across a wide range of disciplines. While its main focus is on the application of statistical methods, theoretical aspects are also provided as fundamental background information. It offers a systematic interpretation of results often discovered in general descriptions of methods and techniques such as linear and non-linear regression. SPSS is also used in all the application aspects. The presentation of data in the form of tables and graphs throughout the book not only guides users, but also explains the statistical application and assists readers in interpreting important features. The analysis of statistical data is presented consistently throughout the text. Academic researchers, practitioners and other users who work with statistical data will benefit from reading Applied Statistics for Social and Management Sciences.

  16. Statistics for mathematicians a rigorous first course

    CERN Document Server

    Panaretos, Victor M

    2016-01-01

    This textbook provides a coherent introduction to the main concepts and methods of one-parameter statistical inference. Intended for students of Mathematics taking their first course in Statistics, the focus is on Statistics for Mathematicians rather than on Mathematical Statistics. The goal is not to focus on the mathematical/theoretical aspects of the subject, but rather to provide an introduction to the subject tailored to the mindset and tastes of Mathematics students, who are sometimes turned off by the informal nature of Statistics courses. This book can be used as the basis for an elementary semester-long first course on Statistics with a firm sense of direction that does not sacrifice rigor. The deeper goal of the text is to attract the attention of promising Mathematics students.

  17. Solar-climatic statistical study

    Energy Technology Data Exchange (ETDEWEB)

    Bray, R.E.

    1979-02-01

    The Solar-Climatic Statistical Study was performed to provide statistical information on the expected future availability of solar and wind power at various nationwide sites. Historic data (SOLMET), at 26 National Weather Service stations reporting hourly solar insolation and collateral meteorological information, were interrogated to provide an estimate of future trends. Solar data are global radiation incident on a horizontal surface, and wind data represent wind power normal to the air flow. Selected insolation and wind power conditions were investigated for their occurrence and persistence, for defined periods of time, on a monthly basis. Information of this nature is intended as an aid to preliminary planning activities for the design and operation of solar and wind energy utilization and conversion systems. Presented in this volume are probability estimates of solar insolation and wind power, alone and in combination, occurring and persisting at or above specified thresholds, for up to one week, for each of the 26 SOLMET stations. Diurnal variations of wind power were also considered. Selected probability data for each station are presented graphically, and comprehensive plots for all stations are provided on a set of microfiche included in a folder in the back of this volume.

  18. UN Data- Environmental Statistics: Waste

    Data.gov (United States)

    World Wide Human Geography Data Working Group — The Environment Statistics Database contains selected water and waste statistics by country. Statistics on water and waste are based on official statistics supplied...

  20. Exotic statistics on surfaces

    International Nuclear Information System (INIS)

    Imbo, T.D.; March-Russell, J.

    1990-01-01

    We investigate the allowed spectrum of statistics for n identical spinless particles on an arbitrary closed two-manifold M, by using a powerful topological approach to the study of quantum kinematics. On a surface of genus g≥1 statistics other than Bose or Fermi can only be obtained by utilizing multi-component state vectors transforming as an irreducible unitary representation of the fundamental group of the n-particle configuration space. These multi-component (or nonscalar) quantizations allow the possibility of fractional statistics, as well as other exotic, nonfractional statistics some of whose properties we discuss. On an orientable surface of genus g≥0 only anyons with rational statistical parameter θ/π=p/q are allowed, and their number is restricted to be sq-g+1 (s ∈ ℤ). For nonorientable surfaces only θ=0, π are allowed. Finally, we briefly comment on systems of spinning particles and make a comparison with the results for solitons in the O(3)-invariant nonlinear sigma model with space manifold M. (orig.)

  1. Intuitive introductory statistics

    CERN Document Server

    Wolfe, Douglas A

    2017-01-01

    This textbook is designed to give an engaging introduction to statistics and the art of data analysis. The unique scope includes, but also goes beyond, classical methodology associated with the normal distribution. What if the normal model is not valid for a particular data set? This cutting-edge approach provides the alternatives. It is an introduction to the world and possibilities of statistics that uses exercises, computer analyses, and simulations throughout the core lessons. These elementary statistical methods are intuitive. Counting and ranking feature prominently in the text. Nonparametric methods, for instance, are often based on counts and ranks and are very easy to integrate into an introductory course. The ease of computation with advanced calculators and statistical software, both of which factor into this text, allows important techniques to be introduced earlier in the study of statistics. This book's novel scope also includes measuring symmetry with Walsh averages, finding a nonp...

  2. Statistical Hair on Black Holes

    International Nuclear Information System (INIS)

    Strominger, A.

    1996-01-01

    The Bekenstein-Hawking entropy for certain BPS-saturated black holes in string theory has recently been derived by counting internal black hole microstates at weak coupling. We argue that the black hole microstate can be measured by interference experiments even in the strong coupling region where there is clearly an event horizon. Extracting information which is naively behind the event horizon is possible due to the existence of statistical quantum hair carried by the black hole. This quantum hair arises from the arbitrarily large number of discrete gauge symmetries present in string theory. copyright 1996 The American Physical Society

  3. Statistical analysis of JET disruptions

    International Nuclear Information System (INIS)

    Tanga, A.; Johnson, M.F.

    1991-07-01

    In the operation of JET and of any tokamak many discharges are terminated by a major disruption. The disruptive termination of a discharge is usually an unwanted event which may cause damage to the structure of the vessel. In a reactor disruptions are potentially a very serious problem, hence the importance of studying them and devising methods to avoid disruptions. Statistical information has been collected about the disruptions which have occurred at JET over a long span of operations. The analysis is focused on the operational aspects of the disruptions rather than on the underlying physics. (Author)

  4. Characterizing Financial and Statistical Literacy

    DEFF Research Database (Denmark)

    Di Girolamo, Amalia; Harrison, Glenn W.; Lau, Morten

    We characterize the literacy of an individual in a domain by their elicited subjective belief distribution over the possible responses to a question posed in that domain. We consider literacy across several financial, economic and statistical domains. We find considerable demographic heterogeneity...... in the degree of literacy. We also characterize the degree of consistency within a sample about their knowledge, even when that knowledge is imperfect. We show how uncertainty aversion might be a normatively attractive behavior for individuals who have imperfect literacy. Finally, we discuss extensions of our...... approach to characterize financial capability, the consequences of non-literacy, social literacy, and the information content of hypothetical survey measures of literacy....

  5. Aspects of multivariate statistical theory

    CERN Document Server

    Muirhead, Robb J

    2009-01-01

    The Wiley-Interscience Paperback Series consists of selected books that have been made more accessible to consumers in an effort to increase global appeal and general circulation. With these new unabridged softcover volumes, Wiley hopes to extend the lives of these works by making them available to future generations of statisticians, mathematicians, and scientists. "". . . the wealth of material on statistics concerning the multivariate normal distribution is quite exceptional. As such it is a very useful source of information for the general statistician and a must for anyone wanting to pen

  6. Philosophy of statistics

    CERN Document Server

    Forster, Malcolm R

    2011-01-01

    Statisticians and philosophers of science have many common interests but restricted communication with each other. This volume aims to remedy these shortcomings. It provides state-of-the-art research in the area of philosophy of statistics by encouraging numerous experts to communicate with one another without feeling "restricted" by their disciplines or thinking "piecemeal" in their treatment of issues. A second goal of this book is to present work in the field without bias toward any particular statistical paradigm. Broadly speaking, the essays in this Handbook are concerned with problems of induction, statistics and probability. For centuries, foundational problems like induction have been among philosophers' favorite topics; recently, however, non-philosophers have increasingly taken a keen interest in these issues. This volume accordingly contains papers by both philosophers and non-philosophers, including scholars from nine academic disciplines.

  7. Statistics and social critique

    Directory of Open Access Journals (Sweden)

    Alain Desrosières

    2014-07-01

    This paper focuses on the history of the uses of statistics as a tool for social critique. Whereas nowadays statistics are very often conceived as being in the hands of the powerful, there are many historical cases in which they were, on the contrary, used to oppose authority. The author first illustrates Ted Porter's theory that quantification might be a "tool of weakness". He then addresses the use of statistics in the context of labour and living conditions, a resource for the lower classes of society (presenting the theory of statistics of Pelloutier, an anarchist activist). Finally comes the question of the conditions of success of these counter-propositions, discussed through the examples of the new randomized experiments in public policies and of the measurement of the richest 1% of persons.

  8. Per Object statistical analysis

    DEFF Research Database (Denmark)

    2008-01-01

    This RS code is to do Object-by-Object analysis of each Object's sub-objects, e.g. statistical analysis of an object's individual image data pixels. Statistics, such as percentiles (so-called "quartiles"), are derived by the process, but the return can only be a Scene Variable, not an Object Variable. This procedure was developed in order to be able to export objects as ESRI shape data with the 90-percentile of the Hue of each object's pixels as an item in the shape attribute table. This procedure uses a sub-level single-pixel chessboard segmentation, loops for each of the objects, and allows an analysis of the values of the object's pixels in MS-Excel. The shell of the procedure could also be used for purposes other than just the derivation of Object - Sub-object statistics, e.g. rule-based assignment processes.

  9. Statistics for Finance

    DEFF Research Database (Denmark)

    Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard

    Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics that rarely connect concepts to data and books on econometrics and time series analysis that do not cover specific problems related to option valuation. The book discusses applications of financial derivatives pertaining to risk assessment and elimination. The authors cover various statistical and mathematical techniques, including linear and nonlinear time series analysis, stochastic calculus models, stochastic differential equations, Itō’s formula, the Black–Scholes model, the generalized method-of-moments, and the Kalman filter. They explain how these tools are used to price financial derivatives...
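    As an illustration of one of the techniques listed, here is the standard Black–Scholes closed form for a European call. This is the textbook formula, not code taken from the book; the parameter values in the example are arbitrary.

    ```python
    import math

    def norm_cdf(x):
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    def black_scholes_call(S, K, T, r, sigma):
        """Black-Scholes price of a European call: S*N(d1) - K*exp(-rT)*N(d2)."""
        d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
        d2 = d1 - sigma * math.sqrt(T)
        return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

    # At-the-money call: spot 100, strike 100, 1 year, 5% rate, 20% volatility.
    print(round(black_scholes_call(100.0, 100.0, 1.0, 0.05, 0.2), 2))  # 10.45
    ```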

  11. Statistical inferences in phylogeography

    DEFF Research Database (Denmark)

    Nielsen, Rasmus; Beaumont, Mark A

    2009-01-01

    can randomly lead to multiple different genealogies. Likewise, the same gene trees can arise under different demographic models. This problem has led to the emergence of many statistical methods for making phylogeographic inferences. A popular phylogeographic approach based on nested clade analysis...... is challenged by the fact that a certain amount of the interpretation of the data is left to the subjective choices of the user, and it has been argued that the method performs poorly in simulation studies. More rigorous statistical methods based on coalescence theory have been developed. However, these methods...... may also be challenged by computational problems or poor model choice. In this review, we will describe the development of statistical methods in phylogeographic analysis, and discuss some of the challenges facing these methods....

  12. Multivariate Statistical Process Control

    DEFF Research Database (Denmark)

    Kulahci, Murat

    2013-01-01

    As sensor and computer technology continues to improve, it becomes a normal occurrence that we are confronted with high dimensional data sets. As in many areas of industrial statistics, this brings forth various challenges in statistical process control (SPC) and monitoring for which the aim...... is to identify an “out-of-control” state of a process using control charts in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to Hotelling’s T2. For high dimensional data with excessive...... in conjunction with image data are plagued with various challenges beyond the usual ones encountered in current applications. In this presentation we will introduce the basic ideas of SPC and the multivariate control charts commonly used in industry. We will further discuss the challenges the practitioners...
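    The Hotelling-type statistic mentioned above is, in its basic form, a quadratic form in the deviation from the in-control mean. A minimal pure-Python sketch follows; supplying the inverse covariance matrix directly is an assumption made for brevity.

    ```python
    def hotelling_t2(x, mean, cov_inv):
        """Hotelling's T^2 = (x - mean)^T S^{-1} (x - mean), where cov_inv is
        the inverse of the (estimated) in-control covariance matrix S."""
        d = [xi - mi for xi, mi in zip(x, mean)]
        return sum(d[i] * sum(cov_inv[i][j] * d[j] for j in range(len(d)))
                   for i in range(len(d)))

    # With an identity covariance, T^2 reduces to the squared Euclidean distance.
    identity = [[1.0, 0.0], [0.0, 1.0]]
    print(hotelling_t2([1.0, 2.0], [0.0, 0.0], identity))  # 5.0
    ```

    An observation is flagged as out-of-control when its T² exceeds a control limit derived from an F (or chi-square) distribution.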

  13. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are included especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...

  14. Diffeomorphic Statistical Deformation Models

    DEFF Research Database (Denmark)

    Hansen, Michael Sass; Hansen, Mads/Fogtman; Larsen, Rasmus

    2007-01-01

    In this paper we present a new method for constructing diffeomorphic statistical deformation models in arbitrary dimensional images with a nonlinear generative model and a linear parameter space. Our deformation model is a modified version of the diffeomorphic model introduced by Cootes et al....... The modifications ensure that no boundary restriction has to be enforced on the parameter space to prevent folds or tears in the deformation field. For straightforward statistical analysis, principal component analysis and sparse methods, we assume that the parameters for a class of deformations lie on a linear...

  15. Statistics As Principled Argument

    CERN Document Server

    Abelson, Robert P

    2012-01-01

    In this illuminating volume, Robert P. Abelson delves into the too-often dismissed problems of interpreting quantitative data and then presenting them in the context of a coherent story about one's research. Unlike too many books on statistics, this is a remarkably engaging read, filled with fascinating real-life (and real-research) examples rather than with recipes for analysis. It will be of true interest and lasting value to beginning graduate students and seasoned researchers alike. The focus of the book is that the purpose of statistics is to organize a useful argument from quantitative

  16. Radiation counting statistics

    Energy Technology Data Exchange (ETDEWEB)

    Suh, M. Y.; Jee, K. Y.; Park, K. K. [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-08-01

    This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiments. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. 11 refs., 6 figs., 8 tabs. (Author)
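    Under the Poisson model commonly used for counting data, the uncertainty of a count N is √N; the net count rate then follows by standard error propagation. The sketch below is an illustration consistent with, but not taken from, the report, and the numbers in the example are arbitrary.

    ```python
    import math

    def net_count_rate(gross_counts, t_gross, bkg_counts, t_bkg):
        """Net count rate and its standard uncertainty, assuming Poisson
        counting statistics (sigma_N = sqrt(N)) and standard error
        propagation of the gross and background terms."""
        rate = gross_counts / t_gross - bkg_counts / t_bkg
        sigma = math.sqrt(gross_counts / t_gross ** 2 + bkg_counts / t_bkg ** 2)
        return rate, sigma

    # 10000 gross counts in 100 s, 400 background counts in 100 s.
    rate, sigma = net_count_rate(10000, 100.0, 400, 100.0)
    print(rate, round(sigma, 3))  # 96.0 1.02
    ```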

  17. Bayesian statistics an introduction

    CERN Document Server

    Lee, Peter M

    2012-01-01

    Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel
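    The prior-times-likelihood update described above is simplest in conjugate form. Here is a minimal sketch of the Beta-Binomial case; it is a generic illustration, not an example from the book.

    ```python
    def beta_binomial_update(a, b, successes, failures):
        """Conjugate Bayesian update: a Beta(a, b) prior on a success
        probability, combined with binomial data, yields a
        Beta(a + successes, b + failures) posterior."""
        return a + successes, b + failures

    # Uniform prior Beta(1, 1), then observe 7 successes and 3 failures.
    a, b = beta_binomial_update(1, 1, 7, 3)
    print(a, b, a / (a + b))  # posterior Beta(8, 4), posterior mean 2/3
    ```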

  18. Environmental accounting and statistics

    International Nuclear Information System (INIS)

    Bartelmus, P.L.P.

    1992-01-01

    The objective of sustainable development is to integrate environmental concerns with mainstream socio-economic policies. Integrated policies need to be supported by integrated data. Environmental accounting achieves this integration by incorporating environmental costs and benefits into conventional national accounts. Modified accounting aggregates can thus be used in defining and measuring environmentally sound and sustainable economic growth. Further development objectives need to be assessed by more comprehensive, though necessarily less integrative, systems of environmental statistics and indicators. Integrative frameworks for the different statistical systems in the fields of economy, environment and population would facilitate the provision of comparable data for the analysis of integrated development. (author). 19 refs, 2 figs, 2 tabs

  19. Applied nonparametric statistical methods

    CERN Document Server

    Sprent, Peter

    2007-01-01

    While preserving the clear, accessible style of previous editions, Applied Nonparametric Statistical Methods, Fourth Edition reflects the latest developments in computer-intensive methods that deal with intractable analytical problems and unwieldy data sets. Reorganized and with additional material, this edition begins with a brief summary of some relevant general statistical concepts and an introduction to basic ideas of nonparametric or distribution-free methods. Designed experiments, including those with factorial treatment structures, are now the focus of an entire chapter. The text also e

  20. Elementary Statistics Tables

    CERN Document Server

    Neave, Henry R

    2012-01-01

    This book, designed for students taking a basic introductory course in statistical analysis, is far more than just a book of tables. Each table is accompanied by a careful but concise explanation and useful worked examples. Requiring little mathematical background, Elementary Statistics Tables is thus not just a reference book but a positive and user-friendly teaching and learning aid. The new edition contains a new and comprehensive "teach-yourself" section on a simple but powerful approach, now well-known in parts of industry but less so in academia, to analysing and interpreting process dat

  1. Computational statistical mechanics

    CERN Document Server

    Hoover, WG

    1991-01-01

    Computational Statistical Mechanics describes the use of fast computers to simulate the equilibrium and nonequilibrium properties of gases, liquids, and solids at, and away from equilibrium. The underlying theory is developed from basic principles and illustrated by applying it to the simplest possible examples. Thermodynamics, based on the ideal gas thermometer, is related to Gibbs' statistical mechanics through the use of Nosé-Hoover heat reservoirs. These reservoirs use integral feedback to control temperature. The same approach is carried through to the simulation and anal

  2. Radiation counting statistics

    International Nuclear Information System (INIS)

    Suh, M. Y.; Jee, K. Y.; Park, K. K.; Park, Y. J.; Kim, W. H.

    1999-08-01

    This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiment. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. (Author). 11 refs., 8 tabs., 8 figs

  3. Bayes linear statistics, theory & methods

    CERN Document Server

    Goldstein, Michael

    2007-01-01

    Bayesian methods combine information available from data with any prior information available from expert knowledge. The Bayes linear approach follows this path, offering a quantitative structure for expressing beliefs, and systematic methods for adjusting these beliefs, given observational data. The methodology differs from the full Bayesian methodology in that it establishes simpler approaches to belief specification and analysis based around expectation judgements. Bayes Linear Statistics presents an authoritative account of this approach, explaining the foundations, theory, methodology, and practicalities of this important field. The text provides a thorough coverage of Bayes linear analysis, from the development of the basic language to the collection of algebraic results needed for efficient implementation, with detailed practical examples. The book covers:The importance of partial prior specifications for complex problems where it is difficult to supply a meaningful full prior probability specification...

  4. Statistical inference an integrated approach

    CERN Document Server

    Migon, Helio S; Louzada, Francisco

    2014-01-01

    Introduction Information The concept of probability Assessing subjective probabilities An example Linear algebra and probability Notation Outline of the bookElements of Inference Common statistical modelsLikelihood-based functions Bayes theorem Exchangeability Sufficiency and exponential family Parameter elimination Prior Distribution Entirely subjective specification Specification through functional forms Conjugacy with the exponential family Non-informative priors Hierarchical priors Estimation Introduction to decision theoryBayesian point estimation Classical point estimation Empirical Bayes estimation Comparison of estimators Interval estimation Estimation in the Normal model Approximating Methods The general problem of inference Optimization techniquesAsymptotic theory Other analytical approximations Numerical integration methods Simulation methods Hypothesis Testing Introduction Classical hypothesis testingBayesian hypothesis testing Hypothesis testing and confidence intervalsAsymptotic tests Prediction...

  5. Visualizing Summary Statistics and Uncertainty

    KAUST Repository

    Potter, K.

    2010-08-12

    The graphical depiction of uncertainty information is emerging as a problem of great importance. Scientific data sets are not considered complete without indications of error, accuracy, or levels of confidence. The visual portrayal of this information is a challenging task. This work takes inspiration from graphical data analysis to create visual representations that show not only the data value, but also important characteristics of the data including uncertainty. The canonical box plot is reexamined and a new hybrid summary plot is presented that incorporates a collection of descriptive statistics to highlight salient features of the data. Additionally, we present an extension of the summary plot to two dimensional distributions. Finally, a use-case of these new plots is presented, demonstrating their ability to present high-level overviews as well as detailed insight into the salient features of the underlying data distribution. © 2010 The Eurographics Association and Blackwell Publishing Ltd.
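    The descriptive statistics behind the canonical box plot mentioned above form the five-number summary. A minimal sketch follows, using linear-interpolation quartiles, which is one of several common conventions rather than the authors' specific choice.

    ```python
    def five_number_summary(data):
        """Minimum, Q1, median, Q3 and maximum, with quartiles computed by
        linear interpolation between order statistics."""
        xs = sorted(data)
        n = len(xs)

        def quantile(q):
            pos = q * (n - 1)          # fractional position in the sorted data
            lo = int(pos)
            frac = pos - lo
            if frac == 0:
                return xs[lo]
            return xs[lo] * (1 - frac) + xs[lo + 1] * frac

        return xs[0], quantile(0.25), quantile(0.5), quantile(0.75), xs[-1]

    print(five_number_summary([1, 2, 3, 4, 5, 6, 7, 8, 9]))  # (1, 3, 5, 7, 9)
    ```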

  6. Nuclear material statistical accountancy system

    International Nuclear Information System (INIS)

    Argentest, F.; Casilli, T.; Franklin, M.

    1979-01-01

    The statistical accountancy system developed at JRC Ispra is referred to as 'NUMSAS', i.e. Nuclear Material Statistical Accountancy System. The principal feature of NUMSAS is that, in addition to an ordinary material balance calculation, NUMSAS can calculate an estimate of the standard deviation of the measurement error accumulated in the material balance calculation. The purpose of the report is to describe in detail the statistical model on which the standard deviation calculation is based; the computational formula which is used by NUMSAS in calculating the standard deviation; and the information about nuclear material measurements and the plant measurement system which is required as data for NUMSAS. The material balance records require processing and interpretation before the material balance calculation is begun. The material balance calculation is the last of four phases of data processing undertaken by NUMSAS. Each of these phases is implemented by a different computer program. The activities which are carried out in each phase can be summarised as follows: the pre-processing phase; the selection and update phase; the transformation phase; and the computation phase.

  7. Statistics of Local Extremes

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Bierbooms, W.; Hansen, Kurt Schaldemose

    2003-01-01

    A theoretical expression for the probability density function associated with local extremes of a stochastic process is presented. The expression is based on the lower four statistical moments and a bandwidth parameter. The theoretical expression is subsequently verified by comparison with simulated...

  8. Beyond quantum microcanonical statistics

    International Nuclear Information System (INIS)

    Fresch, Barbara; Moro, Giorgio J.

    2011-01-01

    Descriptions of molecular systems usually refer to two distinct theoretical frameworks. On the one hand the quantum pure state, i.e., the wavefunction, of an isolated system is determined to calculate molecular properties and their time evolution according to the unitary Schroedinger equation. On the other hand a mixed state, i.e., a statistical density matrix, is the standard formalism to account for thermal equilibrium, as postulated in the microcanonical quantum statistics. In the present paper an alternative treatment relying on a statistical analysis of the possible wavefunctions of an isolated system is presented. In analogy with the classical ergodic theory, the time evolution of the wavefunction determines the probability distribution in the phase space pertaining to an isolated system. However, this alone cannot account for a well defined thermodynamical description of the system in the macroscopic limit, unless a suitable probability distribution for the quantum constants of motion is introduced. We present a workable formalism assuring the emergence of typical values of thermodynamic functions, such as the internal energy and the entropy, in the large size limit of the system. This allows the identification of macroscopic properties independently of the specific realization of the quantum state. A description of material systems in agreement with equilibrium thermodynamics is then derived without constraints on the physical constituents and interactions of the system. Furthermore, the canonical statistics is recovered in all generality for the reduced density matrix of a subsystem.

  9. Swiss electricity statistics 1982

    International Nuclear Information System (INIS)

    1983-01-01

    The Swiss Department of Energy has published electricity statistics for 1982. This report presents them in tabular form. The tables are classified under the following headings: important reference numbers, Swiss electricity review, production of electrical energy, use of electrical energy, load diagrams and coping with user requirements, import and export of energy 1982, possible building of power stations before 1989, finance, appendix

  10. Simple Statistics: - Summarized!

    Science.gov (United States)

    Blai, Boris, Jr.

    Statistics is an essential tool for making sound decisions. It is concerned with probability distribution models, testing of hypotheses, significance tests, and other means of determining the correctness of deductions and the most likely outcome of decisions. Measures of central tendency include the mean, median and mode. A second…
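
    The measures of central tendency mentioned above can be computed directly with Python's standard library; a minimal illustration:

    ```python
    import statistics

    data = [2, 3, 3, 5, 7, 10]
    mean = statistics.mean(data)      # arithmetic average of the values
    median = statistics.median(data)  # middle value (average of the two middle
                                      # values for an even-length sample)
    mode = statistics.mode(data)      # most frequently occurring value
    ```

    For this sample the mean is 5, the median is 4, and the mode is 3, showing how the three measures can disagree on skewed data.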

  11. Minnesota forest statistics, 1990.

    Science.gov (United States)

    Patrick D. Miles; Chung M. Chen

    1992-01-01

    The fifth inventory of Minnesota's forests reports 51.0 million acres of land, of which 16.7 million acres are forested. This bulletin presents statistical highlights and contains detailed tables of forest area, as well as timber volume, growth, removals, mortality, and ownership.

  12. The Bayesian Score Statistic

    NARCIS (Netherlands)

    Kleibergen, F.R.; Kleijn, R.; Paap, R.

    2000-01-01

    We propose a novel Bayesian test under a (noninformative) Jeffreys' prior specification. We check whether the fixed scalar value of the so-called Bayesian Score Statistic (BSS) under the null hypothesis is a plausible realization from its known and standardized distribution under the alternative. Unlike

  13. SAPS, Crime statistics

    African Journals Online (AJOL)

    …a 'perverse incentive' to under-record violent crime, particularly the various forms of assault. In effect this has rendered the SAPS statistics for inter-personal… (Gould and Burger are senior researchers in the Crime and Justice Programme of the ISS.)

  14. Bolivia; Statistical Annex

    OpenAIRE

    International Monetary Fund

    1995-01-01

    This paper provides statistical data of macroeconomic flows, national accounts, production, and employment, combined public sector, financial sector, and external sector. They are listed as follows: gross domestic product by expenditure, mining reserves and production, investment in petroleum exploration, consumer prices, public sector employment, operations of the central government, monetary surveys, selected interest rates, open market bills, balance of payments, exports by principal produ...

  15. Indiana forest statistics.

    Science.gov (United States)

    W. Brad Smith; Mark F. Golitz

    1988-01-01

    The third inventory of Indiana's timber resource shows that timberland area in Indiana climbed from 3.9 to 4.3 million acres between 1967 and 1986, an increase of more than 10%. During the same period growing-stock volume increased 43%. Highlights and statistics are presented on area, volume, growth, mortality, and removals.

  16. Statistical core design

    International Nuclear Information System (INIS)

    Oelkers, E.; Heller, A.S.; Farnsworth, D.A.; Kearfott, K.J.

    1978-01-01

    The report describes the statistical analysis of the DNBR thermal-hydraulic margin of a 3800 MWt, 205-FA core under design overpower conditions. The analysis used LYNX-generated data at predetermined values of the input variables whose uncertainties were to be statistically combined. LYNX data were used to construct an efficient response surface model in the region of interest; the statistical analysis was accomplished through the evaluation of core reliability, utilizing propagation of the uncertainty distributions of the inputs. The response surface model was implemented in both the analytical error propagation and Monte Carlo techniques. The basic structural units relating to the acceptance criteria are fuel pins. Therefore, the statistical population of pins with minimum DNBR values smaller than specified values is determined. The specified values are designated relative to the most probable and maximum design DNBR values on the power-limiting pin used in present design analysis, so that gains over the present design criteria could be assessed for specified probabilistic acceptance criteria. The results are equivalent to gains ranging from 1.2 to 4.8 percent of rated power, depending on the acceptance criterion. The corresponding acceptance criteria range from 95 percent confidence that no pin will be in DNB to 99.9 percent of the pins being expected to avoid DNB.

  17. Fermi–Dirac Statistics

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 19, Issue 1. Fermi–Dirac Statistics. Subhash Chaturvedi and Shyamal Biswas. School of Physics, University of Hyderabad, C R Rao Road, Gachibowli, Hyderabad 500 046, India.

  18. Statistical origin of gravity

    International Nuclear Information System (INIS)

    Banerjee, Rabin; Majhi, Bibhas Ranjan

    2010-01-01

    Starting from the definition of entropy used in statistical mechanics we show that it is proportional to the gravity action. For a stationary black hole this entropy is expressed as S=E/2T, where T is the Hawking temperature and E is shown to be the Komar energy. This relation is also compatible with the generalized Smarr formula for mass.

  19. Search Databases and Statistics

    DEFF Research Database (Denmark)

    Refsgaard, Jan C; Munk, Stephanie; Jensen, Lars J

    2016-01-01

    searches. Additionally, careful filtering and use of appropriate statistical tests on the output datasets affect the quality of all downstream analyses and interpretation of the data. Our considerations and general practices on these aspects of phosphoproteomics data processing are presented here.

  20. The Pleasures of Statistics

    CERN Document Server

    Mosteller, Frederick; Hoaglin, David C; Tanur, Judith M

    2010-01-01

    Includes chapter-length insider accounts of work on the pre-election polls of 1948, statistical aspects of the Kinsey report on sexual behavior in the human male, mathematical learning theory, authorship of the disputed Federalist papers, safety of anesthetics, and an examination of the Coleman report on equality of educational opportunity

  1. Elementary statistical physics

    CERN Document Server

    Kittel, C

    1965-01-01

    This book is intended to help physics students attain a modest working knowledge of several areas of statistical mechanics, including stochastic processes and transport theory. The areas discussed are among those forming a useful part of the intellectual background of a physicist.

  2. Statistics for Learning Genetics

    Science.gov (United States)

    Charles, Abigail Sheena

    2012-01-01

    This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in, doing…

  3. Fisher's Contributions to Statistics

    Indian Academy of Sciences (India)

    research workers of various disciplines in designing their studies and in analysing data thereof. He is also called upon to advise organisations like the ... such visual aids. It is believed that this situation helped him develop a keen geometrical sense. Fisher's contributions to statistics have also given rise to a number of bitter ...

  4. Topics in Statistical Calibration

    Science.gov (United States)

    2014-03-27

    …type of garden cress called nasturtium. The response is weight of the plant in milligrams (mg) after three weeks of growth, and the predictor is the…

  5. Air Carrier Traffic Statistics.

    Science.gov (United States)

    2013-11-01

    This report contains airline operating statistics for large certificated air carriers based on data reported to U.S. Department of Transportation (DOT) by carriers that hold a certificate issued under Section 401 of the Federal Aviation Act of 1958 a...

  6. Statistically Valid Planting Trials

    Science.gov (United States)

    C. B. Briscoe

    1961-01-01

    More than 100 million tree seedlings are planted each year in Latin America, and at least ten times that many should be planted. Rational control and development of a program of such magnitude require establishing and interpreting carefully planned trial plantings which will yield statistically valid answers to real and important questions. Unfortunately, many...

  7. Geometric statistical inference

    International Nuclear Information System (INIS)

    Periwal, Vipul

    1999-01-01

    A reparametrization-covariant formulation of the inverse problem of probability is explicitly solved for finite sample sizes. The inferred distribution is explicitly continuous for finite sample size. A geometric solution of the statistical inference problem in higher dimensions is outlined

  8. On quantum statistical inference

    NARCIS (Netherlands)

    Barndorff-Nielsen, O.E.; Gill, R.D.; Jupp, P.E.

    2001-01-01

    Recent developments in the mathematical foundations of quantum mechanics have brought the theory closer to that of classical probability and statistics. On the other hand, the unique character of quantum physics sets many of the questions addressed apart from those met classically in stochastics.

  9. Whither Statistics Education Research?

    Science.gov (United States)

    Watson, Jane

    2016-01-01

    This year marks the 25th anniversary of the publication of a "National Statement on Mathematics for Australian Schools", which was the first curriculum statement this country had including "Chance and Data" as a significant component. It is hence an opportune time to survey the history of the related statistics education…

  10. Statistical Hadronization and Holography

    DEFF Research Database (Denmark)

    Bechi, Jacopo

    2009-01-01

    In this paper we consider some issues about the statistical model of the hadronization in a holographic approach. We introduce a Rindler like horizon in the bulk and we understand the string breaking as a tunneling event under this horizon. We calculate the hadron spectrum and we get a thermal, a...

  11. Transportation Statistics Annual Report 1997

    Energy Technology Data Exchange (ETDEWEB)

    Fenn, M.

    1997-01-01

    This document is the fourth Transportation Statistics Annual Report (TSAR) prepared by the Bureau of Transportation Statistics (BTS) for the President and Congress. As in previous years, it reports on the state of the U.S. transportation system at two levels. First, in Part I, it provides a statistical and interpretive survey of the system—its physical characteristics, its economic attributes, aspects of its use and performance, and the scale and severity of unintended consequences of transportation, such as fatalities and injuries, oil import dependency, and environmental impacts. Part I also explores the state of transportation statistics, and new needs of the rapidly changing world of transportation. Second, Part II of the report, as in prior years, explores in detail the performance of the U.S. transportation system from the perspective of desired social outcomes or strategic goals. This year, the performance aspect of transportation chosen for thematic treatment is “Mobility and Access,” which complements past TSAR theme sections on “The Economic Performance of Transportation” (1995) and “Transportation and the Environment” (1996). Mobility and access are at the heart of the transportation system’s performance from the user’s perspective. In what ways and to what extent does the geographic freedom provided by transportation enhance personal fulfillment of the nation’s residents and contribute to economic advancement of people and businesses? This broad question underlies many of the topics examined in Part II: What is the current level of personal mobility in the United States, and how does it vary by sex, age, income level, urban or rural location, and over time? What factors explain variations? Has transportation helped improve people’s access to work, shopping, recreational facilities, and medical services, and in what ways and in what locations? How have barriers, such as age, disabilities, or lack of an automobile, affected these

  12. Demystifying EQA statistics and reports.

    Science.gov (United States)

    Coucke, Wim; Soumali, Mohamed Rida

    2017-02-15

    Reports act as an important feedback tool in External Quality Assessment (EQA). Their main role is to score laboratories for their performance in an EQA round. The most common scores that apply to quantitative data are Q- and Z-scores. To calculate these scores, EQA providers need an assigned value and a standard deviation for the sample. Both assigned values and standard deviations can be derived chemically or statistically. When derived statistically, various departures from the normal distribution of the data have to be handled. Various procedures for evaluating laboratories are able to handle these anomalies. Formal tests and graphical representation techniques are discussed and suggestions are given to help choose between the different evaluation techniques. In order to obtain reliable estimates for calculating performance scores, a sufficient number of results is needed. There is no general agreement about the minimal number that is needed. A solution for very small numbers is proposed by changing the limits of evaluation.
Apart from analyte- and sample-specific laboratory evaluation, supplementary information can be obtained by combining results for different analytes and samples. Various techniques are overviewed. It is shown that combining results leads to supplementary information, not only for quantitative, but also for qualitative and semi-quantitative analytes.
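
    A minimal sketch of the Z-score calculation described above, assuming a statistically derived assigned value (here simply the peer median) and a plain sample standard deviation; real EQA schemes typically use robust estimators and scheme-specific evaluation limits:

    ```python
    import statistics

    def z_score(result, assigned, sd):
        """Z-score for an EQA participant: distance of the reported result
        from the assigned value, in units of the standard deviation used
        for performance assessment."""
        return (result - assigned) / sd

    # Hypothetical peer-group results for one sample and analyte.
    peer_results = [4.8, 5.0, 5.1, 5.2, 5.4, 5.0, 4.9]
    assigned = statistics.median(peer_results)  # statistically derived value
    sd = statistics.stdev(peer_results)

    z = z_score(5.6, assigned, sd)  # |z| > 2 is commonly flagged for review
    ```

    With a robust assigned value, a single outlying participant result barely shifts the scores of the rest of the group, which is why robust estimators are preferred when the assigned value is derived from participant data.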

  13. Software Used to Generate Cancer Statistics - SEER Cancer Statistics

    Science.gov (United States)

    Videos that highlight topics and trends in cancer statistics and definitions of statistical terms. Also software tools for analyzing and reporting cancer statistics, which are used to compile SEER's annual reports.

  14. Thermodynamics of statistical inference by cells.

    Science.gov (United States)

    Lang, Alex H; Fisher, Charles K; Mora, Thierry; Mehta, Pankaj

    2014-10-03

    The deep connection between thermodynamics, computation, and information is now well established both theoretically and experimentally. Here, we extend these ideas to show that thermodynamics also places fundamental constraints on statistical estimation and learning. To do so, we investigate the constraints placed by (nonequilibrium) thermodynamics on the ability of biochemical signaling networks to estimate the concentration of an external signal. We show that accuracy is limited by energy consumption, suggesting that there are fundamental thermodynamic constraints on statistical inference.

  15. Statistics and Biomedical Informatics in Forensic Sciences

    Czech Academy of Sciences Publication Activity Database

    Zvárová, Jana

    2009-01-01

    Roč. 20, č. 6 (2009), s. 743-750 ISSN 1180-4009. [TIES 2007. Annual Meeting of the International Environmental Society /18./. Mikulov, 16.08.2007-20.08.2007] Institutional research plan: CEZ:AV0Z10300504 Keywords : biomedical informatics * biomedical statistics * genetic information * forensic dentistry Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.000, year: 2009

  16. Statistical data presentation.

    Science.gov (United States)

    In, Junyong; Lee, Sangseok

    2017-06-01

    Data are usually collected in a raw format and thus the inherent information is difficult to understand. Therefore, raw data need to be summarized, processed, and analyzed. However, no matter how well manipulated, the information derived from the raw data should be presented in an effective format, otherwise, it would be a great loss for both authors and readers. In this article, the techniques of data and information presentation in textual, tabular, and graphical forms are introduced. Text is the principal method for explaining findings, outlining trends, and providing contextual information. A table is best suited for representing individual information and represents both quantitative and qualitative information. A graph is a very effective visual tool as it displays data at a glance, facilitates comparison, and can reveal trends and relationships within the data such as changes over time, frequency distribution, and correlation or relative share of a whole. Text, tables, and graphs for data and information presentation are very powerful communication tools. They can make an article easy to understand, attract and sustain the interest of readers, and efficiently present large amounts of complex information. Moreover, as journal editors and reviewers glance at these presentations before reading the whole article, their importance cannot be ignored.

  17. Some challenges with statistical inference in adaptive designs.

    Science.gov (United States)

    Hung, H M James; Wang, Sue-Jane; Yang, Peiling

    2014-01-01

    Adaptive designs have attracted a great deal of attention in clinical trial communities. The literature contains many statistical methods to deal with the added statistical uncertainties concerning the adaptations. Increasingly encountered in regulatory applications are adaptive statistical information designs, which allow modification of the sample size or related statistical information, and adaptive selection designs, which allow selection of doses or patient populations during the course of a clinical trial. For adaptive statistical information designs, a few statistical testing methods are mathematically equivalent, as a number of articles have stipulated, but arguably there are large differences in their practical ramifications. We pinpoint some undesirable features of these methods in this work. For adaptive selection designs, selection based on biomarker data for testing the correlated clinical endpoints may increase statistical uncertainty in terms of type I error probability, and most importantly, the increased statistical uncertainty may be impossible to assess.

  18. Statistical theory and inference

    CERN Document Server

    Olive, David J

    2014-01-01

    This text is for a one-semester graduate course in statistical theory and covers minimal and complete sufficient statistics, maximum likelihood estimators, method of moments, bias and mean square error, uniform minimum variance estimators and the Cramer-Rao lower bound, an introduction to large sample theory, likelihood ratio tests, and uniformly most powerful tests and the Neyman-Pearson Lemma. A major goal of this text is to make these topics much more accessible to students by using the theory of exponential families. Exponential families, indicator functions and the support of the distribution are used throughout the text to simplify the theory. More than 50 "brand name" distributions are used to illustrate the theory with many examples of exponential families, maximum likelihood estimators and uniformly minimum variance unbiased estimators. There are many homework problems with over 30 pages of solutions.

  19. Statistics of force networks

    Science.gov (United States)

    Tighe, Brian

    2009-03-01

    We study the statistics of contact forces in the force network ensemble, a minimal model of jammed granular media that emphasizes the role of vector force balance. We show that the force probability distribution can be calculated analytically by way of an analogy to equilibrium ensemble methods. In two dimensions the large force tail decays asymptotically as a Gaussian, distinct from earlier predictions, due to the existence of a conserved quantity related to the presence of local vector force balance. We confirm our predictions with highly accurate statistical sampling -- we sample the force distribution over more than 40 decades -- permitting unambiguous confrontation of theory with numerics. We show how the conserved quantity arises naturally within the context of any constant stress ensemble.

  20. Classical and statistical thermodynamics

    CERN Document Server

    Rizk, Hanna A

    2016-01-01

    This is a textbook of thermodynamics for the student who seeks thorough training in science or engineering. Systematic and thorough treatment of the fundamental principles, rather than presentation of a large mass of facts, has been stressed. The book includes some of the historical and humanistic background of thermodynamics, but without affecting the continuity of the analytical treatment. For a clearer and more profound understanding of thermodynamics this book is highly recommended. In this respect, the author believes that a sound grounding in classical thermodynamics is an essential prerequisite for the understanding of statistical thermodynamics. A book comprising these two wide branches of thermodynamics is in fact unprecedented. As a written work dealing systematically with the two main branches of thermodynamics, namely classical thermodynamics and statistical thermodynamics, together with some important indexes, under only one cover, this treatise is eminently useful.

  1. General and Statistical Thermodynamics

    CERN Document Server

    Tahir-Kheli, Raza

    2012-01-01

    This textbook gives a complete account of general and statistical thermodynamics. It begins with an introductory statistical mechanics course, deriving all the important formulae meticulously and explicitly, without mathematical shortcuts. The main part of the book is a careful discussion of the concepts and laws of thermodynamics: the van der Waals, Kelvin and Clausius theories, ideal and real gases, thermodynamic potentials, phonons and all the related aspects. To elucidate the concepts introduced and to provide practical problem-solving support, numerous carefully worked examples are of great value for students. The text is clearly written and punctuated with many interesting anecdotes. This book is written as a main textbook for upper undergraduate students attending a course on thermodynamics.

  2. Applied statistical thermodynamics

    CERN Document Server

    Lucas, Klaus

    1991-01-01

    The book guides the reader from the foundations of statistical thermodynamics, including the theory of intermolecular forces, to modern computer-aided applications in chemical engineering and physical chemistry. The approach is new. The foundations of quantum and statistical mechanics are presented in a simple way and their applications to the prediction of fluid phase behavior of real systems are demonstrated. A particular effort is made to introduce the reader to explicit formulations of intermolecular interaction models and to show how these models influence the properties of fluid systems. The established methods of statistical mechanics - computer simulation, perturbation theory, and numerical integration - are discussed in a style appropriate for newcomers and are extensively applied. Numerous worked examples illustrate how practical calculations should be carried out.

  3. Visuanimation in statistics

    KAUST Repository

    Genton, Marc G.

    2015-04-14

    This paper explores the use of visualization through animations, coined visuanimation, in the field of statistics. In particular, it illustrates the embedding of animations in the paper itself and the storage of larger movies in the online supplemental material. We present results from statistics research projects using a variety of visuanimations, ranging from exploratory data analysis of image data sets to spatio-temporal extreme event modelling; these include a multiscale analysis of classification methods, the study of the effects of a simulated explosive volcanic eruption and an emulation of climate model output. This paper serves as an illustration of visuanimation for future publications in Stat. Copyright © 2015 John Wiley & Sons, Ltd.

  4. Fermions from classical statistics

    International Nuclear Information System (INIS)

    Wetterich, C.

    2010-01-01

    We describe fermions in terms of a classical statistical ensemble. The states τ of this ensemble are characterized by a sequence of values one or zero, or a corresponding set of two-level observables. Every classical probability distribution can be associated to a quantum state for fermions. If the time evolution of the classical probabilities p_τ amounts to a rotation of the wave function q_τ(t) = ±√(p_τ(t)), we infer the unitary time evolution of a quantum system of fermions according to a Schroedinger equation. We establish how such classical statistical ensembles can be mapped to Grassmann functional integrals. Quantum field theories for fermions arise for a suitable time evolution of classical probabilities for generalized Ising models.
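
    The correspondence q_τ(t) = ±√(p_τ(t)) in the abstract can be illustrated numerically: a rotation of the amplitude vector preserves its norm, so the rotated amplitudes squared again form a normalized probability distribution. A minimal sketch (the three-state example and rotation angle are arbitrary, not from the paper):

    ```python
    import numpy as np

    # Classical probabilities p_tau over a few ensemble states, mapped to
    # real amplitudes q_tau = sqrt(p_tau); their squares sum to one.
    p = np.array([0.5, 0.3, 0.2])
    q = np.sqrt(p)

    # A rotation of q (unitary evolution restricted to real amplitudes),
    # here acting in the plane of the first two states.
    theta = 0.4
    R = np.eye(3)
    R[0, 0] = R[1, 1] = np.cos(theta)
    R[0, 1], R[1, 0] = -np.sin(theta), np.sin(theta)

    q_new = R @ q
    p_new = q_new ** 2  # rotated probabilities, still a valid distribution
    ```

    Because the rotation is orthogonal it conserves the sum of squared amplitudes, which is exactly the property that lets a probability-preserving classical evolution be read as a Schroedinger-type unitary evolution.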

  5. 1979 DOE statistical symposium

    International Nuclear Information System (INIS)

    Gardiner, D.A.; Truett, T.

    1980-09-01

    The 1979 DOE Statistical Symposium was the fifth in the series of annual symposia designed to bring together statisticians and other interested parties who are actively engaged in helping to solve the nation's energy problems. The program included presentations of technical papers centered around exploration and disposal of nuclear fuel, general energy-related topics, and health-related issues, and workshops on model evaluation, risk analysis, analysis of large data sets, and resource estimation

  6. 1979 DOE statistical symposium

    Energy Technology Data Exchange (ETDEWEB)

    Gardiner, D.A.; Truett T. (comps. and eds.)

    1980-09-01

    The 1979 DOE Statistical Symposium was the fifth in the series of annual symposia designed to bring together statisticians and other interested parties who are actively engaged in helping to solve the nation's energy problems. The program included presentations of technical papers centered around exploration and disposal of nuclear fuel, general energy-related topics, and health-related issues, and workshops on model evaluation, risk analysis, analysis of large data sets, and resource estimation.

  7. Asymptotics in Quantum Statistics

    OpenAIRE

    Gill, Richard D.

    2004-01-01

    Observations or measurements taken of a quantum system (a small number of fundamental particles) are inherently random. If the state of the system depends on unknown parameters, then the distribution of the outcome depends on these parameters too, and statistical inference problems result. Often one has a choice of what measurement to take, corresponding to different experimental set-ups or settings of measurement apparatus. This leads to a design problem--which measurement is best for a give...

  8. 2002 energy statistics

    International Nuclear Information System (INIS)

    2003-01-01

    This report has 12 chapters. The first chapter covers world energy reserves; the second covers world primary energy production and consumption. The remaining chapters cover world energy prices, energy reserves in Turkey, Turkey's primary energy production and consumption, Turkey's energy balance tables, Turkey's primary energy reserves, production, consumption, imports and exports, sectoral energy consumption, Turkey's secondary electricity plants, Turkey's energy investments, and Turkey's energy prices. This report gives world and Turkey statistics on energy.

  9. READING STATISTICS AND RESEARCH

    OpenAIRE

    Reviewed by Yavuz Akbulut

    2008-01-01

    The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of previous editions might check the interpretation of skewness and ku...

  10. Flaws and fallacies in statistical thinking

    CERN Document Server

    Campbell, Stephen K

    2004-01-01

    This book was written with a dual purpose: first, the author was motivated to relieve his distress over the faulty conclusions drawn from the frequent misuse of relatively simple statistical tools such as percents, graphs, and averages. Second, his objective was to create a nontechnical book that would help people make better-informed decisions by increasing their ability to judge the quality of statistical evidence. This volume achieves both, serving as a supplemental text for students taking their first course in statistics, and as a self-help guide for anyone wishing to evaluate statistica

  11. Monthly bulletin of statistics. June 1995

    International Nuclear Information System (INIS)

    1995-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  12. Monthly bulletin of statistics. March 1994

    International Nuclear Information System (INIS)

    1994-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information.

  13. Monthly Bulletin of Statistics. July 1993

    International Nuclear Information System (INIS)

    1993-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information.

  14. Monthly bulletin of statistics. October 1993

    International Nuclear Information System (INIS)

    1993-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information.

  15. Monthly bulletin of statistics. February 1994

    International Nuclear Information System (INIS)

    1994-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information.

  16. 7th International Workshop on Statistical Simulation

    CERN Document Server

    Mignani, Stefania; Monari, Paola; Salmaso, Luigi

    2014-01-01

    The Department of Statistical Sciences of the University of Bologna in collaboration with the Department of Management and Engineering of the University of Padova, the Department of Statistical Modelling of Saint Petersburg State University, and INFORMS Simulation Society sponsored the Seventh Workshop on Simulation. This international conference was devoted to statistical techniques in stochastic simulation, data collection, analysis of scientific experiments, and studies representing broad areas of interest. The previous workshops took place in St. Petersburg, Russia in 1994, 1996, 1998, 2001, 2005, and 2009. The Seventh Workshop took place in the Rimini Campus of the University of Bologna, which is in Rimini’s historical center.

  17. IBM SPSS statistics 19 made simple

    CERN Document Server

    Gray, Colin D

    2012-01-01

    This new edition of one of the most widely read textbooks in its field introduces the reader to data analysis with the most powerful and versatile statistical package on the market: IBM SPSS Statistics 19. Each new release of SPSS Statistics features new options and other improvements. There remains, however, a core of fundamental operating principles and techniques that apply to all recent releases and that are worth communicating in a small volume. This practical and informal book combines simplicity and clarity of presentation with a comprehensive trea...

  18. Monthly bulletin of statistics. June 2007

    International Nuclear Information System (INIS)

    2007-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information.

  19. Statistics and Data Interpretation for Social Work

    CERN Document Server

    Rosenthal, James

    2011-01-01

    "Without question, this text will be the most authoritative source of information on statistics in the human services. From my point of view, it is a definitive work that combines a rigorous pedagogy with a down to earth (commonsense) exploration of the complex and difficult issues in data analysis (statistics) and interpretation. I welcome its publication.". -Praise for the First Edition. Written by a social worker for social work students, this is a nuts and bolts guide to statistics that presents complex calculations and concepts in clear, easy-to-understand language. It includes

  20. R for statistics

    CERN Document Server

    Cornillon, Pierre-Andre; Husson, Francois; Jegou, Nicolas; Josse, Julie; Kloareg, Maela; Matzner-Lober, Eric; Rouviere, Laurent

    2012-01-01

    Contents: An Overview of R (Main Concepts; Installing R; Work Session; Help; R Objects; Functions; Packages; Exercises); Preparing Data (Reading Data from File; Exporting Results; Manipulating Variables; Manipulating Individuals; Concatenating Data Tables; Cross-Tabulation; Exercises); R Graphics (Conventional Graphical Functions; Graphical Functions with lattice; Exercises); Making Programs with R (Control Flows; Predefined Functions; Creating a Function; Exercises); Statistical Methods (Introduction to the Statistical Methods; A Quick Start with R; Installing R; Opening and Closing R; The Command Prompt; Attribution, Objects, and Function; Selection; Other Rcmdr Package; Importing (or Inputting) Data; Graphs); Statistical Analysis (Hypothesis Test; Confidence Intervals for a Mean; Chi-Square Test of Independence; Comparison of Two Means; Testing Conformity of a Proportion; Comparing Several Proportions; The Power of a Test); Regression (Simple Linear Regression; Multiple Linear Regression; Partial Least Squares (PLS) Regression); Analysis of Variance and Covariance (One-Way Analysis of Variance; Multi-Way Analysis of Varian...
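
    Among the basic methods listed in the contents above is the chi-square test of independence. As a rough illustration of what that test computes (the book itself works in R; this is a minimal stdlib-only Python sketch with made-up counts), the Pearson statistic compares observed cell counts against the counts expected under row/column independence:

    ```python
    import math

    def chi2_independence(table):
        """Pearson chi-square test of independence for a 2-D contingency table.

        Returns (statistic, degrees_of_freedom). For a 2x2 table (df = 1) the
        p-value has the closed form erfc(sqrt(statistic / 2)).
        """
        rows = [sum(r) for r in table]            # row totals
        cols = [sum(c) for c in zip(*table)]      # column totals
        total = sum(rows)
        stat = 0.0
        for i, r in enumerate(rows):
            for j, c in enumerate(cols):
                expected = r * c / total          # count expected under independence
                stat += (table[i][j] - expected) ** 2 / expected
        df = (len(rows) - 1) * (len(cols) - 1)
        return stat, df

    # Hypothetical 2x2 table of observed counts (group x outcome)
    observed = [[20, 30],
                [30, 20]]
    stat, df = chi2_independence(observed)
    p_value = math.erfc(math.sqrt(stat / 2))      # closed form, valid for df = 1 only
    print(round(stat, 3), df, round(p_value, 4))  # → 4.0 1 0.0455
    ```

    With every expected cell count equal to 25, the statistic is 4.0 on 1 degree of freedom, giving p ≈ 0.0455; in R the same result comes from `chisq.test(..., correct = FALSE)`.
    
    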