WorldWideScience

Sample records for include descriptive statistics

  1. Descriptive statistics.

    Science.gov (United States)

    Nick, Todd G

    2007-01-01

    Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
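
    The summary measures named above are straightforward to compute. The short Python sketch below (the measurement values are invented for illustration) reports a measure of location and two measures of spread for a small quantitative variable, using only the standard library.

      import statistics

      # Hypothetical quantitative measurements from a small sample
      values = [4.2, 5.1, 3.8, 6.0, 4.9, 5.5, 4.4, 7.2, 5.0, 4.7]

      mean = statistics.mean(values)              # measure of location
      median = statistics.median(values)          # robust measure of location
      sd = statistics.stdev(values)               # sample standard deviation (spread)
      q1, q2, q3 = statistics.quantiles(values, n=4)  # quartiles; q3 - q1 is the IQR

      print(f"mean={mean:.2f}  median={median:.2f}  SD={sd:.2f}  IQR={q3 - q1:.2f}")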

  2. A Statistical Primer: Understanding Descriptive and Inferential Statistics

    OpenAIRE

    Gillian Byrne

    2007-01-01

    As libraries and librarians move more towards evidence‐based decision making, the data being generated in libraries is growing. Understanding the basics of statistical analysis is crucial for evidence‐based practice (EBP), in order to correctly design and analyze research as well as to evaluate the research of others. This article covers the fundamentals of descriptive and inferential statistics, from hypothesis construction to sampling to common statistical techniques including chi‐square, co...

  3. Practical Statistics for LHC Physicists: Descriptive Statistics, Probability and Likelihood (1/3)

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    These lectures cover those principles and practices of statistics that are most relevant for work at the LHC. The first lecture discusses the basic ideas of descriptive statistics, probability and likelihood. The second lecture covers the key ideas in the frequentist approach, including confidence limits, profile likelihoods, p-values, and hypothesis testing. The third lecture covers inference in the Bayesian approach. Throughout, real-world examples will be used to illustrate the practical application of the ideas. No previous knowledge is assumed.

  4. Descriptive and inferential statistical methods used in burns research.

    Science.gov (United States)

    Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars

    2010-05-01

    Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11 (22%) were randomised controlled trials, 18 (35%) were cohort studies, 11 (22%) were case control studies and 11 (22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49 (96%) articles. Data dispersion was calculated by standard deviation in 30 (59%). Standard error of the mean was quoted in 19 (37%). The statistical software product was named in 33 (65%). Of the 49 articles that used inferential statistics, the tests were named in 47 (96%). The 6 most common tests used (Student's t-test (53%), analysis of variance/co-variance (33%), chi-square test (27%), Wilcoxon & Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43 (88%) and the exact significance levels were reported in 28 (57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. Advice should be sought from professionals
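
    As an illustration of the two most frequently reported tests (Student's t-test and the chi-square test), the sketch below runs both on invented data with SciPy; the group values, table counts and variable meanings are hypothetical and not taken from the surveyed articles.

      from scipy import stats

      # Hypothetical: length of stay (days) in two treatment groups
      group_a = [12, 15, 9, 20, 14, 11, 17]
      group_b = [10, 8, 13, 9, 12, 7, 11]
      t_stat, p_t = stats.ttest_ind(group_a, group_b)             # Student's t-test

      # Hypothetical 2x2 table: complication (yes/no) by dressing type
      table = [[8, 22], [15, 14]]
      chi2, p_chi, dof, expected = stats.chi2_contingency(table)  # chi-square test

      print(f"t-test p={p_t:.3f}, chi-square p={p_chi:.3f}")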

  5. Descriptive statistics: the specification of statistical measures and their presentation in tables and graphs. Part 7 of a series on evaluation of scientific publications.

    Science.gov (United States)

    Spriestersbach, Albert; Röhrig, Bernd; du Prel, Jean-Baptist; Gerhold-Ay, Aslihan; Blettner, Maria

    2009-09-01

    Descriptive statistics are an essential part of biometric analysis and a prerequisite for the understanding of further statistical evaluations, including the drawing of inferences. When data are well presented, it is usually obvious whether the author has collected and evaluated them correctly and in keeping with accepted practice in the field. Statistical variables in medicine may be of either the metric (continuous, quantitative) or categorical (nominal, ordinal) type. Easily understandable examples are given. Basic techniques for the statistical description of collected data are presented and illustrated with examples. The goal of a scientific study must always be clearly defined. The definition of the target value or clinical endpoint determines the level of measurement of the variables in question. Nearly all variables, whatever their level of measurement, can be usefully presented graphically and numerically. The level of measurement determines what types of diagrams and statistical values are appropriate. There are also different ways of presenting combinations of two independent variables graphically and numerically. The description of collected data is indispensable. If the data are of good quality, valid and important conclusions can already be drawn when they are properly described. Furthermore, data description provides a basis for inferential statistics.

  6. Application of descriptive statistics in analysis of experimental data

    OpenAIRE

    Mirilović Milorad; Pejin Ivana

    2008-01-01

    Statistics today represent a group of scientific methods for the quantitative and qualitative investigation of variations in mass appearances. In fact, statistics present a group of methods that are used for the accumulation, analysis, presentation and interpretation of data necessary for reaching certain conclusions. Statistical analysis is divided into descriptive statistical analysis and inferential statistics. The values which represent the results of an experiment, and which are the subj...

  7. An introduction to descriptive statistics: A review and practical guide

    International Nuclear Information System (INIS)

    Marshall, Gill; Jonker, Leon

    2010-01-01

    This paper, the first of two, demonstrates why it is necessary for radiographers to understand basic statistical concepts both to assimilate the work of others and also in their own research work. As the emphasis on evidence-based practice increases, it will become more pressing for radiographers to be able to dissect other people's research and to contribute to research themselves. The different types of data that one can come across are covered here, as well as different ways to describe data. Furthermore, the statistical terminology and methods used that comprise descriptive statistics are explained, including levels of measurement, measures of central tendency (average), and dispersion (spread) and the concept of normal distribution. This paper reviews relevant literature, provides a checklist of points to consider before progressing with the application of appropriate statistical methods to a data set, and provides a glossary of relevant terms for reference.

  8. Thin film description by wavelet coefficients statistics

    Czech Academy of Sciences Publication Activity Database

    Boldyš, Jiří; Hrach, R.

    2005-01-01

    Vol. 55, No. 1 (2005), p. 55-64. ISSN 0011-4626. Grant - others: GA UK(CZ) 173/2003. Institutional research plan: CEZ:AV0Z10750506. Keywords: thin films * wavelet transform * descriptors * histogram model. Subject RIV: BD - Theory of Information. Impact factor: 0.360, year: 2005. http://library.utia.cas.cz/separaty/2009/ZOI/boldys-thin film description by wavelet coefficients statistics .pdf

  9. Statistical analysis and interpretation of prenatal diagnostic imaging studies, Part 2: descriptive and inferential statistical methods.

    Science.gov (United States)

    Tuuli, Methodius G; Odibo, Anthony O

    2011-08-01

    The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.

  10. Statistical descriptions of polydisperse turbulent two-phase flows

    Energy Technology Data Exchange (ETDEWEB)

    Minier, Jean-Pierre, E-mail: jean-pierre.minier@edf.fr

    2016-12-15

    Disperse two-phase flows are flows containing two non-miscible phases where one phase is present as a set of discrete elements dispersed in the second one. These discrete elements, or ‘particles’, can be droplets, bubbles or solid particles having different sizes. This situation encompasses a wide range of phenomena, from nano-particles and colloids sensitive to the molecular fluctuations of the carrier fluid to inertia particles transported by the large-scale motions of turbulent flows and, depending on the phenomenon studied, a broad spectrum of approaches have been developed. The aim of the present article is to analyze statistical models of particles in turbulent flows by addressing this issue as the extension of the classical formulations operating at a molecular or meso-molecular level of description. It has a three-fold purpose: (1) to bring out the thread of continuity between models for discrete particles in turbulent flows (above the hydrodynamical level of description) and classical mesoscopic formulations of statistical physics (below the hydrodynamical level); (2) to reveal the specific challenges met by statistical models in turbulence; (3) to establish a methodology for modeling particle dynamics in random media with non-zero space and time correlations. The presentation is therefore centered on organizing the different approaches, establishing links and clarifying physical foundations. The analysis of disperse two-phase flow models is developed by discussing: first, approaches of classical statistical physics; then, by considering models for single-phase turbulent flows; and, finally, by addressing current formulations for discrete particles in turbulent flows. This brings out that particle-based models do not cease to exist above the hydrodynamical level and offer great interest when combined with proper stochastic formulations to account for the lack of equilibrium distributions and scale separation. In the course of this study, general

  11. Academic Training Lecture | Practical Statistics for LHC Physicists: Descriptive Statistics, Probability and Likelihood | 7-9 April

    CERN Multimedia

    2015-01-01

    Please note that our next series of Academic Training Lectures will take place on 7, 8 and 9 April 2015: Practical Statistics for LHC Physicists: Descriptive Statistics, Probability and Likelihood, by Harrison Prosper, Florida State University, USA, from 11.00 a.m. to 12.00 p.m. in the Council Chamber (503-1-001). https://indico.cern.ch/event/358542/

  12. Statistical mechanics and the description of the early universe I

    DEFF Research Database (Denmark)

    Pessah, Martin Elias; F. Torres, Diego; Vucetich, H.

    2001-01-01

    We analyze how the thermal history of the universe is influenced by the statistical description, assuming a deviation from the usual Bose-Einstein, Fermi-Dirac and Boltzmann-Gibbs distribution functions. These deviations represent the possible appearance of non-extensive effects related with the ... and to place limits to the range of its validity. The corrections obtained will change with temperature, and consequently, the bounds on the possible amount of non-extensivity will also change with time. We generalize results which can be used in other contexts as well, as the Boltzmann equation and the Saha...

  13. Consistent dynamical and statistical description of fission and comparison

    Energy Technology Data Exchange (ETDEWEB)

    Shunuan, Wang [Chinese Nuclear Data Center, Beijing, BJ (China)

    1996-06-01

    The research survey of consistent dynamical and statistical descriptions of fission is briefly introduced. The channel theory of fission with diffusive dynamics, based on the Bohr channel theory of fission and the Fokker-Planck equation, and the Kramers-modified Bohr-Wheeler expression according to the Strutinsky method given by P. Frobrich et al. are compared and analyzed. (2 figs.).

  14. Descriptive Research

    DEFF Research Database (Denmark)

    Wigram, Anthony Lewis

    2003-01-01

    Descriptive research is described by Lathom-Radocy and Radocy (1995) to include survey research, ex post facto research, case studies and developmental studies. Descriptive research also includes a review of the literature in order to provide both quantitative and qualitative evidence of the effect... starts will allow effect size calculations to be made in order to evaluate effect over time. Given the difficulties in undertaking controlled experimental studies in the creative arts therapies, descriptive research methods offer a way of quantifying effect through descriptive statistical analysis...

  15. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    Science.gov (United States)

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix

  16. About the statistical description of gas-liquid flows

    Energy Technology Data Exchange (ETDEWEB)

    Sanz, D.; Guido-Lavalle, G.; Carrica, P. [Centro Atomico Bariloche and Instituto Balseiro (Argentina)] [and others]

    1995-09-01

    Elements of the probabilistic geometry are used to derive the bubble coalescence term of the statistical description of gas-liquid flows. It is shown that Boltzmann's hypothesis, which leads to the kinetic theory of dilute gases, is not appropriate for this kind of flow. The resulting integro-differential transport equation is numerically integrated to study the flow development in slender bubble columns. The solution remarkably predicts the transition from bubbly to slug flow pattern. Moreover, a bubbly bimodal size distribution is predicted, which has already been observed experimentally.

  17. Using Microsoft Excel[R] to Calculate Descriptive Statistics and Create Graphs

    Science.gov (United States)

    Carr, Nathan T.

    2008-01-01

    Descriptive statistics and appropriate visual representations of scores are important for all test developers, whether they are experienced testers working on large-scale projects, or novices working on small-scale local tests. Many teachers put in charge of testing projects do not know "why" they are important, however, and are utterly convinced…

  18. Descriptive Statistics: Reporting the Answers to the 5 Basic Questions of Who, What, Why, When, Where, and a Sixth, So What?

    Science.gov (United States)

    Vetter, Thomas R

    2017-11-01

    Descriptive statistics are specific methods basically used to calculate, describe, and summarize collected research data in a logical, meaningful, and efficient way. Descriptive statistics are reported numerically in the manuscript text and/or in its tables, or graphically in its figures. This basic statistical tutorial discusses a series of fundamental concepts about descriptive statistics and their reporting. The mean, median, and mode are 3 measures of the center or central tendency of a set of data. In addition to a measure of its central tendency (mean, median, or mode), another important characteristic of a research data set is its variability or dispersion (ie, spread). In simplest terms, variability is how much the individual recorded scores or observed values differ from one another. The range, standard deviation, and interquartile range are 3 measures of variability or dispersion. The standard deviation is typically reported for a mean, and the interquartile range for a median. Testing for statistical significance, along with calculating the observed treatment effect (or the strength of the association between an exposure and an outcome), and generating a corresponding confidence interval are 3 tools commonly used by researchers (and their collaborating biostatistician or epidemiologist) to validly make inferences and more generalized conclusions from their collected data and descriptive statistics. A number of journals, including Anesthesia & Analgesia, strongly encourage or require the reporting of pertinent confidence intervals. A confidence interval can be calculated for virtually any variable or outcome measure in an experimental, quasi-experimental, or observational research study design. Generally speaking, in a clinical trial, the confidence interval is the range of values within which the true treatment effect in the population likely resides. In an observational study, the confidence interval is the range of values within which the true strength
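
    To make the reporting conventions above concrete, the following sketch computes a mean with its standard deviation and a two-sided 95% confidence interval based on the t distribution; the scores are invented, and SciPy is assumed only for the t quantile.

      import statistics
      from scipy import stats

      # Hypothetical recorded scores from one study arm
      scores = [23, 27, 31, 25, 29, 35, 22, 28, 30, 26]

      n = len(scores)
      mean = statistics.mean(scores)
      sd = statistics.stdev(scores)
      sem = sd / n ** 0.5                       # standard error of the mean
      t_crit = stats.t.ppf(0.975, df=n - 1)     # two-sided 95% critical value
      low, high = mean - t_crit * sem, mean + t_crit * sem

      print(f"mean={mean:.1f}, SD={sd:.1f}, 95% CI=({low:.1f}, {high:.1f})")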

  19. Fundamental Statistical Descriptions of Plasma Turbulence in Magnetic Fields

    Energy Technology Data Exchange (ETDEWEB)

    John A. Krommes

    2001-02-16

    A pedagogical review of the historical development and current status (as of early 2000) of systematic statistical theories of plasma turbulence is undertaken. Emphasis is on conceptual foundations and methodology, not practical applications. Particular attention is paid to equations and formalism appropriate to strongly magnetized, fully ionized plasmas. Extensive reference to the literature on neutral-fluid turbulence is made, but the unique properties and problems of plasmas are emphasized throughout. Discussions are given of quasilinear theory, weak-turbulence theory, resonance-broadening theory, and the clump algorithm. Those are developed independently, then shown to be special cases of the direct-interaction approximation (DIA), which provides a central focus for the article. Various methods of renormalized perturbation theory are described, then unified with the aid of the generating-functional formalism of Martin, Siggia, and Rose. A general expression for the renormalized dielectric function is deduced and discussed in detail. Modern approaches such as decimation and PDF methods are described. Derivations of DIA-based Markovian closures are discussed. The eddy-damped quasinormal Markovian closure is shown to be nonrealizable in the presence of waves, and a new realizable Markovian closure is presented. The test-field model and a realizable modification thereof are also summarized. Numerical solutions of various closures for some plasma-physics paradigms are reviewed. The variational approach to bounds on transport is developed. Miscellaneous topics include Onsager symmetries for turbulence, the interpretation of entropy balances for both kinetic and fluid descriptions, self-organized criticality, statistical interactions between disparate scales, and the roles of both mean and random shear. Appendices are provided on Fourier transform conventions, dimensional and scaling analysis, the derivations of nonlinear gyrokinetic and gyrofluid equations

  20. Fundamental Statistical Descriptions of Plasma Turbulence in Magnetic Fields

    International Nuclear Information System (INIS)

    Krommes, John A.

    2001-01-01

    A pedagogical review of the historical development and current status (as of early 2000) of systematic statistical theories of plasma turbulence is undertaken. Emphasis is on conceptual foundations and methodology, not practical applications. Particular attention is paid to equations and formalism appropriate to strongly magnetized, fully ionized plasmas. Extensive reference to the literature on neutral-fluid turbulence is made, but the unique properties and problems of plasmas are emphasized throughout. Discussions are given of quasilinear theory, weak-turbulence theory, resonance-broadening theory, and the clump algorithm. Those are developed independently, then shown to be special cases of the direct-interaction approximation (DIA), which provides a central focus for the article. Various methods of renormalized perturbation theory are described, then unified with the aid of the generating-functional formalism of Martin, Siggia, and Rose. A general expression for the renormalized dielectric function is deduced and discussed in detail. Modern approaches such as decimation and PDF methods are described. Derivations of DIA-based Markovian closures are discussed. The eddy-damped quasinormal Markovian closure is shown to be nonrealizable in the presence of waves, and a new realizable Markovian closure is presented. The test-field model and a realizable modification thereof are also summarized. Numerical solutions of various closures for some plasma-physics paradigms are reviewed. The variational approach to bounds on transport is developed. Miscellaneous topics include Onsager symmetries for turbulence, the interpretation of entropy balances for both kinetic and fluid descriptions, self-organized criticality, statistical interactions between disparate scales, and the roles of both mean and random shear. Appendices are provided on Fourier transform conventions, dimensional and scaling analysis, the derivations of nonlinear gyrokinetic and gyrofluid equations

  1. Statistics with JMP graphs, descriptive statistics and probability

    CERN Document Server

    Goos, Peter

    2015-01-01

    Peter Goos, Department of Statistics, University of Leuven, Faculty of Bio-Science Engineering and University of Antwerp, Faculty of Applied Economics, Belgium; David Meintrup, Department of Mathematics and Statistics, University of Applied Sciences Ingolstadt, Faculty of Mechanical Engineering, Germany. Thorough presentation of introductory statistics and probability theory, with numerous examples and applications using JMP. Descriptive Statistics and Probability provides an accessible and thorough overview of the most important descriptive statistics for nominal, ordinal and quantitative data with partic...

  2. Fermi–Dirac Statistics

    Indian Academy of Sciences (India)

    IAS Admin

    Pauli exclusion principle, Fermi–Dirac statistics, identical and indistinguishable particles, Fermi gas. Fermi–Dirac Statistics: Derivation and Consequences. S Chaturvedi and Shyamal Biswas. Subhash Chaturvedi is at the University of Hyderabad. His current research interests include phase space descriptions.

  3. From Matched Spatial Filtering towards the Fused Statistical Descriptive Regularization Method for Enhanced Radar Imaging

    Directory of Open Access Journals (Sweden)

    Shkvarko Yuriy

    2006-01-01

    We address a new approach to solve the ill-posed nonlinear inverse problem of high-resolution numerical reconstruction of the spatial spectrum pattern (SSP) of the backscattered wavefield sources distributed over the remotely sensed scene. An array or synthesized array radar (SAR) that employs digital data signal processing is considered. By exploiting the idea of combining the statistical minimum risk estimation paradigm with numerical descriptive regularization techniques, we address a new fused statistical descriptive regularization (SDR) strategy for enhanced radar imaging. Pursuing such an approach, we establish a family of SDR-related SSP estimators that encompass a manifold of existing beamforming techniques ranging from the traditional matched filter to robust and adaptive spatial filtering, and minimum variance methods.

  4. The usefulness of descriptive statistics in the interpretation of data on occupational physical activity of Poles

    Directory of Open Access Journals (Sweden)

    Elżbieta Biernat

    2014-12-01

    Background: The aim of this paper is to assess whether basic descriptive statistics is sufficient to interpret the data on physical activity of Poles within the occupational domain of life. Material and Methods: The study group consisted of 964 randomly selected Polish working professionals. The long version of the International Physical Activity Questionnaire (IPAQ) was used. Descriptive statistics included characteristics of variables using: mean (M), median (Me), maximal and minimal values (max–min), standard deviation (SD) and percentile values. Statistical inference was based on the comparison of variables with the significance level of 0.05 (Kruskal-Wallis and Pearson's chi-square tests). Results: Occupational physical activity (OPA) was declared by 46.4% of respondents (vigorous – 23.5%, moderate – 30.2%, walking – 39.5%). The total OPA amounted to 2751.1 MET-min/week (Metabolic Equivalent of Task) with a very high standard deviation (SD = 5302.8) and max = 35 511 MET-min/week. It concerned different types of activities. Approximately 10% (90th percentile) overstated the average. However, there was no significant difference depending on the character of the profession, or the type of activity. The average time of sitting was 256 min/day. As many as 39% of the respondents met the World Health Organization standards only due to OPA (42.5% of white-collar workers, 38% of administrative and technical employees and only 37.9% of physical workers). Conclusions: In the data analysis it is necessary to define quantiles to provide a fuller picture of the distributions of OPA in MET-min/week. It is also crucial to update the guidelines for data processing and analysis of the long version of IPAQ. It seems that 16 h of activity/day is not a sufficient criterion for excluding the results from further analysis. Med Pr 2014;65(6):743–753
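
    The conclusion that quantiles are needed for such heavily skewed activity data can be illustrated with a toy example; the MET-min/week values below are invented, not taken from the IPAQ sample.

      import statistics

      # Hypothetical, strongly right-skewed occupational physical activity (MET-min/week)
      opa = [0, 0, 0, 240, 480, 693, 990, 1386, 2772, 33000]

      mean = statistics.mean(opa)
      median = statistics.median(opa)
      deciles = statistics.quantiles(opa, n=10)   # 10th, 20th, ..., 90th percentiles

      # With heavy skew the mean sits far above the median, so percentiles
      # describe the distribution much better than the mean alone.
      print(f"mean={mean:.0f}, median={median:.0f}, 90th percentile={deciles[-1]:.0f}")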

  5. Statistics in the pharmacy literature.

    Science.gov (United States)

    Lee, Charlene M; Soin, Herpreet K; Einarson, Thomas R

    2004-09-01

    Research in statistical methods is essential for maintenance of high quality of the published literature. To update previous reports of the types and frequencies of statistical terms and procedures in research studies of selected professional pharmacy journals. We obtained all research articles published in 2001 in 6 journals: American Journal of Health-System Pharmacy, The Annals of Pharmacotherapy, Canadian Journal of Hospital Pharmacy, Formulary, Hospital Pharmacy, and Journal of the American Pharmaceutical Association. Two independent reviewers identified and recorded descriptive and inferential statistical terms/procedures found in the methods, results, and discussion sections of each article. Results were determined by tallying the total number of times, as well as the percentage, that each statistical term or procedure appeared in the articles. One hundred forty-four articles were included. Ninety-eight percent employed descriptive statistics; of these, 28% used only descriptive statistics. The most common descriptive statistical terms were percentage (90%), mean (74%), standard deviation (58%), and range (46%). Sixty-nine percent of the articles used inferential statistics, the most frequent being chi-square (33%), Student's t-test (26%), Pearson's correlation coefficient r (18%), ANOVA (14%), and logistic regression (11%). Statistical terms and procedures were found in nearly all of the research articles published in pharmacy journals. Thus, pharmacy education should aim to provide current and future pharmacists with an understanding of the common statistical terms and procedures identified to facilitate the appropriate appraisal and consequential utilization of the information available in research articles.

  6. The CSB Incident Screening Database: description, summary statistics and uses.

    Science.gov (United States)

    Gomez, Manuel R; Casper, Susan; Smith, E Allen

    2008-11-15

    This paper briefly describes the Chemical Incident Screening Database currently used by the CSB to identify and evaluate chemical incidents for possible investigations, and summarizes descriptive statistics from this database that can potentially help to estimate the number, character, and consequences of chemical incidents in the US. The report compares some of the information in the CSB database to roughly similar information available from databases operated by EPA and the Agency for Toxic Substances and Disease Registry (ATSDR), and explores the possible implications of these comparisons with regard to the dimension of the chemical incident problem. Finally, the report explores in a preliminary way whether a system modeled after the existing CSB screening database could be developed to serve as a national surveillance tool for chemical incidents.

  7. Proposal to Include Electrical Energy in the Industrial Return Statistics

    CERN Document Server

    2003-01-01

    At its 108th session on the 20 June 1997, the Council approved the Report of the Finance Committee Working Group on the Review of CERN Purchasing Policy and Procedures. Among other topics, the report recommended the inclusion of utility supplies in the calculation of the return statistics as soon as the relevant markets were deregulated, without reaching a consensus on the exact method of calculation. At its 296th meeting on the 18 June 2003, the Finance Committee approved a proposal to award a contract for the supply of electrical energy (CERN/FC/4693). The purpose of the proposal in this document is to clarify the way electrical energy will be included in future calculations of the return statistics. The Finance Committee is invited: 1. to agree that the full cost to CERN of electrical energy (excluding the cost of transport) be included in the Industrial Service return statistics; 2. to recommend that the Council approves the corresponding amendment to the Financial Rules set out in section 2 of this docum...

  8. Statistical Tutorial | Center for Cancer Research

    Science.gov (United States)

    Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data.  ST is designed as a follow up to Statistical Analysis of Research Data (SARD) held in April 2018.  The tutorial will apply the general principles of statistical analysis of research data including descriptive statistics, z- and t-tests of means and mean

  9. Measurement and statistics for teachers

    CERN Document Server

    Van Blerkom, Malcolm

    2008-01-01

    Written in a student-friendly style, Measurement and Statistics for Teachers shows teachers how to use measurement and statistics wisely in their classes. Although there is some discussion of theory, emphasis is given to the practical, everyday uses of measurement and statistics. The second part of the text provides more complete coverage of basic descriptive statistics and their use in the classroom than in any text now available.Comprehensive and accessible, Measurement and Statistics for Teachers includes:Short vignettes showing concepts in action Numerous classroom examples Highlighted vocabulary Boxes summarizing related concepts End-of-chapter exercises and problems Six full chapters devoted to the essential topic of Classroom Tests Instruction on how to carry out informal assessments, performance assessments, and portfolio assessments, and how to use and interpret standardized tests A five-chapter section on Descriptive Statistics, giving instructors the option of more thoroughly teaching basic measur...

  10. Back to basics: an introduction to statistics.

    Science.gov (United States)

    Halfens, R J G; Meijers, J M M

    2013-05-01

    In the second in the series, Professor Ruud Halfens and Dr Judith Meijers give an overview of statistics, both descriptive and inferential. They describe the first principles of statistics, including some relevant inferential tests.

  11. Statistical modeling in phenomenological description of electromagnetic cascade processes produced by high-energy gamma quanta

    International Nuclear Information System (INIS)

    Slowinski, B.

    1987-01-01

    A description of a simple phenomenological model of electromagnetic cascade process (ECP) initiated by high-energy gamma quanta in heavy absorbents is given. Within this model spatial structure and fluctuations of ionization losses of shower electrons and positrons are described. Concrete formulae have been obtained as a result of statistical analysis of experimental data from the xenon bubble chamber of ITEP (Moscow)

  12. Absorbed impact energy and mode of fracture: A statistical description of the micro-structural dispersion

    Energy Technology Data Exchange (ETDEWEB)

    Pontikis, V., E-mail: Vassilis.Pontikis@cea.f [Commissariat a l' Energie Atomique, IRAMIS, Laboratoire des Solides Irradies, CNRS UMR 7642, Ecole Polytechnique, 91191 Gif sur Yvette Cedex (France)]; Gorse, D. [Commissariat a l' Energie Atomique, IRAMIS, Laboratoire des Solides Irradies, CNRS UMR 7642, Ecole Polytechnique, 91191 Gif sur Yvette Cedex (France)]

    2009-10-01

    A statistical model is proposed to account for the influence of the dispersion of the microstructure on the ductile-to-brittle transition in body centered cubic (bcc) metals and their alloys. In this model, the dispersion of the microstructure is expressed via a normal distribution of transition temperatures whereas a simple relation exists between the values of absorbed, lower and upper shelf energies, the ductile area fraction and the distribution parameters. It is shown that via an appropriate renormalization of energies and temperatures, experimental data for different materials and ageing conditions align all together on a master curve, guaranteeing thereby the effectiveness of the proposed statistical description.
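
    The abstract does not spell out the relation it uses; one common form consistent with the description above (stated here as an assumption, not as the authors' exact model) writes the absorbed impact energy E at temperature T in terms of the lower and upper shelf energies and the ductile area fraction f_d, with the microstructural dispersion entering through a normal distribution of transition temperatures:

      E(T) = E_{\mathrm{LS}} + \bigl(E_{\mathrm{US}} - E_{\mathrm{LS}}\bigr)\, f_d(T), \qquad f_d(T) = \Phi\!\left(\frac{T - T_0}{\sigma}\right)

    where \Phi is the standard normal cumulative distribution function, T_0 the mean transition temperature and \sigma the width of the dispersion; renormalizing energies by the shelf values and temperatures by (T - T_0)/\sigma would then collapse data from different materials onto a single master curve, as described in the abstract.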

  13. Absorbed impact energy and mode of fracture: A statistical description of the micro-structural dispersion

    International Nuclear Information System (INIS)

    Pontikis, V.; Gorse, D.

    2009-01-01

    A statistical model is proposed to account for the influence of the dispersion of the microstructure on the ductile-to-brittle transition in body centered cubic (bcc) metals and their alloys. In this model, the dispersion of the microstructure is expressed via a normal distribution of transition temperatures whereas a simple relation exists between the values of absorbed, lower and upper shelf energies, the ductile area fraction and the distribution parameters. It is shown that via an appropriate renormalization of energies and temperatures, experimental data for different materials and ageing conditions align all together on a master curve, guaranteeing thereby the effectiveness of the proposed statistical description.

  14. Using Carbon Emissions Data to "Heat Up" Descriptive Statistics

    Science.gov (United States)

    Brooks, Robert

    2012-01-01

    This article illustrates using carbon emissions data in an introductory statistics assignment. The carbon emissions data has desirable characteristics including: choice of measure; skewness; and outliers. These complexities allow research and public policy debate to be introduced. (Contains 4 figures and 2 tables.)

  15. Microscopic description of production cross sections including deexcitation effects

    Science.gov (United States)

    Sekizawa, Kazuyuki

    2017-07-01

    Background: At the forefront of nuclear science, production of new neutron-rich isotopes is continuously pursued at accelerator laboratories all over the world. To explore the currently unknown territories in the nuclear chart far away from stability, reliable theoretical predictions are inevitable. Purpose: To provide a reliable prediction of production cross sections taking into account secondary deexcitation processes, both particle evaporation and fission, a new method called TDHF+GEMINI is proposed, which combines the microscopic time-dependent Hartree-Fock (TDHF) theory with a sophisticated statistical compound-nucleus deexcitation model, GEMINI++. Methods: Low-energy heavy ion reactions are described based on three-dimensional Skyrme-TDHF calculations. Using the particle-number projection method, production probabilities, total angular momenta, and excitation energies of primary reaction products are extracted from the TDHF wave function after collision. Production cross sections for secondary reaction products are evaluated employing GEMINI++. Results are compared with available experimental data and widely used grazing calculations. Results: The method is applied to describe cross sections for multinucleon transfer processes in 40Ca+124Sn (Ec.m. ≃ 128.54 MeV), 48Ca+124Sn (Ec.m. ≃ 125.44 MeV), 40Ca+208Pb (Ec.m. ≃ 208.84 MeV), 58Ni+208Pb (Ec.m. ≃ 256.79 MeV), 64Ni+238U (Ec.m. ≃ 307.35 MeV), and 136Xe+198Pt (Ec.m. ≃ 644.98 MeV) reactions at energies close to the Coulomb barrier. It is shown that the inclusion of secondary deexcitation processes, which are dominated by neutron evaporation in the present systems, substantially improves agreement with the experimental data. The magnitude of the evaporation effects is very similar to the one observed in grazing calculations. TDHF+GEMINI provides a better description of the absolute value of the cross sections for channels involving transfer of more than one proton, compared to the grazing

  16. A Note on Unified Statistics Including Fermi-Dirac, Bose-Einstein, and Tsallis Statistics, and Plausible Extension to Anisotropic Effect

    Directory of Open Access Journals (Sweden)

    Christianto V.

    2007-04-01

    In the light of some recent hypotheses suggesting plausible unification of thermostatistics where Fermi-Dirac, Bose-Einstein and Tsallis statistics become its special subsets, we consider further plausible extension to include non-integer Hausdorff dimension, which becomes realization of fractal entropy concept. In the subsequent section, we also discuss plausible extension of this unified statistics to include anisotropic effect by using quaternion oscillator, which may be observed in the context of Cosmic Microwave Background Radiation. Further observation is of course recommended in order to refute or verify this proposition.
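
    For orientation only (this formula is standard and is not quoted from the abstract): the Tsallis entropy that underlies such generalized, non-extensive statistics is

      S_q = k \,\frac{1 - \sum_i p_i^{\,q}}{q - 1},

    which recovers the Boltzmann-Gibbs entropy S = -k \sum_i p_i \ln p_i in the limit q \to 1, so the usual statistics indeed appear as a special subset of the generalized family discussed above.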

  17. Descriptive data analysis.

    Science.gov (United States)

    Thompson, Cheryl Bagley

    2009-01-01

    This 13th article of the Basics of Research series is first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.

  18. Statistical description of turbulent dispersion

    NARCIS (Netherlands)

    Brouwers, J.J.H.

    2012-01-01

    We derive a comprehensive statistical model for dispersion of passive or almost passive admixture particles such as fine particulate matter, aerosols, smoke and fumes, in turbulent flow. The model rests on the Markov limit for particle velocity. It is in accordance with the asymptotic structure of

  19. Statistical physics including applications to condensed matter

    CERN Document Server

    Hermann, Claudine

    2005-01-01

    Statistical Physics bridges the properties of a macroscopic system and the microscopic behavior of its constituting particles, otherwise impossible due to the giant magnitude of Avogadro's number. Numerous systems of today's key technologies -- as e.g. semiconductors or lasers -- are macroscopic quantum objects; only statistical physics allows for understanding their fundamentals. Therefore, this graduate text also focuses on particular applications such as the properties of electrons in solids with applications, and radiation thermodynamics and the greenhouse effect.

  20. Statistics 101 for Radiologists.

    Science.gov (United States)

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
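
    As a concrete companion to the diagnostic-test measures listed above, the sketch below computes sensitivity, specificity, accuracy and the positive likelihood ratio from confusion-matrix counts; the counts are invented for illustration.

      # Hypothetical confusion-matrix counts for a diagnostic test
      tp, fp, fn, tn = 45, 10, 5, 140

      sensitivity = tp / (tp + fn)               # true-positive rate
      specificity = tn / (tn + fp)               # true-negative rate
      accuracy = (tp + tn) / (tp + fp + fn + tn)
      lr_pos = sensitivity / (1 - specificity)   # positive likelihood ratio

      print(f"Se={sensitivity:.2f}  Sp={specificity:.2f}  Acc={accuracy:.2f}  LR+={lr_pos:.2f}")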

  1. Fast Quantum Algorithm for Predicting Descriptive Statistics of Stochastic Processes

    Science.gov (United States)

    Williams Colin P.

    1999-01-01

    Stochastic processes are used as a modeling tool in several sub-fields of physics, biology, and finance. Analytic understanding of the long term behavior of such processes is only tractable for very simple types of stochastic processes such as Markovian processes. However, in real world applications more complex stochastic processes often arise. In physics, the complicating factor might be nonlinearities; in biology it might be memory effects; and in finance it might be the non-random intentional behavior of participants in a market. In the absence of analytic insight, one is forced to understand these more complex stochastic processes via numerical simulation techniques. In this paper we present a quantum algorithm for performing such simulations. In particular, we show how a quantum algorithm can predict arbitrary descriptive statistics (moments) of N-step stochastic processes in just O(√N) time. That is, the quantum complexity is the square root of the classical complexity for performing such simulations. This is a significant speedup in comparison to the current state of the art.

  2. Statistical methods in radiation physics

    CERN Document Server

    Turner, James E; Bogard, James S

    2012-01-01

    This statistics textbook, with particular emphasis on radiation protection and dosimetry, deals with statistical solutions to problems inherent in health physics measurements and decision making. The authors begin with a description of our current understanding of the statistical nature of physical processes at the atomic level, including radioactive decay and interactions of radiation with matter. Examples are taken from problems encountered in health physics, and the material is presented such that health physicists and most other nuclear professionals will more readily understand the application of statistical principles in the familiar context of the examples. Problems are presented at the end of each chapter, with solutions to selected problems provided online. In addition, numerous worked examples are included throughout the text.

  3. Fish: A New Computer Program for Friendly Introductory Statistics Help

    Science.gov (United States)

    Brooks, Gordon P.; Raffle, Holly

    2005-01-01

    All introductory statistics students must master certain basic descriptive statistics, including means, standard deviations and correlations. Students must also gain insight into such complex concepts as the central limit theorem and standard error. This article introduces and describes the Friendly Introductory Statistics Help (FISH) computer…

  4. Levy's distributions for statistical description of fractal structures; discontinuous metal films on dielectric substrates

    International Nuclear Information System (INIS)

    Dobierzewska-Mozrzymas, E.; Szymczak, G.; Bieganski, P.; Pieciul, E.

    2003-01-01

    The ranges of statistical description of the systems may be determined on the basis of the inverse power law of Mandelbrot. The slope of the straight line representing the power law in a double-logarithmic plot, determined as -1/μ (μ being a critical exponent), characterizes the distribution of elements in the system. In this paper, the inverse power law is used to describe the statistical distribution of discontinuous metal films with higher coverage coefficients (near the percolation threshold). For these films the critical exponent μ∼1, and both the mean value and the variance are infinite. Objects with such a microstructure are described by Levy distributions: the Cauchy, inverse Gauss and inverse gamma distributions, respectively. The experimental histograms are compared with the calculated ones. Inhomogeneous metal films were obtained experimentally, and their microstructures were examined by means of an electron microscope. On the basis of electron micrographs, the fractal dimensions were determined for metal films with coverage coefficients ranging from 0.35 to 1.00.
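
    The slope convention described above can be illustrated with a small rank-size fit; the island sizes below are invented, and NumPy's polyfit is used simply as a generic least-squares line fit.

      import numpy as np

      # Hypothetical island sizes from a micrograph, sorted in decreasing order
      sizes = np.array([510.0, 340.0, 250.0, 200.0, 160.0, 140.0, 120.0, 105.0, 95.0, 85.0])
      rank = np.arange(1, sizes.size + 1)

      # Straight-line fit in the double-logarithmic (rank-size) plot
      slope, intercept = np.polyfit(np.log(rank), np.log(sizes), 1)
      mu = -1.0 / slope     # critical exponent, using the slope = -1/mu convention above

      # mu close to 1 indicates a Levy-type distribution whose mean and variance diverge
      print(f"log-log slope = {slope:.2f}, critical exponent mu = {mu:.2f}")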

  5. Business statistics I essentials

    CERN Document Server

    Clark, Louise

    2014-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Business Statistics I includes descriptive statistics, introduction to probability, probability distributions, sampling and sampling distributions, interval estimation, and hypothesis t

  6. Australian energy statistics - Australian energy update 2005

    Energy Technology Data Exchange (ETDEWEB)

    Donaldson, K.

    2005-06-15

    ABARE's energy statistics include comprehensive coverage of Australian energy consumption, by state, by industry and by fuel. Australian Energy Update 2005 provides an overview of recent trends and description of the full coverage of the dataset. There are 14 Australian energy statistical tables available as free downloads (product codes 13172 to 13185).

  7. Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference

    Energy Technology Data Exchange (ETDEWEB)

    Beggs, W.J.

    1981-02-01

    This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.

  8. Nuclear matter descriptions including quark structure of the hadrons

    International Nuclear Information System (INIS)

    Huguet, R.

    2008-07-01

    It is nowadays well established that nucleons are composite objects made of quarks and gluons, whose interactions are described by Quantum chromodynamics (QCD). However, because of the non-perturbative character of QCD at the energies of nuclear physics, a description of atomic nuclei starting from quarks and gluons is still not available. A possible alternative is to construct effective field theories based on hadronic degrees of freedom, in which the interaction is constrained by QCD. In this framework, we have constructed descriptions of infinite nuclear matter in relativistic mean field theories taking into account the quark structure of hadrons. In a first approach, the in medium modifications of mesons properties is dynamically obtained in a Nambu-Jona-Lasinio (NJL) quark model. This modification is taken into account in a relativistic mean field theory based on a meson exchange interaction between nucleons. The in-medium modification of mesons masses and the properties of infinite nuclear matter have been studied. In a second approach, the long and short range contributions to the in-medium modification of the nucleon are determined. The short range part is obtained in a NJL quark model of the nucleon. The long range part, related to pions exchanges between nucleons, has been determined in the framework of Chiral Perturbation theory. These modifications have been used to constrain the couplings of a point coupling relativistic mean field model. A realistic description of the saturation properties of nuclear matter is obtained. (author)

  9. Mathematics and Statistics Research Department progress report for period ending June 30, 1977

    International Nuclear Information System (INIS)

    Lever, W.E.; Shepherd, D.E.; Ward, R.C.; Wilson, D.G.

    1977-09-01

    Brief descriptions are given of work done in mathematical and statistical research (moving-boundary problems; numerical analysis; continuum mechanics; matrices and other operators; experiment design; statistical testing; multivariate, multipopulation classification; statistical estimation) and statistical and mathematical collaboration (analytical chemistry, biological research, chemistry and physics research, energy research, engineering technology research, environmental sciences research, health physics research, materials research, sampling inspection and quality control, uranium resource evaluation research). Most of the descriptions are a page or less in length. Educational activities, publications, seminar titles, etc., are also included

  10. Academic dishonesty among nursing students: a descriptive study.

    Science.gov (United States)

    Keçeci, Ayla; Bulduk, Serap; Oruç, Deniz; Çelik, Serpil

    2011-09-01

    This descriptive and cross-sectional study aims to evaluate academic dishonesty among university nursing students in Turkey. The study's sample included 196 students. Two instruments were used for gathering data. The first was a questionnaire covering socio-demographic variables (age, class, gender, education, family structure, parents' attitude and educators' attitude). The second was the Academic Dishonesty Tendency Scale developed by Eminoğlu and Nartgün. The data were analyzed using descriptive statistics and the Kruskal-Wallis, one-way ANOVA, t-test and Mann-Whitney U tests. It was found that academic dishonesty was at a medium level (2.60-3.39) in nursing students.

  11. A Nineteenth Century Statistical Society that Abandoned Statistics

    NARCIS (Netherlands)

    Stamhuis, I.H.

    2007-01-01

    In 1857, a Statistical Society was founded in the Netherlands. Within this society, statistics was considered a systematic, quantitative, and qualitative description of society. In the course of time, the society attracted a wide and diverse membership, although the number of physicians on its rolls

  12. Generalized quantum statistics

    International Nuclear Information System (INIS)

    Chou, C.

    1992-01-01

    In the paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics

  13. A statistical description of the types and severities of accidents involving tractor semi-trailers

    International Nuclear Information System (INIS)

    Clauss, D.B.; Wilson, R.K.; Blower, D.F.; Campbell, K.L.

    1994-06-01

    This report provides a statistical description of the types and severities of tractor semi-trailer accidents involving at least one fatality. The data were developed for use in risk assessments of hazardous materials transportation. Several accident databases were reviewed to determine their suitability to the task. The TIFA (Trucks Involved in Fatal Accidents) database created at the University of Michigan Transportation Research Institute was extensively utilized. Supplementary data on collision and fire severity, which was not available in the TIFA database, were obtained by reviewing police reports for selected TIFA accidents. The results are described in terms of frequencies of different accident types and cumulative distribution functions for the peak contact velocity, rollover skid distance, fire temperature, fire size, fire separation, and fire duration

  14. Statistical Methods for Fuzzy Data

    CERN Document Server

    Viertl, Reinhard

    2011-01-01

    Statistical data are not always precise numbers, or vectors, or categories. Real data are frequently what is called fuzzy. Examples where this fuzziness is obvious are quality of life data, environmental, biological, medical, sociological and economics data. Also the results of measurements can be best described by using fuzzy numbers and fuzzy vectors respectively. Statistical analysis methods have to be adapted for the analysis of fuzzy data. In this book, the foundations of the description of fuzzy data are explained, including methods on how to obtain the characterizing function of fuzzy m

  15. Statistical Description of Segregation in a Powder Mixture

    DEFF Research Database (Denmark)

    Chapiro, Alexander; Stenby, Erling Halfdan

    1996-01-01

    In this paper we apply the statistical mechanics of powders to describe a segregated state in a mixture of grains of different sizes. Variation of the density of a packing with depth arising due to changes of particle configurations is studied. The statistical mechanics of powders is generalized...

  16. A Fuzzy Modeling Approach for Replicated Response Measures Based on Fuzzification of Replications with Descriptive Statistics and Golden Ratio

    Directory of Open Access Journals (Sweden)

    Özlem TÜRKŞEN

    2018-03-01

    Some experimental designs are composed of replicated response measures in which the replications cannot be identified exactly and may carry uncertainty other than randomness. In that case, classical regression analysis may not be appropriate for modeling the designed data, because its probabilistic modeling assumptions are violated, and fuzzy regression analysis can be used as a modeling tool instead. In this study, the replicated response values are converted into fuzzy numbers using descriptive statistics of the replications and the golden ratio. The main aim of the study is to obtain the most suitable fuzzy model for replicated response measures through fuzzification of the replicated values, taking the data structure of the replications into account in a statistical framework. Here, the response and the unknown model coefficients are considered as triangular type-1 fuzzy numbers (TT1FNs), whereas the inputs are crisp. Predicted fuzzy models are obtained according to the proposed fuzzification rules by using the Fuzzy Least Squares (FLS) approach. The performances of the predicted fuzzy models are compared using the Root Mean Squared Error (RMSE) criterion. A data set from the literature, the wheel cover component data set, is used to illustrate the performance of the proposed approach, and the results are discussed. The calculations show that the combined formulation of the descriptive statistics and the golden ratio is the most preferable fuzzification rule for this data set according to the well-known decision-making method TOPSIS.
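
    The record does not spell out the authors' exact fuzzification formulas, so the Python sketch below only illustrates one plausible construction of a triangular type-1 fuzzy number from replicated responses: the centre is assumed to be the mean of the replications and the spread is assumed to be the range scaled by the golden ratio. Both choices are illustrative assumptions, not the published rules.

        # Hypothetical sketch: a triangular fuzzy number built from replicated responses.
        # The centre/spread rules below are assumptions for illustration only.
        import numpy as np

        GOLDEN_RATIO = (1 + 5 ** 0.5) / 2  # about 1.618

        def fuzzify_replications(replications):
            """Return (left, centre, right) of a triangular fuzzy number for one design point."""
            reps = np.asarray(replications, dtype=float)
            centre = reps.mean()                               # descriptive statistic of the replications
            spread = (reps.max() - reps.min()) / GOLDEN_RATIO  # assumed golden-ratio-scaled spread
            return centre - spread, centre, centre + spread

        # Invented replicated responses at one experimental run
        print(fuzzify_replications([12.1, 12.9, 12.4]))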

  17. Statistical methods in personality assessment research.

    Science.gov (United States)

    Schinka, J A; LaLone, L; Broeckel, J A

    1997-06-01

    Emerging models of personality structure and advances in the measurement of personality and psychopathology suggest that research in personality and personality assessment has entered a stage of advanced development. In this article we examine whether researchers in these areas have taken advantage of new and evolving statistical procedures. We conducted a review of articles published in the Journal of Personality Assessment during the past 5 years. Of the 449 articles that included some form of data analysis, 12.7% used only descriptive statistics, most employed only univariate statistics, and fewer than 10% used multivariate methods of data analysis. We discuss the cost of using limited statistical methods, the possible reasons for the apparent reluctance to employ advanced statistical procedures, and potential solutions to this technical shortcoming.

  18. Statistical Analysis of Research Data | Center for Cancer Research

    Science.gov (United States)

    Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. The Statistical Analysis of Research Data (SARD) course will be held on April 5-6, 2018 from 9 a.m.-5 p.m. at the National Institutes of Health's Natcher Conference Center, Balcony C on the Bethesda Campus. SARD is designed to provide an overview of the general principles of statistical analysis of research data. The first day will feature univariate data analysis, including descriptive statistics, probability distributions, and one- and two-sample inferential statistics.

  19. Experimental observations of Lagrangian sand grain kinematics under bedload transport: statistical description of the step and rest regimes

    Science.gov (United States)

    Guala, M.; Liu, M.

    2017-12-01

    The kinematics of sediment particles is investigated by non-intrusive imaging methods to provide a statistical description of bedload transport in conditions near the threshold of motion. In particular, we focus on the cyclic transition between motion and rest regimes to quantify the waiting time statistics inferred to be responsible for anomalous diffusion, and so far elusive. Despite obvious limitations in the spatio-temporal domain of the observations, we are able to identify the probability distributions of the particle step time and length, velocity, acceleration, waiting time, and thus distinguish which quantities exhibit well converged mean values, based on the thickness of their respective tails. The experimental results shown here for four different transport conditions highlight the importance of the waiting time distribution and represent a benchmark dataset for the stochastic modeling of bedload transport.

  20. Quality of reporting statistics in two Indian pharmacology journals.

    Science.gov (United States)

    Jaykaran; Yadav, Preeti

    2011-04-01

    To evaluate the reporting of the statistical methods in articles published in two Indian pharmacology journals. All original articles published since 2002 were downloaded from the journals' (Indian Journal of Pharmacology (IJP) and Indian Journal of Physiology and Pharmacology (IJPP)) websites. These articles were evaluated on the basis of the appropriateness of descriptive and inferential statistics. Descriptive statistics was evaluated on the basis of the reporting of the method of description and central tendencies. Inferential statistics was evaluated on the basis of whether the assumptions of the statistical methods were fulfilled and whether the statistical tests were appropriate. Values are described as frequencies, percentages, and 95% confidence intervals (CI) around the percentages. Inappropriate descriptive statistics was observed in 150 (78.1%, 95% CI 71.7-83.3%) articles. The most common reason for this inappropriate descriptive statistics was the use of mean ± SEM in place of "mean (SD)" or "mean ± SD." The most common statistical method used was one-way ANOVA (58.4%). Information regarding checking of the assumptions of statistical tests was mentioned in only two articles. An inappropriate statistical test was observed in 61 (31.7%, 95% CI 25.6-38.6%) articles. The most common reason for an inappropriate statistical test was the use of a two-group test for three or more groups. Articles published in the two Indian pharmacology journals are not devoid of statistical errors.
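
    To make the two reporting issues above concrete (mean ± SD versus mean ± SEM, and a 95% confidence interval around a percentage), the short Python illustration below uses invented numbers and a simple normal-approximation interval; the journal article may well have used an exact or Wilson interval.

        # Illustrative numbers only: SD vs SEM, and a normal-approximation 95% CI for a percentage.
        import math
        import numpy as np

        values = np.array([5.1, 4.8, 6.0, 5.5, 4.9, 5.7])   # hypothetical measurements
        mean = values.mean()
        sd = values.std(ddof=1)              # spread of the data -> report as "mean (SD)"
        sem = sd / math.sqrt(len(values))    # precision of the mean, not the spread of the data
        print(f"mean={mean:.2f}, SD={sd:.2f}, SEM={sem:.2f}")

        # 95% CI around a percentage, e.g. 156 of 200 articles (counts invented)
        n, k = 200, 156
        p = k / n
        half_width = 1.96 * math.sqrt(p * (1 - p) / n)
        print(f"{100 * p:.1f}% (95% CI {100 * (p - half_width):.1f}-{100 * (p + half_width):.1f}%)")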

  1. Writing to Learn Statistics in an Advanced Placement Statistics Course

    Science.gov (United States)

    Northrup, Christian Glenn

    2012-01-01

    This study investigated the use of writing in a statistics classroom to learn if writing provided a rich description of problem-solving processes of students as they solved problems. Through analysis of 329 written samples provided by students, it was determined that writing provided a rich description of problem-solving processes and enabled…

  2. Statistical correlation of structural mode shapes from test measurements and NASTRAN analytical values

    Science.gov (United States)

    Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.

    1983-01-01

    The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical test results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of individual programs leading to a statistical correlation report, and a set of examples including complete listings of programs, and input and output data.
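
    The report's own mathematical description of statistical correlation is not reproduced in this record, so the Python sketch below only illustrates one common way to compare measured and analytical mode shapes: a normalized cross-correlation (modal-assurance-criterion style) matrix. Treat the normalization and the data as assumptions for illustration, not as the system's actual algorithm.

        # Hypothetical sketch: correlation matrix between measured and analytical mode shapes.
        # A MAC-style normalization is assumed for illustration.
        import numpy as np

        def mode_correlation(test_modes, analytical_modes):
            """Each entry compares one test mode with one analytical mode (values in [0, 1])."""
            corr = np.zeros((test_modes.shape[1], analytical_modes.shape[1]))
            for i in range(test_modes.shape[1]):
                for j in range(analytical_modes.shape[1]):
                    t, a = test_modes[:, i], analytical_modes[:, j]
                    corr[i, j] = (t @ a) ** 2 / ((t @ t) * (a @ a))
            return corr

        rng = np.random.default_rng(1)
        analytical = rng.normal(size=(20, 3))                    # 20 measurement points, 3 modes
        measured = analytical + 0.1 * rng.normal(size=(20, 3))   # noisy "test" shapes
        print(np.round(mode_correlation(measured, analytical), 2))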

  3. Brownian quasi-particles in statistical physics

    International Nuclear Information System (INIS)

    Tellez-Arenas, A.; Fronteau, J.; Combis, P.

    1979-01-01

    The idea of a Brownian quasi-particle and the associated differentiable flow (with nonselfadjoint forces) are used here in the context of a stochastic description of the approach towards statistical equilibrium. We show that this quasi-particle flow acquires, at equilibrium, the principal properties of a conservative Hamiltonian flow. Thus the model of Brownian quasi-particles permits us to establish a link between the stochastic description and the Gibbs description of statistical equilibrium

  4. The descriptive statistics for the input parameters in the new selective galena and sphalerite flotation in Sasa mine, Macedonia

    OpenAIRE

    Krstev, Boris; Golomeov, Blagoj; Krstev, Aleksandar; Vuckovski, Zoran; Vuckovski, Goce; Krstev, Dejan

    2011-01-01

    In this paper the descriptive statistics of the results obtained in the selective galena and sphalerite flotation at the Sasa mine, Macedonia are shown. The consumption of the flotation reagents, balls and rods grinding media in the flotation flowsheet, lead and zinc feed contents, lead and zinc concentrate contents, the appropriate recoveries of the mentioned minerals with estimation of the correlation for reagents regime, recoveries, contents in the lead and zinc feeds and concentrate...

  5. The Relationship between Test Anxiety and Academic Performance of Students in Vital Statistics Course

    Directory of Open Access Journals (Sweden)

    Shirin Iranfar

    2013-12-01

    Introduction: Test anxiety is a common phenomenon among students and is one of the problems of the educational system. The present study was conducted to investigate test anxiety in the vital statistics course and its association with the academic performance of students at Kermanshah University of Medical Sciences. This descriptive-analytical study included students from the nursing and midwifery, paramedicine and health faculties who had taken the vital statistics course; they were selected through the census method. The Sarason questionnaire was used to assess test anxiety. Data were analyzed by descriptive and inferential statistics. The findings indicated no significant correlation between test anxiety and the score in the vital statistics course.

  6. Statistical methods used in the public health literature and implications for training of public health professionals.

    Science.gov (United States)

    Hayat, Matthew J; Powell, Amanda; Johnson, Tessa; Cadwell, Betsy L

    2017-01-01

    Statistical literacy and knowledge are needed to read and understand the public health literature. The purpose of this study was to quantify the basic and advanced statistical methods used in public health research. We randomly sampled 216 published articles from seven top-tier general public health journals. Studies were reviewed by two readers and a standardized data collection form was completed for each article. Data were analyzed with descriptive statistics and frequency distributions. Results were summarized for the statistical methods used in the literature, including descriptive and inferential statistics, modeling, advanced statistical techniques, and the statistical software used. Approximately 81.9% of articles reported an observational study design and 93.1% of articles were substantively focused. Descriptive statistics in table or graphical form were reported in more than 95% of the articles, and statistical inference was reported in more than 76% of the studies reviewed. These results reveal the types of statistical methods currently used in the public health literature. Although this study did not obtain information on what should be taught, information on the statistical methods being used is useful for curriculum development in graduate health sciences education, as well as for making informed decisions about continuing education for public health professionals.
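
    As a hedged illustration of the kind of tallying described above (frequency distributions of the methods recorded across a sample of articles), the short Python sketch below counts method labels from invented review-form entries; the labels and counts are placeholders, not the study's data.

        # Hypothetical sketch: tallying statistical methods recorded on article review forms.
        from collections import Counter

        methods_per_article = [                     # one entry per reviewed article (invented labels)
            ["descriptive", "t-test"],
            ["descriptive", "logistic regression"],
            ["descriptive"],
            ["descriptive", "chi-square", "linear regression"],
        ]

        counts = Counter(m for article in methods_per_article for m in article)
        n_articles = len(methods_per_article)
        for method, count in counts.most_common():
            print(f"{method}: {count} articles ({100 * count / n_articles:.0f}%)")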

  7. Quantum mechanics from classical statistics

    International Nuclear Information System (INIS)

    Wetterich, C.

    2010-01-01

    Quantum mechanics can emerge from classical statistics. A typical quantum system describes an isolated subsystem of a classical statistical ensemble with infinitely many classical states. The state of this subsystem can be characterized by only a few probabilistic observables. Their expectation values define a density matrix if they obey a 'purity constraint'. Then all the usual laws of quantum mechanics follow, including Heisenberg's uncertainty relation, entanglement and a violation of Bell's inequalities. No concepts beyond classical statistics are needed for quantum physics - the differences are only apparent and result from the particularities of those classical statistical systems which admit a quantum mechanical description. Born's rule for quantum mechanical probabilities follows from the probability concept for a classical statistical ensemble. In particular, we show how the non-commuting properties of quantum operators are associated to the use of conditional probabilities within the classical system, and how a unitary time evolution reflects the isolation of the subsystem. As an illustration, we discuss a classical statistical implementation of a quantum computer.

  8. ODM Data Analysis-A tool for the automatic validation, monitoring and generation of generic descriptive statistics of patient data.

    Science.gov (United States)

    Brix, Tobias Johannes; Bruland, Philipp; Sarfraz, Saad; Ernsting, Jan; Neuhaus, Philipp; Storck, Michael; Doods, Justin; Ständer, Sonja; Dugas, Martin

    2018-01-01

    A required step for presenting the results of clinical studies is the declaration of participants' demographic and baseline characteristics, as required by FDAAA 801. The common workflow to accomplish this task is to export the clinical data from the electronic data capture system in use and import it into statistical software such as SAS or IBM SPSS. This software requires trained users, who have to implement the analysis individually for each item. These expenditures may become an obstacle for small studies. The objective of this work is to design, implement and evaluate an open source application, called ODM Data Analysis, for the semi-automatic analysis of clinical study data. The system requires clinical data in the CDISC Operational Data Model format. After a file is uploaded, its syntax and the data-type conformity of the collected data are validated. The completeness of the study data is determined and basic statistics, including illustrative charts for each item, are generated. Datasets from four clinical studies have been used to evaluate the application's performance and functionality. The system is implemented as an open source web application (available at https://odmanalysis.uni-muenster.de) and also provided as a Docker image, which enables easy distribution and installation on local systems. Study data are stored in the application only while the calculations are performed, which is compliant with data protection requirements. Analysis times are below half an hour, even for larger studies with over 6000 subjects. Medical experts have confirmed the usefulness of this application for gaining an overview of their collected study data for monitoring purposes and for generating descriptive statistics without further user interaction. The semi-automatic analysis has its limitations and cannot replace the complex analysis of statisticians, but it can be used as a starting point for their examination and reporting.
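
    As a rough, hedged sketch of the workflow described above (read CDISC ODM data and produce per-item descriptive statistics), the Python fragment below parses a minimal ODM-style XML snippet. It follows the ODM convention of ItemData elements carrying ItemOID and Value attributes, but the snippet, the item names and the values are invented, namespaces are omitted, and this is not the ODM Data Analysis application's own code.

        # Hypothetical sketch: per-item descriptive statistics from a minimal ODM-style snippet.
        # Real ODM files carry namespaces and much richer structure; this XML is invented.
        import statistics
        import xml.etree.ElementTree as ET
        from collections import defaultdict

        odm_snippet = """
        <ODM>
          <ClinicalData>
            <SubjectData SubjectKey="001"><ItemData ItemOID="I.WEIGHT" Value="72.5"/></SubjectData>
            <SubjectData SubjectKey="002"><ItemData ItemOID="I.WEIGHT" Value="81.0"/></SubjectData>
            <SubjectData SubjectKey="003"><ItemData ItemOID="I.WEIGHT" Value="66.3"/></SubjectData>
          </ClinicalData>
        </ODM>
        """

        values_by_item = defaultdict(list)
        for item in ET.fromstring(odm_snippet).iter("ItemData"):
            values_by_item[item.get("ItemOID")].append(float(item.get("Value")))

        for oid, values in values_by_item.items():
            print(f"{oid}: n={len(values)}, mean={statistics.mean(values):.1f}, "
                  f"sd={statistics.stdev(values):.1f}")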

  9. Optimage central organised image quality control including statistics and reporting

    International Nuclear Information System (INIS)

    Jahnen, A.; Schilz, C.; Shannoun, F.; Schreiner, A.; Hermen, J.; Moll, C.

    2008-01-01

    Quality control of medical imaging systems is performed using dedicated phantoms. As imaging systems become increasingly digital, adequate image processing methods can help save evaluation time and yield objective results. The OPTIMAGE software package focuses on this with a centralised approach: on the one hand, OPTIMAGE provides a framework which includes functions like database integration, DICOM data sources, a multilingual user interface and image processing functionality. On the other hand, the test methods are implemented as modules which are able to process the images automatically for the common imaging systems. The integration of statistics and reporting into this environment is paramount: this is the only way to provide these functions in an interactive, user-friendly way. These features enable users to discover degradation in performance quickly and to document performed measurements easily. (authors)

  10. Mortality variation across Australia: descriptive data for states and territories, and statistical divisions.

    Science.gov (United States)

    Wilkinson, D; Hiller, J; Moss, J; Ryan, P; Worsley, T

    2000-06-01

    To describe variation in all-cause and selected cause-specific mortality rates across Australia. Mortality and population data for 1997 were obtained from the Australian Bureau of Statistics. All-cause and selected cause-specific mortality rates were calculated and directly standardised to the 1997 Australian population in 5-year age groups. Selected major causes of death included cancer, coronary artery disease, cerebrovascular disease, diabetes, accidents and suicide. Rates are reported by statistical division, and by State and Territory. All-cause age-standardised mortality was 6.98 per 1000 in 1997 and varied 2-fold from a low in the statistical division of Pilbara, Western Australia (5.78, 95% confidence interval 5.06-6.56), to a high in Northern Territory--excluding Darwin (11.30, 10.67-11.98). Similar mortality variation (all p < 0.05) was observed for the major killers. Larger variation (all p < 0.05) was observed for suicide (0.6-3.8 per 10,000). Less marked variation was observed when analysed by State and Territory, but the Northern Territory consistently had the highest age-standardised mortality rates. Analysed by statistical division, substantial mortality gradients exist across Australia, suggesting an inequitable distribution of the determinants of health. Further research is required to better understand this heterogeneity.
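
    Direct age standardisation, as used above, weights each area's age-specific rates by a standard population. The Python sketch below shows the arithmetic with invented age groups and counts; none of the numbers come from the study.

        # Hypothetical sketch of direct age standardisation (three broad age groups, invented counts).
        deaths       = [20, 150, 900]            # deaths in the study area, by age group
        population   = [50_000, 40_000, 10_000]  # study-area population, by age group
        standard_pop = [60_000, 30_000, 10_000]  # standard (reference) population, by age group

        age_specific_rates = [d / p for d, p in zip(deaths, population)]
        dsr = sum(r * s for r, s in zip(age_specific_rates, standard_pop)) / sum(standard_pop)
        print(f"Directly age-standardised rate: {1000 * dsr:.2f} per 1000")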

  11. Analysis of statistical misconception in terms of statistical reasoning

    Science.gov (United States)

    Maryati, I.; Priatna, N.

    2018-05-01

    Reasoning skill is needed by everyone in the globalization era, because every person has to be able to manage and use information from all over the world, which can be obtained easily. Statistical reasoning skill is the ability to collect, group, process, and interpret information and to draw conclusions from it. Developing this skill can be done through various levels of education. However, the skill remains low because many people, students included, assume that statistics is merely counting and applying formulas. Students also still have a negative attitude toward courses related to research. The purpose of this research is to analyze students' misconceptions in the descriptive statistics course in relation to statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by examining the effect of students' misconceptions on statistical reasoning skill. The sample consisted of 32 mathematics education students who had taken the descriptive statistics course. The mean score on the misconception test was 49.7 (standard deviation 10.6), whereas the mean score on the statistical reasoning skill test was 51.8 (standard deviation 8.5). Taking 65 as the minimum score for standard achievement of course competence, the students' mean scores are below the standard. The misconception results indicate which sub-topics should be given particular attention. Based on the assessment, students' misconceptions occur in: 1) writing mathematical sentences and symbols correctly, 2) understanding basic definitions, and 3) determining which concept to use in solving a problem. For statistical reasoning skill, the assessment measured reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.

  12. TD-S-HF single determinantal reaction theory and the description of many-body processes, including fission

    International Nuclear Information System (INIS)

    Griffin, J.J.; Lichtner, P.C.; Dworzecka, M.; Kan, K.K.

    1979-01-01

    The restrictions implied for the time-dependent many-body reaction theory by the (TDHF) single determinantal assumption are explored by constructive analysis. A restructured TD-S-HF reaction theory is modelled, not after the initial-value form of the Schroedinger reaction theory, but after the (fully equivalent) S-matrix form, under the conditions that only self-consistent TDHF solutions occur in the theory, every wave function obeys the fundamental statistical interpretation of quantum mechanics, and the theory reduces to the exact Schroedinger theory for exact solutions which are single determinantal. All of these conditions can be accommodated provided that the theory is interpreted on a time-averaged basis, i.e., physical constants of the Schroedinger theory which are time-dependent in the TDHF theory are interpreted in TD-S-HF in terms of their time-averaged values. The resulting reaction theory, although formulated heuristically, prescribes a well-defined and unambiguous calculational program which, although somewhat more demanding technically than the conventional initial-value TDHF method, is nevertheless more consonant with first principles, structurally and mechanistically. Its physical predictions do not depend upon the precise location of the distant measuring apparatus, and are in no way influenced by the spurious cross-channel correlations which arise whenever the description of many reaction channels is imposed upon one single-determinantal solution. For nuclear structure physics, the TDHF eigenfunctions provide the first plausible description of exact eigenstates in the time-dependent framework; moreover, they are unencumbered by any restriction to small amplitudes. 14 references

  13. Neural networks and statistical learning

    CERN Document Server

    Du, Ke-Lin

    2014-01-01

    Providing a broad but in-depth introduction to neural network and machine learning in a statistical framework, this book provides a single, comprehensive resource for study and further research. All the major popular neural network models and statistical learning approaches are covered with examples and exercises in every chapter to develop a practical working understanding of the content. Each of the twenty-five chapters includes state-of-the-art descriptions and important research results on the respective topics. The broad coverage includes the multilayer perceptron, the Hopfield network, associative memory models, clustering models and algorithms, the radial basis function network, recurrent neural networks, principal component analysis, nonnegative matrix factorization, independent component analysis, discriminant analysis, support vector machines, kernel methods, reinforcement learning, probabilistic and Bayesian networks, data fusion and ensemble learning, fuzzy sets and logic, neurofuzzy models, hardw...

  14. Quality of reporting statistics in two Indian pharmacology journals

    OpenAIRE

    Jaykaran,; Yadav, Preeti

    2011-01-01

    Objective: To evaluate the reporting of the statistical methods in articles published in two Indian pharmacology journals. Materials and Methods: All original articles published since 2002 were downloaded from the journals' (Indian Journal of Pharmacology (IJP) and Indian Journal of Physiology and Pharmacology (IJPP)) website. These articles were evaluated on the basis of appropriateness of descriptive statistics and inferential statistics. Descriptive statistics was evaluated on the basis of...

  15. Applied statistical methods in agriculture, health and life sciences

    CERN Document Server

    Lawal, Bayo

    2014-01-01

    This textbook teaches crucial statistical methods to answer research questions using a unique range of statistical software programs, including MINITAB and R. This textbook is developed for undergraduate students in agriculture, nursing, biology and biomedical research. Graduate students will also find it to be a useful way to refresh their statistics skills and to reference software options. The unique combination of examples is approached using MINITAB and R for their individual strengths. Subjects covered include among others data description, probability distributions, experimental design, regression analysis, randomized design and biological assay. Unlike other biostatistics textbooks, this text also includes outliers, influential observations in regression and an introduction to survival analysis. Material is taken from the author's extensive teaching and research in Africa, USA and the UK. Sample problems, references and electronic supplementary material accompany each chapter.

  16. STATLIB, Interactive Statistics Program Library of Tutorial System

    International Nuclear Information System (INIS)

    Anderson, H.E.

    1986-01-01

    1 - Description of program or function: STATLIB is a conversational statistical program library developed in conjunction with a Sandia National Laboratories applied statistics course intended for practicing engineers and scientists. STATLIB is a group of 15 interactive, argument-free, statistical routines. Included are analysis of sensitivity tests; sample statistics for the normal, exponential, hypergeometric, Weibull, and extreme value distributions; three models of multiple regression analysis; x-y data plots; exact probabilities for RxC tables; n sets of m permuted integers in the range 1 to m; simple linear regression and correlation; K different random integers in the range m to n; and Fisher's exact test of independence for a 2 by 2 contingency table. Forty-five other subroutines in the library support the basic 15.
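
    One of the routines listed above, Fisher's exact test of independence for a 2 by 2 contingency table, is easy to illustrate. The sketch below uses scipy rather than STATLIB itself, and the table counts are invented.

        # Hypothetical 2x2 contingency table (invented counts): rows = group, columns = outcome.
        from scipy.stats import fisher_exact

        table = [[8, 2],
                 [1, 5]]
        odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
        print(f"odds ratio = {odds_ratio:.2f}, two-sided p = {p_value:.3f}")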

  17. Practical Statistics for LHC Physicists: Bayesian Inference (3/3)

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    These lectures cover those principles and practices of statistics that are most relevant for work at the LHC. The first lecture discusses the basic ideas of descriptive statistics, probability and likelihood. The second lecture covers the key ideas in the frequentist approach, including confidence limits, profile likelihoods, p-values, and hypothesis testing. The third lecture covers inference in the Bayesian approach. Throughout, real-world examples will be used to illustrate the practical application of the ideas. No previous knowledge is assumed.

  18. Practical Statistics for LHC Physicists: Frequentist Inference (2/3)

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    These lectures cover those principles and practices of statistics that are most relevant for work at the LHC. The first lecture discusses the basic ideas of descriptive statistics, probability and likelihood. The second lecture covers the key ideas in the frequentist approach, including confidence limits, profile likelihoods, p-values, and hypothesis testing. The third lecture covers inference in the Bayesian approach. Throughout, real-world examples will be used to illustrate the practical application of the ideas. No previous knowledge is assumed.

  19. Using R for Data Management, Statistical Analysis, and Graphics

    CERN Document Server

    Horton, Nicholas J

    2010-01-01

    This title offers quick and easy access to key elements of documentation. It includes worked examples across a wide variety of applications, tasks, and graphics. "Using R for Data Management, Statistical Analysis, and Graphics" presents an easy way to learn how to perform an analytical task in R, without having to navigate through the extensive, idiosyncratic, and sometimes unwieldy software documentation and vast number of add-on packages. Organized by short, clear descriptive entries, the book covers many common tasks, such as data management, descriptive summaries, inferential proc

  20. Statistics in a nutshell

    CERN Document Server

    Boslaugh, Sarah

    2013-01-01

    Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.

  1. Statistical measures of galaxy clustering

    International Nuclear Information System (INIS)

    Porter, D.H.

    1988-01-01

    Consideration is given to the large-scale distribution of galaxies and ways in which this distribution may be statistically measured. Galaxy clustering is hierarchical in nature, so that the positions of clusters of galaxies are themselves spatially clustered. A simple identification of groups of galaxies would be an inadequate description of the true richness of galaxy clustering. Current observations of the large-scale structure of the universe and modern theories of cosmology may be studied with a statistical description of the spatial and velocity distributions of galaxies. 8 refs

  2. Statistical yearbook 2005. Data available as of March 2006. 50 ed

    International Nuclear Information System (INIS)

    2006-08-01

    The Statistical Yearbook is an annual compilation of a wide range of international economic, social and environmental statistics on over 200 countries and areas, compiled from sources including UN agencies and other international, national and specialized organizations. The 50th issue contains data available to the Statistics Division as of March 2006 and presents them in 76 tables. The number of years of data shown in the tables varies from one to ten, with the ten-year tables covering 1994 to 2003 or 1995 to 2004. Accompanying the tables are technical notes providing brief descriptions of major statistical concepts, definitions and classifications

  3. Fermi-Dirac statistics plus liquid description of quark partons

    International Nuclear Information System (INIS)

    Buccella, F.; Migliore, G.; Tibullo, V.

    1995-01-01

    A previous approach with Fermi-Dirac distributions for fermion partons is improved here to comply with the expected low-x behaviour of structure functions. We are thus able to obtain a fair description of the unpolarized and polarized structure functions of the nucleons as well as of neutrino data. We cannot reach definite conclusions, but we confirm our suspicion of a relationship between the defects in the Gottfried and spin sum rules. (orig.)

  4. Representation-free description of light-pulse atom interferometry including non-inertial effects

    Energy Technology Data Exchange (ETDEWEB)

    Kleinert, Stephan, E-mail: stephan.kleinert@uni-ulm.de [Institut für Quantenphysik and Center for Integrated Quantum Science and Technology (IQST), Universität Ulm, Albert-Einstein-Allee 11, D-89081 Ulm (Germany); Kajari, Endre; Roura, Albert [Institut für Quantenphysik and Center for Integrated Quantum Science and Technology (IQST), Universität Ulm, Albert-Einstein-Allee 11, D-89081 Ulm (Germany); Schleich, Wolfgang P. [Institut für Quantenphysik and Center for Integrated Quantum Science and Technology (IQST), Universität Ulm, Albert-Einstein-Allee 11, D-89081 Ulm (Germany); Texas A&M University Institute for Advanced Study (TIAS), Institute for Quantum Science and Engineering (IQSE) and Department of Physics and Astronomy, Texas A&M University, College Station, TX 77843-4242 (United States)

    2015-12-30

    Light-pulse atom interferometers rely on the wave nature of matter and its manipulation with coherent laser pulses. They are used for precise gravimetry and inertial sensing as well as for accurate measurements of fundamental constants. Reaching higher precision requires longer interferometer times which are naturally encountered in microgravity environments such as drop-tower facilities, sounding rockets and dedicated satellite missions aiming at fundamental quantum physics in space. In all those cases, it is necessary to consider arbitrary trajectories and varying orientations of the interferometer set-up in non-inertial frames of reference. Here we provide a versatile representation-free description of atom interferometry entirely based on operator algebra to address this general situation. We show how to analytically determine the phase shift as well as the visibility of interferometers with an arbitrary number of pulses including the effects of local gravitational accelerations, gravity gradients, the rotation of the lasers and non-inertial frames of reference. Our method conveniently unifies previous results and facilitates the investigation of novel interferometer geometries.

  5. Introduction to statistics using interactive MM*Stat elements

    CERN Document Server

    Härdle, Wolfgang Karl; Rönz, Bernd

    2015-01-01

    MM*Stat, together with its enhanced online version with interactive examples, offers a flexible tool that facilitates the teaching of basic statistics. It covers all the topics found in introductory descriptive statistics courses, including simple linear regression and time series analysis, the fundamentals of inferential statistics (probability theory, random sampling and estimation theory), and inferential statistics itself (confidence intervals, testing). MM*Stat is also designed to help students rework class material independently and to promote comprehension with the help of additional examples. Each chapter starts with the necessary theoretical background, which is followed by a variety of examples. The core examples are based on the content of the respective chapter, while the advanced examples, designed to deepen students’ knowledge, also draw on information and material from previous chapters. The enhanced online version helps students grasp the complexity and the practical relevance of statistical...

  6. Toward a statistical description of methane emissions from arctic wetlands

    DEFF Research Database (Denmark)

    Pirk, Norbert; Mastepanov, Mikhail; López-Blanco, Efrén

    2017-01-01

    Methane (CH4) emissions from arctic tundra typically follow relations with soil temperature and water table depth, but these process-based descriptions can be difficult to apply to areas where no measurements exist. We formulated a description of the broader temporal flux pattern in the growing season based on two distinct CH4 source components from slow and fast-turnover carbon. We used automatic closed chamber flux measurements from NE Greenland (74°N), W Greenland (64°N), and Svalbard (78°N) to identify and discuss these components. The temporal separation was well-suited in NE Greenland, where the hypothesized slow-turnover carbon peaked at a time significantly related to the timing of snowmelt. The temporally wider component from fast-turnover carbon dominated the emissions in W Greenland and Svalbard. Altogether, we found no dependence of the total seasonal CH4 budget on the timing...

  7. Notices about using elementary statistics in psychology

    OpenAIRE

    松田, 文子; 三宅, 幹子; 橋本, 優花里; 山崎, 理央; 森田, 愛子; 小嶋, 佳子

    2003-01-01

    Improper uses of elementary statistics that were often observed in beginners' manuscripts and papers were collected and better ways were suggested. This paper consists of three parts: About descriptive statistics, multivariate analyses, and statistical tests.

  8. Limiting processes in non-equilibrium classical statistical mechanics

    International Nuclear Information System (INIS)

    Jancel, R.

    1983-01-01

    After a recall of the basic principles of statistical mechanics, the results of ergodic theory, the transient at the thermodynamic limit and its link with transport theory near equilibrium are analyzed. The fundamental problems posed by the description of non-equilibrium macroscopic systems are investigated and the kinetic methods are stated. The problems of non-equilibrium statistical mechanics are analyzed: irreversibility and coarse-graining, macroscopic variables and kinetic description, autonomous reduced descriptions, limit processes, BBGKY hierarchy, limit theorems [fr]

  9. Characteristics of AKR sources: A statistical description

    International Nuclear Information System (INIS)

    Hilgers, A.; Roux, A.; Lundin, R.

    1991-01-01

    A description of plasma properties within the sources of the Auroral Kilometric Radiation (AKR) is given. It is based on data collected during ∼ 50 AKR source crossings in the altitude range between 4,000 and 9,000 km by the Swedish spacecraft Viking. The following results are obtained: (i) the frequency of the lowest-frequency peak of the AKR, f_peak, is found to be very close to f_ce, the electron gyrofrequency ((f_peak - f_ce)/f_ce ≤ 0.08), on average; (ii) the lower cutoff frequency f_LC is on average at f_ce ((f_LC - f_ce)/f_ce ≅ 0); (iii) in the sources the density is typically less than 1.5 cm⁻³, which is of the order of the density of hot electrons; and (iv) the source is located within an acceleration region, as evidenced by electrons accelerated above and ions accelerated below
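
    For context, the electron gyrofrequency mentioned above is f_ce = eB/(2π m_e). The short Python check below evaluates it for an assumed magnetic field strength; the field value is hypothetical, chosen only to be roughly representative of auroral altitudes, and the 5% offset is likewise an invented example.

        # Electron gyrofrequency f_ce = e*B / (2*pi*m_e); the field strength B is an assumed value.
        import math

        e = 1.602176634e-19     # elementary charge, C
        m_e = 9.1093837015e-31  # electron mass, kg
        B = 1.0e-5              # assumed magnetic field, tesla (about 0.1 gauss)

        f_ce = e * B / (2 * math.pi * m_e)
        print(f"f_ce = {f_ce / 1e3:.0f} kHz")

        # An AKR peak observed, say, 5% above f_ce gives a small relative offset:
        f_peak = 1.05 * f_ce
        print(f"(f_peak - f_ce)/f_ce = {(f_peak - f_ce) / f_ce:.2f}")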

  10. Asymptotic expansion and statistical description of turbulent systems

    International Nuclear Information System (INIS)

    Hagan, W.K. III.

    1986-01-01

    A new approach to studying turbulent systems is presented in which an asymptotic expansion of the general dynamical equations is performed prior to the application of statistical methods for describing the evolution of the system. This approach has been applied to two specific systems: anomalous drift wave turbulence in plasmas and homogeneous, isotropic turbulence in fluids. For the plasma case, the time and length scales of the turbulent state result in the asymptotic expansion of the Vlasov/Poisson equations taking the form of nonlinear gyrokinetic theory. Questions regarding this theory and modern Hamiltonian perturbation methods are discussed and resolved. A new alternative Hamiltonian method is described. The Eulerian Direct Interaction Approximation (EDIA) is slightly reformulated and applied to the equations of nonlinear gyrokinetic theory. Using a similarity transformation technique, expressions for the thermal diffusivity are derived from the EDIA equations for various geometries, including a tokamak. In particular, the unique result for generalized geometry may be of use in evaluating fusion reactor designs and theories of anomalous thermal transport in tokamaks. Finally, a new and useful property of the EDIA is pointed out. For the fluid case, an asymptotic expansion is applied to the Navier-Stokes equation and the results lead to the speculation that such an approach may resolve the problem of predicting the Kolmogorov inertial range energy spectrum for homogeneous, isotropic turbulence. 45 refs., 3 figs

  11. Beyond quantum microcanonical statistics

    International Nuclear Information System (INIS)

    Fresch, Barbara; Moro, Giorgio J.

    2011-01-01

    Descriptions of molecular systems usually refer to two distinct theoretical frameworks. On the one hand the quantum pure state, i.e., the wavefunction, of an isolated system is determined to calculate molecular properties and their time evolution according to the unitary Schroedinger equation. On the other hand a mixed state, i.e., a statistical density matrix, is the standard formalism to account for thermal equilibrium, as postulated in the microcanonical quantum statistics. In the present paper an alternative treatment relying on a statistical analysis of the possible wavefunctions of an isolated system is presented. In analogy with the classical ergodic theory, the time evolution of the wavefunction determines the probability distribution in the phase space pertaining to an isolated system. However, this alone cannot account for a well defined thermodynamical description of the system in the macroscopic limit, unless a suitable probability distribution for the quantum constants of motion is introduced. We present a workable formalism assuring the emergence of typical values of thermodynamic functions, such as the internal energy and the entropy, in the large size limit of the system. This allows the identification of macroscopic properties independently of the specific realization of the quantum state. A description of material systems in agreement with equilibrium thermodynamics is then derived without constraints on the physical constituents and interactions of the system. Furthermore, the canonical statistics is recovered in all generality for the reduced density matrix of a subsystem.

  12. Application of pedagogy reflective in statistical methods course and practicum statistical methods

    Science.gov (United States)

    Julie, Hongki

    2017-08-01

    The subjects Elementary Statistics, Statistical Methods and Statistical Methods Practicum aim to equip Mathematics Education students with descriptive and inferential statistics. An understanding of descriptive and inferential statistics is important for students of the Mathematics Education Department, especially for those whose final project involves quantitative research. In quantitative research, students are required to present and describe quantitative data appropriately, to draw conclusions from the data, and to relate the independent and dependent variables defined in their research. In fact, when students carried out final projects involving quantitative research, it was still not rare to find students making mistakes in drawing conclusions and errors in choosing the hypothesis-testing procedure. As a result, they reached incorrect conclusions, which is a serious mistake in quantitative research. Several outcomes were gained from implementing reflective pedagogy in the teaching-learning process of the Statistical Methods and Statistical Methods Practicum courses, namely: 1. Twenty-two students passed the course and one student did not. 2. The highest grade was A, achieved by 18 students. 3. According to all students, their critical stance could be developed, and they could build care for one another through the learning process in this course. 4. All students agreed that through the learning process they underwent in the course, they could build care for one another.

  13. Multiple commodities in statistical microeconomics: Model and market

    Science.gov (United States)

    Baaquie, Belal E.; Yu, Miao; Du, Xin

    2016-11-01

    A statistical generalization of microeconomics was made in Baaquie (2013). In Baaquie et al. (2015), the market behavior of single commodities was analyzed and it was shown that market data provide strong support for the statistical microeconomic description of commodity prices. Here the case of multiple commodities is studied and a parsimonious generalization of the single-commodity model is made. Market data show that the generalization can accurately model the simultaneous correlation functions of up to four commodities. To accurately model five or more commodities, further terms have to be included in the model. This study shows that the statistical microeconomics approach is a comprehensive and complete formulation of microeconomics, one that is independent of the mainstream formulation of microeconomics.
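
    The model's correlation functions themselves are not reproduced in this record. As a simple empirical counterpart, the sketch below computes the simultaneous (equal-time) correlation matrix of several commodity return series; the series are randomly generated placeholders, not market data.

        # Hypothetical sketch: equal-time correlation matrix of simulated commodity return series.
        import numpy as np

        rng = np.random.default_rng(42)
        n_days, n_commodities = 500, 4
        common_factor = rng.normal(size=n_days)                       # shared market driver
        returns = 0.5 * common_factor[:, None] + rng.normal(size=(n_days, n_commodities))

        corr = np.corrcoef(returns, rowvar=False)   # simultaneous correlations between commodities
        print(np.round(corr, 2))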

  14. On the perceived usefulness of risk descriptions for decision-making in disaster risk management

    International Nuclear Information System (INIS)

    Lin, Lexin; Nilsson, Anders; Sjölin, Johan; Abrahamsson, Marcus; Tehler, Henrik

    2015-01-01

    Managing risk using an "all-hazards" and "whole of society" approach involves extensive communication of risk descriptions among many stakeholders. In the present study we investigate how professionals working with disaster risk management in such contexts perceive the usefulness of different descriptions of risk. Empirical data from the Swedish disaster risk management system were used to investigate the aspects of a risk description that affect its usefulness as perceived by professionals. Thirty-three local municipal risk and vulnerability assessments (RVA documents) produced in the region of Scania in 2012 were analyzed in terms of six variables. The documents were then ranked by professionals based on their perceived usefulness for decision-making. Statistical analysis was conducted to identify possible correlations between the overall ranking of the usefulness of the municipal RVAs and each of the variables. We conclude that the way the likelihood and consequences of scenarios are described influences the perceived usefulness of a risk description. Furthermore, whether descriptions of scenarios are included in a risk description, and whether background information concerning the likelihood of scenarios is included, also influence the perceived usefulness of risk descriptions. - Highlights: • Written communication of risk between professionals is investigated. • The way likelihood is described influences a risk description's usefulness. • The way consequences are described influences a risk description's usefulness. • Whether background information is included in a risk description influences its usefulness

  15. Analysis of laparoscopic port site complications: A descriptive study

    Directory of Open Access Journals (Sweden)

    Somu Karthik

    2013-01-01

    Context: The rate of port site complications following conventional laparoscopic surgery is about 21 per 100,000 cases. It has shown a proportional rise with increase in the size of the port site incision and trocar. Although rare, complications that occur at the port site include infection, bleeding, and port site hernia. Aims: To determine the morbidity associated with ports at the site of their insertion in laparoscopic surgery and to identify risk factors for complications. Settings and Design: Prospective descriptive study. Materials and Methods: In the present descriptive study, a total of 570 patients who underwent laparoscopic surgeries for various ailments between August 2009 and July 2011 at our institute were observed for port site complications prospectively and the complications were reviewed. Statistical Analysis Used: Descriptive statistical analysis was carried out in the present study. The statistical software, namely, SPSS 15.0 was used for the analysis of the data. Results: Of the 570 patients undergoing laparoscopic surgery, 17 (3%) had developed complications specifically related to the port site during a minimum follow-up of three months; port site infection (PSI) was the most frequent (n = 10, 1.8%), followed by port site bleeding (n = 4, 0.7%), omentum-related complications (n = 2; 0.35%), and port site metastasis (n = 1, 0.175%). Conclusions: Laparoscopic surgeries are associated with minimal port site complications. Complications are related to the increased number of ports. Umbilical port involvement is the commonest. Most complications are manageable with minimal morbidity, and can be further minimized with meticulous surgical technique during entry and exit.

  16. Analysis of laparoscopic port site complications: A descriptive study

    Science.gov (United States)

    Karthik, Somu; Augustine, Alfred Joseph; Shibumon, Mundunadackal Madhavan; Pai, Manohar Varadaraya

    2013-01-01

    CONTEXT: The rate of port site complications following conventional laparoscopic surgery is about 21 per 100,000 cases. It has shown a proportional rise with increase in the size of the port site incision and trocar. Although rare, complications that occur at the port site include infection, bleeding, and port site hernia. AIMS: To determine the morbidity associated with ports at the site of their insertion in laparoscopic surgery and to identify risk factors for complications. SETTINGS AND DESIGN: Prospective descriptive study. MATERIALS AND METHODS: In the present descriptive study, a total of 570 patients who underwent laparoscopic surgeries for various ailments between August 2009 and July 2011 at our institute were observed for port site complications prospectively and the complications were reviewed. STATISTICAL ANALYSIS USED: Descriptive statistical analysis was carried out in the present study. The statistical software, namely, SPSS 15.0 was used for the analysis of the data. RESULTS: Of the 570 patients undergoing laparoscopic surgery, 17 (3%) had developed complications specifically related to the port site during a minimum follow-up of three months; port site infection (PSI) was the most frequent (n = 10, 1.8%), followed by port site bleeding (n = 4, 0.7%), omentum-related complications (n = 2; 0.35%), and port site metastasis (n = 1, 0.175%). CONCLUSIONS: Laparoscopic surgeries are associated with minimal port site complications. Complications are related to the increased number of ports. Umbilical port involvement is the commonest. Most complications are manageable with minimal morbidity, and can be further minimized with meticulous surgical technique during entry and exit. PMID:23741110

  17. Statistics in a Nutshell

    CERN Document Server

    Boslaugh, Sarah

    2008-01-01

    Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat

  18. A Multidisciplinary Approach for Teaching Statistics and Probability

    Science.gov (United States)

    Rao, C. Radhakrishna

    1971-01-01

    The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)

  19. Methods of statistical physics

    CERN Document Server

    Akhiezer, Aleksandr I

    1981-01-01

    Methods of Statistical Physics is an exposition of the tools of statistical mechanics, which evaluates the kinetic equations of classical and quantized systems. The book also analyzes the equations of macroscopic physics, such as the equations of hydrodynamics for normal and superfluid liquids and macroscopic electrodynamics. The text gives particular attention to the study of quantum systems. This study begins with a discussion of problems of quantum statistics with a detailed description of the basics of quantum mechanics along with the theory of measurement. An analysis of the asymptotic be

  20. Understanding common statistical methods, Part I: descriptive methods, probability, and continuous data.

    Science.gov (United States)

    Skinner, Carl G; Patel, Manish M; Thomas, Jerry D; Miller, Michael A

    2011-01-01

    Statistical methods are pervasive in medical research and general medical literature. Understanding general statistical concepts will enhance our ability to critically appraise the current literature and ultimately improve the delivery of patient care. This article intends to provide an overview of the common statistical methods relevant to medicine.

  1. Statistical description of heavy truck accidents on representative segments of interstate highway

    International Nuclear Information System (INIS)

    Hartman, W.F.; Davidson, C.A.; Foley, J.T.

    1977-01-01

    Any quantitative analysis of the risk of transportation accidents requires the use of many different statistical distributions. Included among these are the types of accidents which occur and the severity of these when they do occur. Several previous studies have derived this type of information for truck traffic over U. S. highways in general; these data are not necessarily applicable for the anticipated LMFBR spent fuel cask routes. This report presents data for highway segments representative of the specific LMFBR cask routes which are anticipated. These data are based upon a detailed record-by-record review of filed reports for accidents which occurred along the specified route segments

  2. Comparison of Tsallis statistics with the Tsallis-factorized statistics in the ultrarelativistic pp collisions

    International Nuclear Information System (INIS)

    Parvan, A.S.

    2016-01-01

    The Tsallis statistics was applied to describe the experimental data on the transverse momentum distributions of hadrons. We considered the energy dependence of the parameters of the Tsallis-factorized statistics, which is now widely used for the description of the experimental transverse momentum distributions of hadrons, and the Tsallis statistics for the charged pions produced in pp collisions at high energies. We found that the results of the Tsallis-factorized statistics deviate from the results of the Tsallis statistics only at low NA61/SHINE energies when the value of the entropic parameter is close to unity. At higher energies, when the value of the entropic parameter deviates essentially from unity, the Tsallis-factorized statistics satisfactorily recovers the results of the Tsallis statistics. (orig.)

  3. Statistical mechanics of violent relaxation

    International Nuclear Information System (INIS)

    Shu, F.H.

    1978-01-01

    We reexamine the foundations of Lynden-Bell's statistical mechanical discussion of violent relaxation in collisionless stellar systems. We argue that Lynden-Bell's formulation in terms of a continuum description introduces unnecessary complications, and we consider a more conventional formulation in terms of particles. We then find the exclusion principle discovered by Lynden-Bell to be quantitatively important only at phase densities where two-body encounters are no longer negligible. Since the dynamical basis for the exclusion principle vanishes in such cases anyway, Lynden-Bell statistics always reduces in practice to Maxwell-Boltzmann statistics when applied to stellar systems. Lynden-Bell also found the equilibrium distribution function generally to be a sum of Maxwellians with velocity dispersions dependent on the phase density at star formation. We show that this difficulty vanishes in the particulate description for an encounterless stellar system as long as stars of different masses are initially well mixed in phase space. Our methods also demonstrate the equivalence between Gibbs's formalism which uses the microcanonical ensemble and Boltzmann's formalism which uses a coarse-grained continuum description. In addition, we clarify the concept of irreversible behavior on a macroscopic scale for an encounterless stellar system. Finally, we comment on the use of unusual macroscopic constraints to simulate the effects of incomplete relaxation

  4. A new kinetic description for turbulent collisions including mode-coupling

    International Nuclear Information System (INIS)

    Misguich, J.H.; Tchen, C.M.

    1982-07-01

    The usual introduction of higher-order mode-coupling terms in the description of turbulent collisions beyond the usual Renormalized Quasi-Linear approximation (RQL) is briefly analyzed. Here new results are derived in the framework of the general kinetic theory, and the equivalence is proved with the long time limit of simple results deduced from the Vlasov equation. The correction to the RQL turbulent collision term is analyzed and a new approximation is proposed. Turbulent collisions are also described by perturbation around the Lagrangian autocorrelation of fluctuating fields. For homogeneous turbulence, however, the asymptotic integral of this Lagrangian autocorrelation vanishes identically, similarly to what occurs in Brownian motion. For inhomogeneous turbulence this method can nevertheless be used, and higher-order mode-coupling terms can be interpreted as a shielding of elementary Lagrangian turbulent collisions.

  5. Thermal site descriptive model. A strategy for the model development during site investigations - version 2

    International Nuclear Information System (INIS)

    Back, Paer-Erik; Sundberg, Jan

    2007-09-01

    This report presents a strategy for describing, predicting and visualising the thermal aspects of the site descriptive model. The strategy is an updated version of an earlier strategy applied in all SDM versions during the initial site investigation phase at the Forsmark and Oskarshamn areas. The previous methodology for thermal modelling did not take the spatial correlation fully into account during simulation. The result was that the variability of thermal conductivity in the rock mass was not sufficiently well described. Experience from earlier thermal SDMs indicated that development of the methodology was required in order to describe the spatial distribution of thermal conductivity in the rock mass in a sufficiently reliable way, taking both variability within rock types and between rock types into account. A good description of the thermal conductivity distribution is especially important for the lower tail. This tail is important for the design of a repository because it affects the canister spacing. The presented approach is developed to be used for final SDM regarding thermal properties, primarily thermal conductivity. Specific objectives for the strategy of thermal stochastic modelling are: Description: statistical description of the thermal conductivity of a rock domain. Prediction: prediction of thermal conductivity in a specific rock volume. Visualisation: visualisation of the spatial distribution of thermal conductivity. The thermal site descriptive model should include the temperature distribution and thermal properties of the rock mass. The temperature is the result of the thermal processes in the repository area. Determination of thermal transport properties can be made using different methods, such as laboratory investigations, field measurements, modelling from mineralogical composition and distribution, modelling from density logging and modelling from temperature logging. The different types of data represent different scales, which has to be
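
    As an illustration of why the lower tail matters, the sketch below simulates a hypothetical lognormal thermal conductivity distribution for a rock domain and reports its 1st percentile; the distribution family, the parameter values and the percentile chosen are assumptions made for illustration only, not SKB's actual model.

```python
import numpy as np

# Hypothetical example: lognormal thermal conductivity within a rock domain.
# The low percentile is the design-relevant quantity (canister spacing).
rng = np.random.default_rng(1)

mean_k, sd_k = 3.5, 0.4   # assumed mean and standard deviation, W/(m K)
sigma = np.sqrt(np.log(1.0 + (sd_k / mean_k) ** 2))
mu = np.log(mean_k) - 0.5 * sigma ** 2

k = rng.lognormal(mu, sigma, size=100_000)
print(f"simulated mean  = {k.mean():.2f} W/(m K)")
print(f"1st percentile  = {np.percentile(k, 1):.2f} W/(m K)")
```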

  6. Thermal site descriptive model. A strategy for the model development during site investigations - version 2

    Energy Technology Data Exchange (ETDEWEB)

    Back, Paer-Erik; Sundberg, Jan [Geo Innova AB (Sweden)

    2007-09-15

    This report presents a strategy for describing, predicting and visualising the thermal aspects of the site descriptive model. The strategy is an updated version of an earlier strategy applied in all SDM versions during the initial site investigation phase at the Forsmark and Oskarshamn areas. The previous methodology for thermal modelling did not take the spatial correlation fully into account during simulation. The result was that the variability of thermal conductivity in the rock mass was not sufficiently well described. Experience from earlier thermal SDMs indicated that development of the methodology was required in order to describe the spatial distribution of thermal conductivity in the rock mass in a sufficiently reliable way, taking both variability within rock types and between rock types into account. A good description of the thermal conductivity distribution is especially important for the lower tail. This tail is important for the design of a repository because it affects the canister spacing. The presented approach is developed to be used for final SDM regarding thermal properties, primarily thermal conductivity. Specific objectives for the strategy of thermal stochastic modelling are: Description: statistical description of the thermal conductivity of a rock domain. Prediction: prediction of thermal conductivity in a specific rock volume. Visualisation: visualisation of the spatial distribution of thermal conductivity. The thermal site descriptive model should include the temperature distribution and thermal properties of the rock mass. The temperature is the result of the thermal processes in the repository area. Determination of thermal transport properties can be made using different methods, such as laboratory investigations, field measurements, modelling from mineralogical composition and distribution, modelling from density logging and modelling from temperature logging. The different types of data represent different scales, which has to be

  7. Introductory statistical mechanics for electron storage rings

    International Nuclear Information System (INIS)

    Jowett, J.M.

    1986-07-01

    These lectures introduce the beam dynamics of electron-positron storage rings with particular emphasis on the effects due to synchrotron radiation. They differ from most other introductions in their systematic use of the physical principles and mathematical techniques of the non-equilibrium statistical mechanics of fluctuating dynamical systems. A self-contained exposition of the necessary topics from this field is included. Throughout the development, a Hamiltonian description of the effects of the externally applied fields is maintained in order to preserve the links with other lectures on beam dynamics and to show clearly the extent to which electron dynamics is non-Hamiltonian. The statistical mechanical framework is extended to a discussion of the conceptual foundations of the treatment of collective effects through the Vlasov equation.

  8. The quantum theory of statistical multistep nucleus reactions

    CERN Document Server

    Zhivopistsev, F A

    2002-01-01

    The phenomenological models and quantum approaches to the description of statistical multistep nuclear reactions are discussed. The basic advantages and deficiencies of various modifications of the quantum theory of statistical multistep direct reactions, the Feshbach-Kerman-Koonin formalism and the generalized model of statistical multistep reactions (GMSMR), are considered in detail. The possibility of obtaining a consistent description of the experimental spectra for reactions with nucleons is shown by particular examples. Further improvement and development of the quantum formalism, towards a more complete and consistent description of the various mechanisms of component-particle formation in the output channel and a correct treatment of the unbound-state densities of the intermediate and final nuclei, are needed for the analysis of inclusive reactions with participation of the component particles (with an account of the contributions to the cross sections of the nucleus cluster and shell areas)...

  9. Towards a more accurate microscopic description of the moving contact line problem - incorporating nonlocal effects through a statistical mechanics framework

    Science.gov (United States)

    Nold, Andreas; Goddard, Ben; Sibley, David; Kalliadasis, Serafim

    2014-03-01

    Multiscale effects play a predominant role in wetting phenomena such as the moving contact line. An accurate description is of paramount interest for a wide range of industrial applications, yet it is a matter of ongoing research, due to the difficulty of incorporating different physical effects in one model. Important small-scale phenomena are corrections to the attractive fluid-fluid and wall-fluid forces in inhomogeneous density distributions, which often previously have been accounted for by the disjoining pressure in an ad-hoc manner. We systematically derive a novel model for the description of a single-component liquid-vapor multiphase system which inherently incorporates these nonlocal effects. This derivation, which is inspired by statistical mechanics in the framework of colloidal density functional theory, is critically discussed with respect to its assumptions and restrictions. The model is then employed numerically to study a moving contact line of a liquid fluid displacing its vapor phase. We show how nonlocal physical effects are inherently incorporated by the model and describe how classical macroscopic results for the contact line motion are retrieved. We acknowledge financial support from ERC Advanced Grant No. 247031 and Imperial College through a DTG International Studentship.

  10. Six sigma for organizational excellence a statistical approach

    CERN Document Server

    Muralidharan, K

    2015-01-01

    This book discusses the integrated concepts of statistical quality engineering and management tools. It will help readers to understand and apply the concepts of quality through project management and technical analysis, using statistical methods. Prepared in a ready-to-use form, the text will equip practitioners to implement the Six Sigma principles in projects. The concepts discussed are all critically assessed and explained, allowing them to be practically applied in managerial decision-making, and in each chapter, the objectives and connections to the rest of the work are clearly illustrated. To aid in understanding, the book includes a wealth of tables, graphs, descriptions and checklists, as well as charts and plots, worked-out examples and exercises. Perhaps the most distinctive feature of the book is its approach of using statistical tools to explain the science behind Six Sigma project management and its integration with engineering concepts. The material on quality engineering and statistical management tools of...
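
    As a small, hedged illustration of the kind of statistical tool such a text covers (the book itself is not quoted here), the sketch below computes the process capability indices Cp and Cpk for a sample of hypothetical measurements against invented specification limits.

```python
import numpy as np

def process_capability(x, lsl, usl):
    """Cp and Cpk of a sample against lower/upper specification limits."""
    mu, sigma = np.mean(x), np.std(x, ddof=1)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# hypothetical measurements of a critical-to-quality characteristic
rng = np.random.default_rng(0)
sample = rng.normal(loc=10.02, scale=0.05, size=200)
print(process_capability(sample, lsl=9.85, usl=10.15))
```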

  11. Statistical inference a short course

    CERN Document Server

    Panik, Michael J

    2012-01-01

    A concise, easily accessible introduction to descriptive and inferential techniques. Statistical Inference: A Short Course offers a concise presentation of the essentials of basic statistics for readers seeking to acquire a working knowledge of statistical concepts, measures, and procedures. The author covers tests of the assumptions of randomness and normality and provides nonparametric methods for situations where parametric approaches might not work. The book also explores how to determine a confidence interval for a population median while also providing coverage of ratio estimation, randomness, and causal
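
    The record mentions confidence intervals for a population median. One way to obtain such an interval, sketched below with invented data, is the percentile bootstrap; the book may well use a different construction (for example one based on order statistics).

```python
import numpy as np

def bootstrap_median_ci(x, conf=0.95, n_boot=10_000, seed=0):
    """Percentile-bootstrap confidence interval for the population median."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x)
    meds = np.array([np.median(rng.choice(x, size=len(x), replace=True))
                     for _ in range(n_boot)])
    alpha = (1 - conf) / 2
    return np.quantile(meds, alpha), np.quantile(meds, 1 - alpha)

rng = np.random.default_rng(42)
data = rng.exponential(scale=2.0, size=60)   # skewed, hypothetical sample
print(np.median(data), bootstrap_median_ci(data))
```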

  12. A Framework for Assessing High School Students' Statistical Reasoning.

    Science.gov (United States)

    Chan, Shiau Wei; Ismail, Zaleha; Sumintono, Bambang

    2016-01-01

    Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students' statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter include describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework formulated a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in the second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students' statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework's cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments.

  13. AP statistics crash course

    CERN Document Server

    D'Alessio, Michael

    2012-01-01

    AP Statistics Crash Course - Gets You a Higher Advanced Placement Score in Less Time Crash Course is perfect for the time-crunched student, the last-minute studier, or anyone who wants a refresher on the subject. AP Statistics Crash Course gives you: Targeted, Focused Review - Study Only What You Need to Know Crash Course is based on an in-depth analysis of the AP Statistics course description outline and actual Advanced Placement test questions. It covers only the information tested on the exam, so you can make the most of your valuable study time. Our easy-to-read format covers: exploring da

  14. Statistical electromagnetics: Complex cavities

    NARCIS (Netherlands)

    Naus, H.W.L.

    2008-01-01

    A selection of the literature on the statistical description of electromagnetic fields and complex cavities is concisely reviewed. Some essential concepts, for example, the application of the central limit theorem and the maximum entropy principle, are scrutinized. Implicit assumptions, biased

  15. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
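
    A minimal sketch of the idea, with invented numbers: when each measurement is an interval rather than a point, statistics that are monotone in every observation, such as the mean and the median, become intervals obtained from the endpoint values; other statistics discussed in the report, such as the variance, are much harder to bound.

```python
import numpy as np

# Each row is one measurement reported as an interval [lo, hi].
intervals = np.array([[1.9, 2.3],
                      [2.0, 2.4],
                      [1.7, 2.6],
                      [2.2, 2.5]])

lo, hi = intervals[:, 0], intervals[:, 1]
print("sample mean lies in  ", (lo.mean(), hi.mean()))
print("sample median lies in", (np.median(lo), np.median(hi)))
```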

  16. Statistical mechanics of directed models of polymers in the square lattice

    CERN Document Server

    Rensburg, J V

    2003-01-01

    Directed square lattice models of polymers and vesicles have received considerable attention in the recent mathematical and physical sciences literature. These are idealized geometric directed lattice models introduced to study phase behaviour in polymers, and include Dyck paths, partially directed paths, directed trees and directed vesicles models. Directed models are closely related to models studied in the combinatorics literature (and are often exactly solvable). They are also simplified versions of a number of statistical mechanics models, including the self-avoiding walk, lattice animals and lattice vesicles. The exchange of approaches and ideas between statistical mechanics and combinatorics have considerably advanced the description and understanding of directed lattice models, and this will be explored in this review. The combinatorial nature of directed lattice path models makes a study using generating function approaches most natural. In contrast, the statistical mechanics approach would introduce...
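
    For readers unfamiliar with these objects, Dyck paths give the simplest example of the generating-function viewpoint mentioned above: paths of length 2n are counted by the Catalan numbers, whose generating function C(x) satisfies C = 1 + xC^2. The snippet below (illustrative, not taken from the review) simply evaluates the counts.

```python
from math import comb

def catalan(n: int) -> int:
    """Number of Dyck paths of length 2n (the n-th Catalan number)."""
    return comb(2 * n, n) // (n + 1)

print([catalan(n) for n in range(8)])   # 1, 1, 2, 5, 14, 42, 132, 429
```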

  17. SIMON. A computer program for reliability and statistical analysis using Monte Carlo simulation. Program description and manual

    International Nuclear Information System (INIS)

    Kongsoe, H.E.; Lauridsen, K.

    1993-09-01

    SIMON is a program for calculation of reliability and statistical analysis. The program is of the Monte Carlo type, and it is designed with high flexibility, and has a large potential for application to complex problems like reliability analyses of very large systems and of systems, where complex modelling or knowledge of special details are required. Examples of application of the program, including input and output, for reliability and statistical analysis are presented. (au) (3 tabs., 3 ills., 5 refs.)
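
    The following toy Monte Carlo reliability estimate (three components, one in series with a redundant pair, invented failure probabilities) only illustrates the general approach; it does not reproduce SIMON's input format or capabilities.

```python
import numpy as np

rng = np.random.default_rng(7)
p_fail = np.array([0.02, 0.10, 0.10])        # assumed component failure probabilities
n_trials = 200_000

fails = rng.random((n_trials, 3)) < p_fail   # True where a component fails
# system fails if component 0 fails, or both redundant components 1 and 2 fail
system_fails = fails[:, 0] | (fails[:, 1] & fails[:, 2])
print("estimated system reliability:", 1.0 - system_fails.mean())
```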

  18. Analysis of laparoscopic port site complications: A descriptive study.

    Science.gov (United States)

    Karthik, Somu; Augustine, Alfred Joseph; Shibumon, Mundunadackal Madhavan; Pai, Manohar Varadaraya

    2013-04-01

    The rate of port site complications following conventional laparoscopic surgery is about 21 per 100,000 cases. It has shown a proportional rise with increase in the size of the port site incision and trocar. Although rare, complications that occur at the port site include infection, bleeding, and port site hernia. To determine the morbidity associated with ports at the site of their insertion in laparoscopic surgery and to identify risk factors for complications. Prospective descriptive study. In the present descriptive study, a total of 570 patients who underwent laparoscopic surgeries for various ailments between August 2009 and July 2011 at our institute were observed for port site complications prospectively and the complications were reviewed. Descriptive statistical analysis was carried out in the present study. The statistical software, namely, SPSS 15.0 was used for the analysis of the data. Of the 570 patients undergoing laparoscopic surgery, 17 (3%) had developed complications specifically related to the port site during a minimum follow-up of three months; port site infection (PSI) was the most frequent (n = 10, 1.8%), followed by port site bleeding (n = 4, 0.7%), omentum-related complications (n = 2; 0.35%), and port site metastasis (n = 1, 0.175%). Laparoscopic surgeries are associated with minimal port site complications. Complications are related to the increased number of ports. Umbilical port involvement is the commonest. Most complications are manageable with minimal morbidity, and can be further minimized with meticulous surgical technique during entry and exit.
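
    The descriptive analysis reported above boils down to counts and percentages; the snippet below reproduces the figures quoted in the abstract (570 patients, 17 with complications) to show the arithmetic, although the original authors used SPSS 15.0 rather than Python.

```python
n_patients = 570
complications = {"port site infection": 10,
                 "port site bleeding": 4,
                 "omentum-related": 2,
                 "port site metastasis": 1}

total = sum(complications.values())
print(f"any port site complication: {total} ({100 * total / n_patients:.1f}%)")
for name, count in complications.items():
    print(f"{name}: {count} ({100 * count / n_patients:.2f}%)")
```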

  19. Parallel auto-correlative statistics with VTK.

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2013-08-01

    This report summarizes existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10] which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by the means of C++ code snippets and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the autocorrelative statistics engine.
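
    As a plain-Python (serial, non-VTK) illustration of what an auto-correlative statistics engine computes, the sketch below estimates the sample autocorrelation of a synthetic AR(1) series; the parallel update formulas used by the VTK engine are not reproduced here.

```python
import numpy as np

def autocorrelation(x, max_lag):
    """Sample autocorrelation r_k for lags k = 0..max_lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    var = np.dot(x, x) / len(x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / (len(x) * var)
                     for k in range(max_lag + 1)])

rng = np.random.default_rng(3)
y = np.zeros(5000)
for t in range(1, len(y)):          # AR(1) series with known correlation 0.8
    y[t] = 0.8 * y[t - 1] + rng.normal()
print(np.round(autocorrelation(y, 5), 3))   # should decay roughly like 0.8**k
```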

  20. Possible Solution to Publication Bias Through Bayesian Statistics, Including Proper Null Hypothesis Testing

    NARCIS (Netherlands)

    Konijn, Elly A.; van de Schoot, Rens; Winter, Sonja D.; Ferguson, Christopher J.

    2015-01-01

    The present paper argues that an important cause of publication bias resides in traditional frequentist statistics forcing binary decisions. An alternative approach through Bayesian statistics provides various degrees of support for any hypothesis allowing balanced decisions and proper null

  1. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  2. [The research protocol VI: How to choose the appropriate statistical test. Inferential statistics].

    Science.gov (United States)

    Flores-Ruiz, Eric; Miranda-Novales, María Guadalupe; Villasís-Keever, Miguel Ángel

    2017-01-01

    Statistical analysis can be divided into two main components: descriptive analysis and inferential analysis. Inference consists of drawing conclusions from the tests performed on data obtained from a sample of a population. Statistical tests are used in order to establish the probability that a conclusion obtained from a sample is applicable to the population from which it was obtained. However, choosing the appropriate statistical test in general poses a challenge for novice researchers. To choose the statistical test it is necessary to take into account three aspects: the research design, the number of measurements and the scale of measurement of the variables. Statistical tests are divided into two sets, parametric and nonparametric. Parametric tests can only be used if the data show a normal distribution. Choosing the right statistical test will make it easier for readers to understand and apply the results.
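
    A minimal sketch of one slice of this decision (two independent groups, a continuous outcome): check normality, then choose between a parametric and a nonparametric test. The rule below is deliberately simplified and ignores the design and measurement-scale considerations the authors also list.

```python
import numpy as np
from scipy import stats

def compare_two_groups(a, b, alpha=0.05):
    """Use Student's t-test if both samples look normal (Shapiro-Wilk),
    otherwise fall back to the nonparametric Mann-Whitney U test."""
    normal = (stats.shapiro(a).pvalue > alpha) and (stats.shapiro(b).pvalue > alpha)
    if normal:
        return "t-test", stats.ttest_ind(a, b)
    return "Mann-Whitney U", stats.mannwhitneyu(a, b)

rng = np.random.default_rng(0)
group_a = rng.normal(50, 10, 30)     # invented measurements
group_b = rng.normal(55, 10, 30)
print(compare_two_groups(group_a, group_b))
```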

  3. The research protocol VI: How to choose the appropriate statistical test. Inferential statistics

    Directory of Open Access Journals (Sweden)

    Eric Flores-Ruiz

    2017-10-01

    Full Text Available Statistical analysis can be divided into two main components: descriptive analysis and inferential analysis. Inference consists of drawing conclusions from the tests performed on data obtained from a sample of a population. Statistical tests are used in order to establish the probability that a conclusion obtained from a sample is applicable to the population from which it was obtained. However, choosing the appropriate statistical test in general poses a challenge for novice researchers. To choose the statistical test it is necessary to take into account three aspects: the research design, the number of measurements and the scale of measurement of the variables. Statistical tests are divided into two sets, parametric and nonparametric. Parametric tests can only be used if the data show a normal distribution. Choosing the right statistical test will make it easier for readers to understand and apply the results.

  4. Statistical mechanics of economics I

    Energy Technology Data Exchange (ETDEWEB)

    Kusmartsev, F.V., E-mail: F.Kusmartsev@lboro.ac.u [Department of Physics, Loughborough University, Leicestershire, LE11 3TU (United Kingdom)

    2011-02-07

    We show that statistical mechanics is useful in the description of financial crises and economics. Taking a large number of instant snapshots of a market over an interval of time, we construct their ensembles and study their statistical interference. This results in a probability description of the market and gives capital, money, income, wealth and debt distributions, which in most cases take the form of the Bose-Einstein distribution. In addition, statistical mechanics provides the main market equations and laws which govern the correlations between the amount of money, debt, product, prices and number of retailers. We applied the relations found to a study of the evolution of the economy in the USA between the years 1996 and 2008 and observe that over that time the income of the majority of the population is well described by the Bose-Einstein distribution, whose parameters are different for each year. Each financial crisis corresponds to a peak in the absolute activity coefficient. The analysis correctly indicates the past crises and predicts the future one.
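
    For reference, the Bose-Einstein occupation formula the abstract refers to has the standard form below; the mapping of its symbols onto income data (which quantity plays the role of the energy level, and how mu and T are fitted year by year) is specific to the paper and is only suggested here.

```latex
% Standard Bose-Einstein distribution (illustrative mapping to income data):
%   epsilon : "energy" level (here, an income level)
%   mu, T   : fitted parameters, re-estimated for each year
\langle n(\epsilon)\rangle \;=\; \frac{1}{e^{(\epsilon-\mu)/T}-1}
```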

  5. Statistical mechanics of economics I

    International Nuclear Information System (INIS)

    Kusmartsev, F.V.

    2011-01-01

    We show that statistical mechanics is useful in the description of financial crises and economics. Taking a large number of instant snapshots of a market over an interval of time, we construct their ensembles and study their statistical interference. This results in a probability description of the market and gives capital, money, income, wealth and debt distributions, which in most cases take the form of the Bose-Einstein distribution. In addition, statistical mechanics provides the main market equations and laws which govern the correlations between the amount of money, debt, product, prices and number of retailers. We applied the relations found to a study of the evolution of the economy in the USA between the years 1996 and 2008 and observe that over that time the income of the majority of the population is well described by the Bose-Einstein distribution, whose parameters are different for each year. Each financial crisis corresponds to a peak in the absolute activity coefficient. The analysis correctly indicates the past crises and predicts the future one.

  6. Guidelines for Description

    NARCIS (Netherlands)

    Links, P.; Horsman, Peter; Kühnel, Karsten; Priddy, M.; Reijnhoudt, Linda; Merenmies, Mark

    2013-01-01

    The Guidelines follow the conceptual metadata model (deliverable 17.2). They include guidelines for description of collection-holding institutions, document collections, organisations, personalities, events, camps and ghettos. As much as possible the guidelines comply with the descriptive standards

  7. Statistical models for optimizing mineral exploration

    International Nuclear Information System (INIS)

    Wignall, T.K.; DeGeoffroy, J.

    1987-01-01

    The primary purpose of mineral exploration is to discover ore deposits. The emphasis of this volume is on the mathematical and computational aspects of optimizing mineral exploration. The seven chapters that make up the main body of the book are devoted to the description and application of various types of computerized geomathematical models. These chapters include: (1) the optimal selection of ore deposit types and regions of search, as well as prospecting selected areas, (2) designing airborne and ground field programs for the optimal coverage of prospecting areas, and (3) delineating and evaluating exploration targets within prospecting areas by means of statistical modeling. Many of these statistical programs are innovative and are designed to be useful for mineral exploration modeling. Examples of geomathematical models are applied to exploring for six main types of base and precious metal deposits, as well as other mineral resources (such as bauxite and uranium)

  8. Physics 3204. Course Description.

    Science.gov (United States)

    Newfoundland and Labrador Dept. of Education.

    A description of the physics 3204 course in Newfoundland and Labrador is provided. The description includes: (1) statement of purpose, including general objectives of science education; (2) a list of six course objectives; (3) course content for units on sound, light, optical instruments, electrostatics, current electricity, Michael Faraday and…

  9. Evidence-based orthodontics. Current statistical trends in published articles in one journal.

    Science.gov (United States)

    Law, Scott V; Chudasama, Dipak N; Rinchuse, Donald J

    2010-09-01

    To ascertain the number, type, and overall usage of statistics in American Journal of Orthodontics and Dentofacial Orthopedics (AJODO) articles for 2008. These data were then compared to data from three previous years: 1975, 1985, and 2003. The frequency and distribution of statistics used in the AJODO original articles for 2008 were dichotomized into those using statistics and those not using statistics. Statistical procedures were then broadly divided into descriptive statistics (mean, standard deviation, range, percentage) and inferential statistics (t-test, analysis of variance). Descriptive statistics were used to make comparisons. In 1975, 1985, 2003, and 2008, AJODO published 72, 87, 134, and 141 original articles, respectively. The percentage of original articles using statistics was 43.1% in 1975, 75.9% in 1985, 94.0% in 2003, and 92.9% in 2008; original articles using statistics stayed relatively the same from 2003 to 2008, with only a small 1.1% decrease. The percentage of articles using inferential statistical analyses was 23.7% in 1975, 74.2% in 1985, 92.9% in 2003, and 84.4% in 2008. Comparing AJODO publications in 2003 and 2008, there was an 8.5% increase in the use of descriptive articles (from 7.1% to 15.6%), and there was an 8.5% decrease in articles using inferential statistics (from 92.9% to 84.4%).

  10. Is there a statistical mechanics of turbulence?

    International Nuclear Information System (INIS)

    Kraichnan, R.H.; Chen, S.Y.

    1988-09-01

    The statistical-mechanical treatment of turbulence is made questionable by strong nonlinearity and strong disequilibrium that result in the creation of ordered structures imbedded in disorder. Model systems are described which may provide some hope that a compact, yet faithful, statistical description of turbulence nevertheless is possible. Some essential dynamic features of the models are captured by low-order statistical approximations despite strongly non-Gaussian behavior. 31 refs., 5 figs

  11. Statistical description of multipion production in diffractive hadronic reactions

    International Nuclear Information System (INIS)

    Gagnon, R.

    1980-01-01

    A statistical model in which higher-multiplicity enhancements are generated from lower ones in a completely determined fashion is presented. Full account is taken of isospin and G-parity conservation as well as the finite width of the produced resonances. It is applied to diffractive dissociation on nucleon and deuteron targets, for which multipion mass distributions and relative cross sections are calculated. Agreement with available experimental data is seen to be excellent

  12. A Descriptive Study of Individual and Cross-Cultural Differences in Statistics Anxiety

    Science.gov (United States)

    Baloglu, Mustafa; Deniz, M. Engin; Kesici, Sahin

    2011-01-01

    The present study investigated individual and cross-cultural differences in statistics anxiety among 223 Turkish and 237 American college students. A 2 x 2 between-subjects factorial multivariate analysis of covariance (MANCOVA) was performed on the six dependent variables which are the six subscales of the Statistical Anxiety Rating Scale.…

  13. Statistics of high-level scene context.

    Science.gov (United States)

    Greene, Michelle R

    2013-01-01

    Context is critical for recognizing environments and for searching for objects within them: contextual associations have been shown to modulate reaction time and object recognition accuracy, as well as influence the distribution of eye movements and patterns of brain activations. However, we have not yet systematically quantified the relationships between objects and their scene environments. Here I seek to fill this gap by providing descriptive statistics of object-scene relationships. A total of 48,167 objects were hand-labeled in 3499 scenes using the LabelMe tool (Russell et al., 2008). From these data, I computed a variety of descriptive statistics at three different levels of analysis: the ensemble statistics that describe the density and spatial distribution of unnamed "things" in the scene; the bag of words level where scenes are described by the list of objects contained within them; and the structural level where the spatial distribution and relationships between the objects are measured. The utility of each level of description for scene categorization was assessed through the use of linear classifiers, and the plausibility of each level for modeling human scene categorization is discussed. Of the three levels, ensemble statistics were found to be the most informative (per feature), and also best explained human patterns of categorization errors. Although a bag of words classifier had similar performance to human observers, it had a markedly different pattern of errors. However, certain objects are more useful than others, and ceiling classification performance could be achieved using only the 64 most informative objects. As object location tends not to vary as a function of category, structural information provided little additional information. Additionally, these data provide valuable information on natural scene redundancy that can be exploited for machine vision, and can help the visual cognition community to design experiments guided by statistics
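
    The "bag of words" level of description can be made concrete with a tiny sketch: each scene is represented only by the list of objects it contains, and a linear classifier is trained on the resulting count vectors. The scenes and labels below are invented stand-ins for the 3499 labelled scenes.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

scenes = ["sink stove counter pot", "bed lamp pillow window",
          "stove oven counter", "bed dresser lamp",
          "sink faucet towel counter", "pillow blanket bed window"]
labels = ["kitchen", "bedroom", "kitchen", "bedroom", "kitchen", "bedroom"]

X = CountVectorizer().fit_transform(scenes)   # object-count ("bag of words") vectors
clf = LogisticRegression(max_iter=1000)       # linear classifier
print(cross_val_score(clf, X, labels, cv=3))
```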

  14. Statistics at a glance.

    Science.gov (United States)

    Ector, Hugo

    2010-12-01

    I still remember my first book on statistics: "Elementary statistics with applications in medicine and the biological sciences" by Frederick E. Croxton. For me, it has been the start of pursuing understanding statistics in daily life and in medical practice. It was the first volume in a long row of books. In his introduction, Croxton pretends that "nearly everyone involved in any aspect of medicine needs to have some knowledge of statistics". The reality is that for many clinicians, statistics are limited to a "P statistical methods. They have never had the opportunity to learn concise and clear descriptions of the key features. I have experienced how some authors can describe difficult methods in a well understandable language. Others fail completely. As a teacher, I tell my students that life is impossible without a basic knowledge of statistics. This feeling has resulted in an annual seminar of 90 minutes. This tutorial is the summary of this seminar. It is a summary and a transcription of the best pages I have detected.

  15. Model for safety reports including descriptive examples

    International Nuclear Information System (INIS)

    1995-12-01

    Several safety reports will be produced in the process of planning and constructing the system for disposal of high-level radioactive waste in Sweden. The present report gives a model, with detailed examples, of how these reports should be organized and what steps they should include. In the near future safety reports will deal with the encapsulation plant and the repository. Later reports will treat operation of the handling systems and the repository

  16. Zubarev's Nonequilibrium Statistical Operator Method in the Generalized Statistics of Multiparticle Systems

    Science.gov (United States)

    Glushak, P. A.; Markiv, B. B.; Tokarchuk, M. V.

    2018-01-01

    We present a generalization of Zubarev's nonequilibrium statistical operator method based on the principle of maximum Renyi entropy. In the framework of this approach, we obtain transport equations for the basic set of parameters of the reduced description of nonequilibrium processes in a classical system of interacting particles using Liouville equations with fractional derivatives. For a classical system of particles in a medium with a fractal structure, we obtain a non-Markovian diffusion equation with fractional spatial derivatives. For a concrete model of the frequency dependence of a memory function, we obtain a generalized Cattaneo-type diffusion equation with the spatial and temporal fractality taken into account. We present a generalization of nonequilibrium thermofield dynamics in Zubarev's nonequilibrium statistical operator method in the framework of Renyi statistics.
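
    For orientation, the classical (integer-order) Cattaneo equation that such generalizations start from is shown below; the paper's version replaces the time and space derivatives with fractional ones and derives the memory kernel from the nonequilibrium statistical operator, details that are not reproduced here.

```latex
% Classical Cattaneo (telegraph-type) diffusion equation:
%   tau : relaxation time of the diffusive flux, D : diffusion coefficient
\tau\,\frac{\partial^{2} P}{\partial t^{2}}
  + \frac{\partial P}{\partial t}
  = D\,\frac{\partial^{2} P}{\partial x^{2}}
```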

  17. On the statistical description of the inbound air traffic over Heathrow airport

    NARCIS (Netherlands)

    Caccavale, M.V.; Iovanella, A.; Lancia, C.; Lulli, G.; Scoppola, B.

    2013-01-01

    We present a model to describe the inbound air traffic over a congested hub. We show that this model gives a very accurate description of the traffic by the comparison of our theoretical distribution of the queue with the actual distribution observed over Heathrow airport. We discuss also the

  18. 2012 aerospace medical certification statistical handbook.

    Science.gov (United States)

    2013-12-01

    The annual Aerospace Medical Certification Statistical Handbook reports descriptive characteristics of all active U.S. civil aviation airmen and the aviation medical examiners (AMEs) that perform the required medical examinations. The 2012 annual...

  19. An introduction to inferential statistics: A review and practical guide

    International Nuclear Information System (INIS)

    Marshall, Gill; Jonker, Leon

    2011-01-01

    Building on the first part of this series regarding descriptive statistics, this paper demonstrates why it is advantageous for radiographers to understand the role of inferential statistics in deducing conclusions from a sample and their application to a wider population. This is necessary so radiographers can understand the work of others, can undertake their own research and evidence base their practice. This article explains p values and confidence intervals. It introduces the common statistical tests that comprise inferential statistics, and explains the use of parametric and non-parametric statistics. To do this, the paper reviews relevant literature, and provides a checklist of points to consider before and after applying statistical tests to a data set. The paper provides a glossary of relevant terms and the reader is advised to refer to this when any unfamiliar terms are used in the text. Together with the information provided on descriptive statistics in an earlier article, it can be used as a starting point for applying statistics in radiography practice and research.
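
    A brief, hypothetical illustration of the two quantities the article explains: the p value from a two-sample t-test and an approximate 95% confidence interval for the difference in means. The data are simulated, not taken from the article.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
a = rng.normal(100, 15, 40)   # e.g. readings under protocol A (invented)
b = rng.normal(108, 15, 40)   # e.g. readings under protocol B (invented)

t_stat, p_value = stats.ttest_ind(a, b)
diff = b.mean() - a.mean()
se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
ci = (diff - 1.96 * se, diff + 1.96 * se)   # normal-approximation 95% CI
print(f"p = {p_value:.4f}, difference = {diff:.1f}, 95% CI = ({ci[0]:.1f}, {ci[1]:.1f})")
```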

  20. An introduction to inferential statistics: A review and practical guide

    Energy Technology Data Exchange (ETDEWEB)

    Marshall, Gill, E-mail: gill.marshall@cumbria.ac.u [Faculty of Health, Medical Sciences and Social Care, University of Cumbria, Lancaster LA1 3JD (United Kingdom); Jonker, Leon [Faculty of Health, Medical Sciences and Social Care, University of Cumbria, Lancaster LA1 3JD (United Kingdom)

    2011-02-15

    Building on the first part of this series regarding descriptive statistics, this paper demonstrates why it is advantageous for radiographers to understand the role of inferential statistics in deducing conclusions from a sample and their application to a wider population. This is necessary so radiographers can understand the work of others, can undertake their own research and evidence base their practice. This article explains p values and confidence intervals. It introduces the common statistical tests that comprise inferential statistics, and explains the use of parametric and non-parametric statistics. To do this, the paper reviews relevant literature, and provides a checklist of points to consider before and after applying statistical tests to a data set. The paper provides a glossary of relevant terms and the reader is advised to refer to this when any unfamiliar terms are used in the text. Together with the information provided on descriptive statistics in an earlier article, it can be used as a starting point for applying statistics in radiography practice and research.

  1. 2011 aerospace medical certification statistical handbook.

    Science.gov (United States)

    2013-01-01

    The annual Aerospace Medical Certification Statistical Handbook reports descriptive characteristics of all active U.S. civil aviation airmen and the aviation medical examiners (AMEs) that perform the required medical examinations. The 2011 annual han...

  2. Theoretical approaches to the steady-state statistical physics of interacting dissipative units

    Science.gov (United States)

    Bertin, Eric

    2017-02-01

    The aim of this review is to provide a concise overview of some of the generic approaches that have been developed to deal with the statistical description of large systems of interacting dissipative ‘units’. The latter notion includes, e.g. inelastic grains, active or self-propelled particles, bubbles in a foam, low-dimensional dynamical systems like driven oscillators, or even spatially extended modes like Fourier modes of the velocity field in a fluid. We first review methods based on the statistical properties of a single unit, starting with elementary mean-field approximations, either static or dynamic, that describe a unit embedded in a ‘self-consistent’ environment. We then discuss how this basic mean-field approach can be extended to account for spatial dependences, in the form of space-dependent mean-field Fokker-Planck equations, for example. We also briefly review the use of kinetic theory in the framework of the Boltzmann equation, which is an appropriate description for dilute systems. We then turn to descriptions in terms of the full N-body distribution, starting from exact solutions of one-dimensional models, using a matrix-product ansatz method when correlations are present. Since exactly solvable models are scarce, we also present some approximation methods which can be used to determine the N-body distribution in a large system of dissipative units. These methods include the Edwards approach for dense granular matter and the approximate treatment of multiparticle Langevin equations with colored noise, which models systems of self-propelled particles. Throughout this review, emphasis is put on methodological aspects of the statistical modeling and on formal similarities between different physical problems, rather than on the specific behavior of a given system.

  3. STATCAT, Statistical Analysis of Parametric and Non-Parametric Data

    International Nuclear Information System (INIS)

    David, Hugh

    1990-01-01

    1 - Description of program or function: A suite of 26 programs designed to facilitate the appropriate statistical analysis and data handling of parametric and non-parametric data, using classical and modern univariate and multivariate methods. 2 - Method of solution: Data is read entry by entry, using a choice of input formats, and the resultant data bank is checked for out-of-range, rare, extreme or missing data. The completed STATCAT data bank can be treated by a variety of descriptive and inferential statistical methods, and modified, using other standard programs as required.

  4. Description of surface systems. Preliminary site description. Forsmark area Version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Lindborg, Tobias [ed.

    2005-06-01

    the biosphere. Methodologies for developing descriptive- and ecosystem models are only described briefly in this report, but for thorough methodology descriptions see references. The work has been conducted by the project group SurfaceNet together with other discipline-specific collaborators, engaged by members of the project group. The members of the project group represent the disciplines ecology, hydrology, Quaternary geology, soil science, limnology, oceanography, hydrogeology, hydrogeochemistry, environmental science, physical geography and human geography. In addition, some group members have specific qualifications of importance, e.g. experts in GIS modelling and in statistical data analysis.

  5. Description of surface systems. Preliminary site description. Forsmark area Version 1.2

    International Nuclear Information System (INIS)

    Lindborg, Tobias

    2005-06-01

    the biosphere. Methodologies for developing descriptive- and ecosystem models are only described briefly in this report, but for thorough methodology descriptions see references. The work has been conducted by the project group SurfaceNet together with other discipline-specific collaborators, engaged by members of the project group. The members of the project group represent the disciplines ecology, hydrology, Quaternary geology, soil science, limnology, oceanography, hydrogeology, hydrogeochemistry, environmental science, physical geography and human geography. In addition, some group members have specific qualifications of importance, e.g. experts in GIS modelling and in statistical data analysis

  6. Statistical Thermodynamics of Disperse Systems

    DEFF Research Database (Denmark)

    Shapiro, Alexander

    1996-01-01

    Principles of statistical physics are applied for the description of thermodynamic equilibrium in disperse systems. The cells of disperse systems are shown to possess a number of non-standard thermodynamic parameters. A random distribution of these parameters in the system is determined. On the basis of this distribution, it is established that the disperse system has an additional degree of freedom called the macro-entropy. A large set of bounded ideal disperse systems allows exact evaluation of thermodynamic characteristics. The theory developed is applied to the description of equilibrium

  7. A Review of Modeling Bioelectrochemical Systems: Engineering and Statistical Aspects

    Directory of Open Access Journals (Sweden)

    Shuai Luo

    2016-02-01

    Full Text Available Bioelectrochemical systems (BES) are promising technologies to convert organic compounds in wastewater to electrical energy through a series of complex physical-chemical, biological and electrochemical processes. Representative BES such as microbial fuel cells (MFCs) have been studied and advanced for energy recovery. Substantial experimental and modeling efforts have been made for investigating the processes involved in electricity generation toward the improvement of the BES performance for practical applications. However, there are many parameters that will potentially affect these processes, thereby making the optimization of system performance hard to achieve. Mathematical models, including engineering models and statistical models, are powerful tools to help understand the interactions among the parameters in BES and perform optimization of BES configuration/operation. This review paper aims to introduce and discuss the recent developments of BES modeling from engineering and statistical aspects, including analysis of the model structure, description of application cases and sensitivity analysis of various parameters. It is expected to serve as a compass for integrating the engineering and statistical modeling strategies to improve model accuracy for BES development.

  8. Accuracy of physical self-description among chronic exercisers and non-exercisers

    Directory of Open Access Journals (Sweden)

    Joseph M. Berning

    2014-10-01

    Full Text Available This study addressed the role of chronic exercise to enhance physical self-description as measured by self-estimated percent body fat. Accuracy of physical self-description was determined in normal-weight, regularly exercising and non-exercising males with similar body mass indices (BMIs) and females with similar BMIs (n=42 males and 45 females, of which 23 males and 23 females met criteria to be considered chronic exercisers). Statistical analyses were conducted to determine the degree of agreement between self-estimated percent body fat and actual laboratory measurements (hydrostatic weighing). Three statistical techniques were employed: Pearson correlation coefficients, Bland and Altman plots, and regression analysis. Agreement between measured and self-estimated percent body fat was superior for males and females who exercised chronically, compared to non-exercisers. The clinical implications are as follows. Satisfaction with one’s body can be influenced by several factors, including self-perceived body composition. Dissatisfaction can contribute to maladaptive and destructive weight management behaviors. The present study suggests that regular exercise provides a basis for more positive weight management behaviors by enhancing the accuracy of self-assessed body composition.
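
    Of the three techniques named, the Bland and Altman analysis is the least familiar to many readers; the sketch below computes its bias and limits of agreement for invented percent-body-fat pairs (the study's actual data are not reproduced here).

```python
import numpy as np

def bland_altman(measured, estimated):
    """Bias and 95% limits of agreement between two measurement methods."""
    diff = np.asarray(estimated) - np.asarray(measured)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)

# hypothetical percent body fat: hydrostatic weighing vs. self-estimate
measured  = np.array([18.0, 22.5, 15.2, 27.8, 20.1, 24.3])
estimated = np.array([17.0, 24.0, 16.5, 25.0, 20.0, 26.0])
print(bland_altman(measured, estimated))
```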

  9. Methodological quality and descriptive characteristics of prosthodontic-related systematic reviews.

    Science.gov (United States)

    Aziz, T; Compton, S; Nassar, U; Matthews, D; Ansari, K; Flores-Mir, C

    2013-04-01

    Ideally, healthcare systematic reviews (SRs) should be beneficial to practicing professionals in making evidence-based clinical decisions. However, the conclusions drawn from SRs are directly related to the quality of the SR and of the included studies. The aim was to investigate the methodological quality and key descriptive characteristics of SRs published in prosthodontics. Methodological quality was analysed using the Assessment of Multiple Reviews (AMSTAR) tool. Several electronic resources (MEDLINE, EMBASE, Web of Science and American Dental Association's Evidence-based Dentistry website) were searched. In total 106 SRs were located. Key descriptive characteristics and methodological quality features were gathered and assessed, and descriptive and inferential statistical testing performed. Most SRs in this sample originated from the European continent followed by North America. Two to five authors conducted most SRs; the majority was affiliated with academic institutions and had prior experience publishing SRs. The majority of SRs were published in specialty dentistry journals, with implant or implant-related topics, the primary topics of interest for most. According to AMSTAR, most quality aspects were adequately fulfilled by less than half of the reviews. Publication bias and grey literature searches were the most poorly adhered components. Overall, the methodological quality of the prosthodontic-related systematic was deemed limited. Future recommendations would include authors to have prior training in conducting SRs and for journals to include a universal checklist that should be adhered to address all key characteristics of an unbiased SR process. © 2013 Blackwell Publishing Ltd.

  10. Applied Statistics Using SPSS, STATISTICA, MATLAB and R

    CERN Document Server

    De Sá, Joaquim P Marques

    2007-01-01

    This practical reference provides a comprehensive introduction and tutorial on the main statistical analysis topics, demonstrating their solution with the most common software packages. Intended for anyone needing to apply statistical analysis to a large variety of science and engineering problems, the book explains and shows how to use SPSS, MATLAB, STATISTICA and R for analysis such as data description, statistical inference, classification and regression, factor analysis, survival data and directional statistics. It concisely explains key concepts and methods, illustrated by practical examp

  11. A Catalog of Solar White-Light Flares (1859-1982), Including Their Statistical Properties and Associated Emissions.

    Science.gov (United States)

    1983-09-20

    comprehensively studied. Among those aspects in need of further observational description are (1) morphology (a description of the various WLF forms)... To the best of our knowledge the final list of events is comprehensive, although we have recently...

  12. Statistical quality management using miniTAB 14

    International Nuclear Information System (INIS)

    An, Seong Jin

    2007-01-01

    This book explains statistical quality management, giving descriptions of the definition of quality, quality management, quality cost, basic methods of quality management, principles of control charts, control charts for variables, control charts for attributes, capability analysis, other issues of statistical process control, acceptance sampling, sampling for variables acceptance, design and analysis of experiments, Taguchi quality engineering, response surface methodology and reliability analysis.
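
    As a small, hypothetical example of one tool from that list, the snippet below computes Shewhart X-bar control chart limits from subgroup means and ranges, using the standard constant A2 = 0.577 for subgroups of five; it illustrates the principle only, not Minitab's implementation.

```python
import numpy as np

rng = np.random.default_rng(5)
subgroups = rng.normal(loc=50.0, scale=2.0, size=(25, 5))  # 25 subgroups of 5 parts

xbar = subgroups.mean(axis=1)
rbar = (subgroups.max(axis=1) - subgroups.min(axis=1)).mean()
center = xbar.mean()
ucl, lcl = center + 0.577 * rbar, center - 0.577 * rbar     # A2 * R-bar
print(f"CL = {center:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
print("out-of-control subgroups:", np.where((xbar > ucl) | (xbar < lcl))[0])
```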

  13. Practicing Statistics by Creating Exercises for Fellow Students

    Science.gov (United States)

    Bebermeier, Sarah; Reiss, Katharina

    2016-01-01

    This article outlines the execution of a workshop in which students were encouraged to actively review the course contents on descriptive statistics by creating exercises for their fellow students. In a first-year statistics course in psychology, 39 out of 155 students participated in the workshop. In a subsequent evaluation, the workshop was…

  14. Chinese legal texts – Quantitative Description

    Directory of Open Access Journals (Sweden)

    Ľuboš GAJDOŠ

    2017-06-01

    Full Text Available The aim of the paper is to provide a quantitative description of legal Chinese. This study adopts a corpus-based approach and presents basic statistical parameters of legal texts in Chinese, namely sentence length, the proportions of parts of speech, etc. The research is conducted on the Chinese monolingual corpus Hanku. The paper also discusses issues of statistical data processing from various corpora, e.g. tokenisation and part-of-speech tagging, and their relevance to the study of register variation.
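
    A toy sketch of the kind of summary the paper reports, computed from pre-tagged text: mean sentence length in tokens and part-of-speech proportions. The (token, tag) pairs below are invented stand-ins for the output of a corpus pipeline such as Hanku's.

```python
from collections import Counter

sentences = [
    [("法院", "n"), ("审理", "v"), ("案件", "n")],
    [("当事人", "n"), ("应当", "v"), ("依法", "d"), ("履行", "v"), ("义务", "n")],
]

lengths = [len(s) for s in sentences]
tags = Counter(tag for sentence in sentences for _, tag in sentence)
total = sum(tags.values())

print("mean sentence length:", sum(lengths) / len(lengths))
print({tag: round(count / total, 2) for tag, count in tags.items()})
```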

  15. Fundamentals of modern statistical methods substantially improving power and accuracy

    CERN Document Server

    Wilcox, Rand R

    2001-01-01

    Conventional statistical methods have a very serious flaw. They routinely miss differences among groups or associations among variables that are detected by more modern techniques - even under very small departures from normality. Hundreds of journal articles have described the reasons standard techniques can be unsatisfactory, but simple, intuitive explanations are generally unavailable. Improved methods have been derived, but they are far from obvious or intuitive based on the training most researchers receive. Situations arise where even highly nonsignificant results become significant when analyzed with more modern methods. Without assuming any prior training in statistics, Part I of this book describes basic statistical principles from a point of view that makes their shortcomings intuitive and easy to understand. The emphasis is on verbal and graphical descriptions of concepts. Part II describes modern methods that address the problems covered in Part I. Using data from actual studies, many examples are included...

  16. Review of Naked Statistics: Stripping the Dread from Data by Charles Wheelan

    Directory of Open Access Journals (Sweden)

    Michael T. Catalano

    2015-01-01

    Full Text Available Wheelan, Charles. Naked Statistics: Stripping the Dread from Data (New York, NY: W. W. Norton & Company, 2014). 282 pp. ISBN 978-0-393-07195-5. In his review of What Numbers Say and The Numbers Game, Rob Root (Numeracy 3(1): 9) writes “Popular books on quantitative literacy need to be easy to read, reasonably comprehensive in scope, and include examples that are thought-provoking and memorable.” Wheelan’s book certainly meets this description, and should be of interest to both the general public and those with a professional interest in numeracy. A moderately diligent learner can get a decent understanding of basic statistics from the book. Teachers of statistics and quantitative literacy will find a wealth of well-related examples and stories to use in their classes.

  17. Preliminary site description Forsmark area - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Skagius, Kristina [ed.

    2005-06-01

    The Swedish Nuclear Fuel and Waste Management Company (SKB) is undertaking site characterisation at two different locations, the Forsmark and Simpevarp areas, with the objective of siting a geological repository for spent nuclear fuel. An integrated component in the characterisation work is the development of a site descriptive model that constitutes a description of the site and its regional setting, covering the current state of the geosphere and the biosphere as well as those ongoing natural processes that affect their long-term evolution. The present report documents the site descriptive modelling activities (version 1.2) for the Forsmark area. The overall objectives of the version 1.2 site descriptive modelling are to produce and document an integrated description of the site and its regional environments based on the site-specific data available from the initial site investigations and to give recommendations on continued investigations. The modelling work is based on primary data, i.e. quality-assured, geoscientific and ecological field data available in the SKB databases SICADA and GIS, available July 31, 2004. The work has been conducted by a project group and associated discipline-specific working groups. The members of the project group represent the disciplines of geology, rock mechanics, thermal properties, hydrogeology, hydrogeochemistry, transport properties and surface ecosystems (including overburden, surface hydrogeochemistry and hydrology). In addition, some group members have specific qualifications of importance in this type of project e.g. expertise in RVS (Rock Visualisation System) modelling, GIS-modelling and in statistical data analysis. The overall strategy to achieve a site description is to develop discipline-specific models by interpretation and analyses of the primary data. The different discipline-specific models are then integrated into a site description. Methodologies for developing the discipline-specific models are documented in

  18. Preliminary site description Forsmark area - version 1.2

    International Nuclear Information System (INIS)

    Skagius, Kristina

    2005-06-01

    The Swedish Nuclear Fuel and Waste Management Company (SKB) is undertaking site characterisation at two different locations, the Forsmark and Simpevarp areas, with the objective of siting a geological repository for spent nuclear fuel. An integrated component in the characterisation work is the development of a site descriptive model that constitutes a description of the site and its regional setting, covering the current state of the geosphere and the biosphere as well as those ongoing natural processes that affect their long-term evolution. The present report documents the site descriptive modelling activities (version 1.2) for the Forsmark area. The overall objectives of the version 1.2 site descriptive modelling are to produce and document an integrated description of the site and its regional environments based on the site-specific data available from the initial site investigations and to give recommendations on continued investigations. The modelling work is based on primary data, i.e. quality-assured, geoscientific and ecological field data available in the SKB databases SICADA and GIS, available July 31, 2004. The work has been conducted by a project group and associated discipline-specific working groups. The members of the project group represent the disciplines of geology, rock mechanics, thermal properties, hydrogeology, hydrogeochemistry, transport properties and surface ecosystems (including overburden, surface hydrogeochemistry and hydrology). In addition, some group members have specific qualifications of importance in this type of project e.g. expertise in RVS (Rock Visualisation System) modelling, GIS-modelling and in statistical data analysis. The overall strategy to achieve a site description is to develop discipline-specific models by interpretation and analyses of the primary data. The different discipline-specific models are then integrated into a site description. Methodologies for developing the discipline-specific models are documented in

  19. CHEMICAL REACTIONS ON ADSORBING SURFACE: KINETIC LEVEL OF DESCRIPTION

    Directory of Open Access Journals (Sweden)

    P.P.Kostrobii

    2003-01-01

    Full Text Available Based on the effective Hubbard model we suggest a statistical description of reaction-diffusion processes for bimolecular chemical reactions of gas particles adsorbed on the metallic surface. The system of transport equations for description of particles diffusion as well as reactions is obtained. We carry out the analysis of the contributions of all physical processes to the formation of diffusion coefficients and chemical reactions constants.

  20. Danish electricity supply. Statistics 2003

    International Nuclear Information System (INIS)

    2004-01-01

    The Association of Danish Electric Utilities each year issues the statistical yearbook 'Danish electricity supply'. By means of brief text, figures, and tables a description is given of the electric supply sector. The report presents data for the year 2003 for consumption, prices of electric power, power generation and transmission, and trade. (ln)

  1. Danish electricity supply. Statistics 2000

    International Nuclear Information System (INIS)

    2001-07-01

    The Association of Danish Electric Utilities each year issues the statistical yearbook 'Danish electricity supply'. By means of brief text, figures, and tables a description is given of the electric supply sector. The report presents data for the year 2000 for consumption, prices of electric power, power generation and transmission, and trade. (ln)

  2. Danish electricity supply. Statistics 2002

    International Nuclear Information System (INIS)

    2003-01-01

    The Association of Danish Electric Utilities each year issues the statistical yearbook 'Danish electricity supply'. By means of brief text, figures, and tables a description is given of the electric supply sector. The report presents data for the year 2002 for consumption, prices of electric power, power generation and transmission, and trade. (ln)

  3. Application of fractal theory in refined reservoir description for EOR pilot area

    Energy Technology Data Exchange (ETDEWEB)

    Yue Li; Yonggang Duan; Yun Li; Yuan Lu

    1997-08-01

    A reliable reservoir description is essential for investigating scenarios for a successful EOR pilot test. Reservoir characterization includes formation composition, permeability, porosity, reservoir fluids and other petrophysical parameters. In this study, various new tools have been applied to characterize the Kilamayi conglomerate formation. This paper examines the merits of various statistical methods for recognizing rock property correlation in vertical columns and presents methods to determine the fractal dimension, including R/S analysis and power spectral analysis. The paper also demonstrates that there are obvious fractal characteristics in the conglomerate reservoirs of the Kilamayi oil fields. Well log data in the EOR pilot area are used to obtain distribution profiles of parameters including permeability, porosity, water saturation and shale content.
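
    The R/S (rescaled-range) analysis mentioned above can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration of estimating a Hurst exponent H from a well-log trace by regressing log(R/S) against log(window size); for a one-dimensional log profile the fractal dimension is then commonly taken as D = 2 - H. The data, window sizes and variable names are placeholders, not the authors' implementation.

```python
import numpy as np

def rescaled_range(series, window):
    """Average R/S statistic over non-overlapping windows of a given size."""
    rs_values = []
    for start in range(0, len(series) - window + 1, window):
        chunk = series[start:start + window]
        deviations = np.cumsum(chunk - chunk.mean())
        r = deviations.max() - deviations.min()   # range of cumulative deviations
        s = chunk.std(ddof=1)                     # standard deviation of the window
        if s > 0:
            rs_values.append(r / s)
    return np.mean(rs_values)

def hurst_exponent(series, windows=(8, 16, 32, 64, 128)):
    """Slope of log(R/S) versus log(window size) estimates the Hurst exponent H."""
    log_n = np.log(windows)
    log_rs = np.log([rescaled_range(series, w) for w in windows])
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

# Placeholder "well-log" trace; real porosity or permeability logs would go here.
rng = np.random.default_rng(0)
log_trace = rng.normal(size=1024)
H = hurst_exponent(log_trace)
print(f"Hurst exponent H = {H:.2f}, fractal dimension D = {2 - H:.2f}")
```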

  4. Boltzmann and Einstein: Statistics and dynamics –An unsolved ...

    Indian Academy of Sciences (India)

    The struggle of Boltzmann with the proper description of the behavior of classical macroscopic bodies in equilibrium in terms of the properties of the particles out of which they consist will be sketched. He used both a dynamical and a statistical method. However, Einstein strongly disagreed with Boltzmann's statistical method ...

  5. Fracture criterion for brittle materials based on statistical cells of finite volume

    International Nuclear Information System (INIS)

    Cords, H.; Kleist, G.; Zimmermann, R.

    1986-06-01

    An analytical consideration of the Weibull statistical analysis of brittle materials established the necessity of including one additional material constant for a more comprehensive description of the failure behaviour. The Weibull analysis is restricted to infinitesimal volume elements as a consequence of the differential calculus applied. It was found that infinitesimally small elements are in conflict with the basic statistical assumption, and that the differential calculus is in fact not needed, since most stress analyses are nowadays based on finite element calculations, which are well suited to a subsequent statistical analysis of strength. The size of a finite statistical cell has been introduced as the third material parameter. It should represent the minimum volume containing all statistical features of the material, such as the distribution of pores, flaws and grains. The new approach also contains a unique treatment of failure under multiaxial stresses: the quantity responsible for failure under multiaxial stresses is introduced as a modified strain energy. Sixteen different tensile specimens, including CT specimens, have been investigated experimentally and analyzed with the probabilistic fracture criterion. As a result, it can be stated that the failure rates of all types of specimens made from three different grades of graphite are predictable. The accuracy of the prediction is one standard deviation. (orig.) [de
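
    As a rough sketch of the weakest-link idea described above, the snippet below assembles a Weibull-type failure probability from finite volume elements rather than infinitesimal ones, with the volume of the statistical cell entering as the additional material constant. The stresses, Weibull modulus, reference strength and cell volume are invented for illustration, and the simple maximum-principal-stress treatment is a stand-in for the authors' modified strain-energy criterion.

```python
import numpy as np

def weibull_failure_probability(stresses, volumes, m, sigma_0, v_cell):
    """
    Weakest-link failure probability assembled from finite statistical cells.

    stresses : maximum principal stress in each finite element (MPa)
    volumes  : volume of each finite element (mm^3)
    m        : Weibull modulus
    sigma_0  : characteristic strength of one statistical cell (MPa)
    v_cell   : volume of the statistical cell (mm^3), the extra material constant
    """
    stresses = np.asarray(stresses, dtype=float)
    volumes = np.asarray(volumes, dtype=float)
    tensile = np.clip(stresses, 0.0, None)   # only tensile stresses drive brittle failure here
    risk = np.sum((volumes / v_cell) * (tensile / sigma_0) ** m)
    return 1.0 - np.exp(-risk)

# Hypothetical element stresses and volumes, e.g. taken from a finite element calculation
stresses = [55.0, 48.0, 60.0, 30.0]   # MPa
volumes = [2.0, 2.0, 1.5, 3.0]        # mm^3
print(weibull_failure_probability(stresses, volumes, m=8.0, sigma_0=70.0, v_cell=1.0))
```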

  6. Applied statistics for social and management sciences

    CERN Document Server

    Miah, Abdul Quader

    2016-01-01

    This book addresses the application of statistical techniques and methods across a wide range of disciplines. While its main focus is on the application of statistical methods, theoretical aspects are also provided as fundamental background information. It offers a systematic interpretation of results often discovered in general descriptions of methods and techniques such as linear and non-linear regression. SPSS is also used in all the application aspects. The presentation of data in the form of tables and graphs throughout the book not only guides users, but also explains the statistical application and assists readers in interpreting important features. The analysis of statistical data is presented consistently throughout the text. Academic researchers, practitioners and other users who work with statistical data will benefit from reading Applied Statistics for Social and Management Sciences.

  7. Density by Moduli and Lacunary Statistical Convergence

    Directory of Open Access Journals (Sweden)

    Vinod K. Bhardwaj

    2016-01-01

    Full Text Available We have introduced and studied a new concept of f-lacunary statistical convergence, where f is an unbounded modulus. It is shown that, under certain conditions on a modulus f, the concepts of lacunary strong convergence with respect to a modulus f and f-lacunary statistical convergence are equivalent on bounded sequences. We further characterize those θ for which S_θ^f = S^f, where S_θ^f and S^f denote the sets of all f-lacunary statistically convergent sequences and f-statistically convergent sequences, respectively. A general description of inclusion between two arbitrary lacunary methods of f-statistical convergence is given. Finally, we give an S_θ^f-analog of the Cauchy criterion for convergence, and a Tauberian theorem for S_θ^f-convergence is also proved.
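
    For readers unfamiliar with the notation, the definitions involved are usually stated as follows (this is the standard formulation from the literature on density by moduli and may differ in minor details from the paper). Given an unbounded modulus f and a lacunary sequence θ = (k_r) with intervals I_r = (k_{r-1}, k_r] and lengths h_r = k_r - k_{r-1}, a sequence (x_k) belongs to S_θ^f (is f-lacunary statistically convergent to L) when the first condition below holds, and to S^f (is f-statistically convergent to L) when the second holds:

```latex
% f-lacunary statistical convergence (S_theta^f)
\lim_{r \to \infty} \frac{f\bigl(\lvert \{\, k \in I_r : \lvert x_k - L \rvert \ge \varepsilon \,\} \rvert\bigr)}{f(h_r)} = 0
\quad \text{for every } \varepsilon > 0,

% f-statistical convergence (S^f)
\lim_{n \to \infty} \frac{f\bigl(\lvert \{\, k \le n : \lvert x_k - L \rvert \ge \varepsilon \,\} \rvert\bigr)}{f(n)} = 0
\quad \text{for every } \varepsilon > 0.
```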

  8. Identifying heat-related deaths by using medical examiner and vital statistics data: Surveillance analysis and descriptive epidemiology - Oklahoma, 1990-2011.

    Science.gov (United States)

    Johnson, Matthew G; Brown, Sheryll; Archer, Pam; Wendelboe, Aaron; Magzamen, Sheryl; Bradley, Kristy K

    2016-10-01

    Approximately 660 deaths occur annually in the United States associated with excess natural heat. A record heat wave in Oklahoma during 2011 generated increased interest concerning heat-related mortality among public health preparedness partners. We aimed to improve surveillance for heat-related mortality and better characterize heat-related deaths in Oklahoma during 1990-2011, and to enhance public health messaging during future heat emergencies. Heat-related deaths were identified by querying vital statistics (VS) and medical examiner (ME) data during 1990-2011. Case inclusion criteria were developed by using heat-related International Classification of Diseases codes, cause-of-death nomenclature, and ME investigation narrative. We calculated sensitivity and predictive value positive (PVP) for heat-related mortality surveillance by using VS and ME data and performed a descriptive analysis. During the study period, 364 confirmed and probable heat-related deaths were identified when utilizing both data sets. ME reports had 87% sensitivity and 74% PVP; VS reports had 80% sensitivity and 52% PVP. Compared to Oklahoma's general population, decedents were disproportionately male (67% vs. 49%), aged ≥65 years (46% vs. 14%), and unmarried (78% vs. 47%). Higher rates of heat-related mortality were observed among Blacks. Of 95 decedents with available information, 91 (96%) did not use air conditioning. Linking ME and VS data sources together and using narrative description for case classification allows for improved case ascertainment and surveillance data quality. Males, Blacks, persons aged ≥65 years, unmarried persons, and those without air conditioning carry a disproportionate burden of the heat-related deaths in Oklahoma. Published by Elsevier Inc.
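
    For readers less familiar with the surveillance metrics quoted above, sensitivity and predictive value positive (PVP) are simple ratios of case counts; a minimal sketch follows. The counts used here are placeholders, not the Oklahoma figures.

```python
def sensitivity(true_positives, false_negatives):
    """Proportion of all true heat-related deaths that the data source detected."""
    return true_positives / (true_positives + false_negatives)

def predictive_value_positive(true_positives, false_positives):
    """Proportion of deaths flagged by the data source that were truly heat-related."""
    return true_positives / (true_positives + false_positives)

# Placeholder counts for one data source (not the values reported in the study)
tp, fn, fp = 300, 64, 110
print(f"Sensitivity: {sensitivity(tp, fn):.0%}")
print(f"PVP:         {predictive_value_positive(tp, fp):.0%}")
```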

  9. Thermodynamical description of excited nuclei

    International Nuclear Information System (INIS)

    Bonche, P.

    1989-01-01

    In heavy ion collisions it has been possible to obtain composite systems at rather high excitation energies corresponding to temperatures of several MeV. The theoretical studies of these systems are based on concepts borrowed from thermodynamics or statistical physics, such as the temperature. In these lectures, we present the concepts of statistical physics which are involved in the physics of heavy ions as they are produced nowadays in the laboratory and also during the final stage of a supernova collapse. We do not attempt to describe the reaction mechanisms which yield such nuclear systems nor their decay by evaporation or fragmentation. We shall only study their static properties. The content of these lectures is organized in four main sections. The first one gives the basic features of statistical physics and thermodynamics necessary to understand quantum mechanics at finite temperature. In the second one, we present a study of the liquid-gas phase transition in nuclear physics. A phenomenological approach to the stability of hot nuclei follows. The microscopic point of view is proposed in the third part. Starting from the basic concepts derived in the first part, it provides a description of excited or hot nuclei which confirms the qualitative results of the second part. Furthermore, it gives a full description of most properties of these nuclei as a function of temperature. Finally, in the last part, a microscopic derivation of the equation of state of nuclear matter is proposed to study the collapse of a supernova core.

  10. Three dimensional model for particle saltation close to stream beds, including a detailed description of the particle interaction with turbulence and inter-particle collisions

    KAUST Repository

    Moreno, Pablo M.

    2011-05-19

    We present in this paper a new three-dimensional (3-D) model for bed-load sediment transport, based on a Lagrangian description. We analyze generalized sub-models for the velocities after collision and the representation of the bed-roughness. The free-flight sub-model includes the effect of several forces, such as buoyancy, drag, virtual mass, lift, Basset and Magnus, and also addresses the particle rotation. A recent methodology for saving computational time in the Basset force is also employed. The sub-models for the post-collision velocity and rotation are based on the conservation of linear and angular momentum during the collision with the bed. We develop a new 3-D representation for the bed roughness by using geometric considerations. In order to address the interaction of particles with the turbulent flow, we tracked the particles through a computed turbulent velocity field for a smooth flat plate. This velocity field was used as a surrogate of the 3-D turbulent conditions close to the bed in streams. We first checked that the basic turbulence statistics for this velocity field could be used to approximate those in an open-channel flow. We then analyzed the interaction of the sediment and the turbulence for a single and multiple particles. We compared numerical results with experimental data obtained by Niño and García (1998b). We show that model predictions are in good agreement with existing data, in the sand size range. © 2011 ASCE.

  11. Three dimensional model for particle saltation close to stream beds, including a detailed description of the particle interaction with turbulence and inter-particle collisions

    KAUST Repository

    Moreno, Pablo M.; Bombardelli, Fabián A.; González, Andrea E.; Calo, Victor M.

    2011-01-01

    We present in this paper a new three-dimensional (3-D) model for bed-load sediment transport, based on a Lagrangian description. We analyze generalized sub-models for the velocities after collision and the representation of the bed-roughness. The free-flight sub-model includes the effect of several forces, such as buoyancy, drag, virtual mass, lift, Basset and Magnus, and also addresses the particle rotation. A recent methodology for saving computational time in the Basset force is also employed. The sub-models for the post-collision velocity and rotation are based on the conservation of linear and angular momentum during the collision with the bed. We develop a new 3-D representation for the bed roughness by using geometric considerations. In order to address the interaction of particles with the turbulent flow, we tracked the particles through a computed turbulent velocity field for a smooth flat plate. This velocity field was used as a surrogate of the 3-D turbulent conditions close to the bed in streams. We first checked that the basic turbulence statistics for this velocity field could be used to approximate those in an open-channel flow. We then analyzed the interaction of the sediment and the turbulence for a single and multiple particles. We compared numerical results with experimental data obtained by Niño and García (1998b). We show that model predictions are in good agreement with existing data, in the sand size range. © 2011 ASCE.

  12. Using Facebook Data to Turn Introductory Statistics Students into Consultants

    Science.gov (United States)

    Childers, Adam F.

    2017-01-01

    Facebook provides businesses and organizations with copious data that describe how users are interacting with their page. This data affords an excellent opportunity to turn introductory statistics students into consultants to analyze the Facebook data using descriptive and inferential statistics. This paper details a semester-long project that…

  13. Statistics Anxiety and Business Statistics: The International Student

    Science.gov (United States)

    Bell, James A.

    2008-01-01

    Does the international student suffer from statistics anxiety? To investigate this, the Statistics Anxiety Rating Scale (STARS) was administered to sixty-six beginning statistics students, including twelve international students and fifty-four domestic students. Due to the small number of international students, nonparametric methods were used to…

  14. Including the Tukey Mean-Difference (Bland-Altman) Plot in a Statistics Course

    Science.gov (United States)

    Kozak, Marcin; Wnuk, Agnieszka

    2014-01-01

    The Tukey mean-difference plot, also called the Bland-Altman plot, is a recognized graphical tool in the exploration of biometrical data. We show that this technique deserves a place on an introductory statistics course by encouraging students to think about the kind of graph they wish to create, rather than just creating the default graph for the…
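
    For instructors who want a concrete starting point, the sketch below builds a basic Tukey mean-difference (Bland-Altman) plot for two hypothetical measurement methods with Python and matplotlib, marking the mean difference and the conventional mean ± 1.96 SD limits of agreement. It illustrates the plot generically and is not taken from the article.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical paired measurements from two methods
rng = np.random.default_rng(1)
method_a = rng.normal(100, 10, size=50)
method_b = method_a + rng.normal(1.5, 4, size=50)   # method B reads slightly higher

means = (method_a + method_b) / 2
diffs = method_a - method_b
bias = diffs.mean()
loa = 1.96 * diffs.std(ddof=1)                      # half-width of the limits of agreement

plt.scatter(means, diffs)
plt.axhline(bias, linestyle="-", label="mean difference")
plt.axhline(bias + loa, linestyle="--", label="mean difference ± 1.96 SD")
plt.axhline(bias - loa, linestyle="--")
plt.xlabel("Mean of the two methods")
plt.ylabel("Difference (method A - method B)")
plt.legend()
plt.show()
```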

  15. Mathematical statistics essays on history and methodology

    CERN Document Server

    Pfanzagl, Johann

    2017-01-01

    This book presents a detailed description of the development of statistical theory. In the mid twentieth century, the development of mathematical statistics underwent an enduring change, due to the advent of more refined mathematical tools. New concepts like sufficiency, superefficiency, adaptivity etc. motivated scholars to reflect upon the interpretation of mathematical concepts in terms of their real-world relevance. Questions concerning the optimality of estimators, for instance, had remained unanswered for decades, because a meaningful concept of optimality (based on the regularity of the estimators, the representation of their limit distribution and assertions about their concentration by means of Anderson’s Theorem) was not yet available. The rapidly developing asymptotic theory provided approximate answers to questions for which non-asymptotic theory had found no satisfying solutions. In four engaging essays, this book presents a detailed description of how the use of mathematical methods stimulated...

  16. Description of rainfall variability in Brhat-samhita of Varâha-mihira

    OpenAIRE

    Iyengar, RN

    2004-01-01

    Brhat-samhita of Varâha-mihira (5th–6th century AD) provides valuable information on the approach in ancient India towards monsoon rainfall, including its measurement and forecasting. In this context, we come across a description of the expected amount of total seasonal rainfall depending on the first rains under the 27 nakshatras of Indian astronomy. This provides a rough statistical picture of what might have been the rainfall and its variability in the region around Ujjain, where Varâha-mi...

  17. Statistical theory of neutron-nuclear reactions

    International Nuclear Information System (INIS)

    Moldauer, P.A.

    1981-01-01

    In addition to the topics dealt with by the author in his lectures at the Joint IAEA/ICTP Course held at Trieste in 1978, recent developments in the statistical theory of multistep reactions are reviewed as well as the transport theory and intranuclear cascade approaches to the description of nuclear multi-step processes. (author)

  18. Study designs, use of statistical tests, and statistical analysis software choice in 2015: Results from two Pakistani monthly Medline indexed journals.

    Science.gov (United States)

    Shaikh, Masood Ali

    2017-09-01

    Assessment of research articles in terms of study designs used, statistical tests applied and the use of statistical analysis programmes helps determine the research activity profile and trends in a country. In this descriptive study, all original articles published by the Journal of Pakistan Medical Association (JPMA) and the Journal of the College of Physicians and Surgeons Pakistan (JCPSP) in the year 2015 were reviewed in terms of study designs used, application of statistical tests, and the use of statistical analysis programmes. JPMA and JCPSP published 192 and 128 original articles, respectively, in the year 2015. Results of this study indicate that the cross-sectional study design, bivariate inferential statistical analysis entailing comparison between two variables/groups, and the statistical software programme SPSS were the most common study design, inferential statistical analysis, and statistical analysis software programme, respectively. These results echo a previously published assessment of these two journals for the year 2014.

  19. A statistical model for estimation of fish density including correlation in size, space, time and between species from research survey data

    DEFF Research Database (Denmark)

    Nielsen, J. Rasmus; Kristensen, Kasper; Lewy, Peter

    2014-01-01

    Trawl survey data with high spatial and seasonal coverage were analysed using a variant of the Log Gaussian Cox Process (LGCP) statistical model to estimate unbiased relative fish densities. The model estimates correlations between observations according to time, space, and fish size and includes...

  20. Statistical thermodynamics understanding the properties of macroscopic systems

    CERN Document Server

    Fai, Lukong Cornelius

    2012-01-01

    Basic Principles of Statistical Physics; Microscopic and Macroscopic Description of States; Basic Postulates; Gibbs Ergodic Assumption; Gibbsian Ensembles; Experimental Basis of Statistical Mechanics; Definition of Expectation Values; Ergodic Principle and Expectation Values; Properties of Distribution Function; Relative Fluctuation of an Additive Macroscopic Parameter; Liouville Theorem; Gibbs Microcanonical Ensemble; Microcanonical Distribution in Quantum Mechanics; Density Matrix; Density Matrix in Energy Representation; Entropy; Thermodynamic Functions; Temperature; Adiabatic Processes; Pressure; Thermodynamic Identity; Laws of Th...

  1. Visualizing Summary Statistics and Uncertainty

    KAUST Repository

    Potter, K.; Kniss, J.; Riesenfeld, R.; Johnson, C.R.

    2010-01-01

    The graphical depiction of uncertainty information is emerging as a problem of great importance. Scientific data sets are not considered complete without indications of error, accuracy, or levels of confidence. The visual portrayal of this information is a challenging task. This work takes inspiration from graphical data analysis to create visual representations that show not only the data value, but also important characteristics of the data including uncertainty. The canonical box plot is reexamined and a new hybrid summary plot is presented that incorporates a collection of descriptive statistics to highlight salient features of the data. Additionally, we present an extension of the summary plot to two dimensional distributions. Finally, a use-case of these new plots is presented, demonstrating their ability to present high-level overviews as well as detailed insight into the salient features of the underlying data distribution. © 2010 The Eurographics Association and Blackwell Publishing Ltd.
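
    As a generic illustration of augmenting the canonical box plot with additional descriptive statistics, the sketch below overlays the sample mean and a ± 1 SD band on an ordinary box plot. It is only a simplified stand-in for the hybrid summary plot proposed by the authors, with hypothetical data.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
data = rng.gamma(shape=2.0, scale=1.5, size=200)   # hypothetical skewed sample

fig, ax = plt.subplots()
ax.boxplot(data, positions=[1], widths=0.4)        # quartiles, median, whiskers, outliers

mean, std = data.mean(), data.std(ddof=1)
ax.scatter([1], [mean], marker="D", zorder=3, label="mean")
ax.axhspan(mean - std, mean + std, xmin=0.35, xmax=0.65, alpha=0.2, label="mean ± 1 SD")
ax.set_ylabel("Value")
ax.legend()
plt.show()
```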

  2. Visualizing Summary Statistics and Uncertainty

    KAUST Repository

    Potter, K.

    2010-08-12

    The graphical depiction of uncertainty information is emerging as a problem of great importance. Scientific data sets are not considered complete without indications of error, accuracy, or levels of confidence. The visual portrayal of this information is a challenging task. This work takes inspiration from graphical data analysis to create visual representations that show not only the data value, but also important characteristics of the data including uncertainty. The canonical box plot is reexamined and a new hybrid summary plot is presented that incorporates a collection of descriptive statistics to highlight salient features of the data. Additionally, we present an extension of the summary plot to two dimensional distributions. Finally, a use-case of these new plots is presented, demonstrating their ability to present high-level overviews as well as detailed insight into the salient features of the underlying data distribution. © 2010 The Eurographics Association and Blackwell Publishing Ltd.

  3. All of statistics a concise course in statistical inference

    CERN Document Server

    Wasserman, Larry

    2004-01-01

    This book is for people who want to learn probability and statistics quickly. It brings together many of the main ideas in modern statistics in one place. The book is suitable for students and researchers in statistics, computer science, data mining and machine learning. This book covers a much wider range of topics than a typical introductory text on mathematical statistics. It includes modern topics like nonparametric curve estimation, bootstrapping and classification, topics that are usually relegated to follow-up courses. The reader is assumed to know calculus and a little linear algebra. No previous knowledge of probability and statistics is required. The text can be used at the advanced undergraduate and graduate level. Larry Wasserman is Professor of Statistics at Carnegie Mellon University. He is also a member of the Center for Automated Learning and Discovery in the School of Computer Science. His research areas include nonparametric inference, asymptotic theory, causality, and applications to astrophysics, bi...

  4. Statistical methods for including two-body forces in large system calculations

    International Nuclear Information System (INIS)

    Grimes, S.M.

    1980-07-01

    Large systems of interacting particles are often treated by assuming that the effect on any one particle of the remaining N-1 may be approximated by an average potential. This approach reduces the problem to that of finding the bound-state solutions for a particle in a potential; statistical mechanics is then used to obtain the properties of the many-body system. In some physical systems this approach may not be acceptable, because the two-body force component cannot be treated in this one-body limit. A technique for incorporating two-body forces in such calculations in a more realistic fashion is described. 1 figure

  5. Statistics of DNA Markers - RGP gmap | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available RGP gmap. Data name: Statistics of DNA Markers. DOI: 10.18...908/lsdba.nbdc00318-01-001. Description of data contents: statistics of DNA markers that were used to create t... (LSDB Archive)

  6. Linking the Resource Description Framework to cheminformatics and proteochemometrics

    Directory of Open Access Journals (Sweden)

    Willighagen Egon L

    2011-03-01

    Full Text Available Abstract Background Semantic web technologies are finding their way into the life sciences. Ontologies and semantic markup have already been used for more than a decade in molecular sciences, but have not found widespread use yet. The semantic web technology Resource Description Framework (RDF and related methods show to be sufficiently versatile to change that situation. Results The work presented here focuses on linking RDF approaches to existing molecular chemometrics fields, including cheminformatics, QSAR modeling and proteochemometrics. Applications are presented that link RDF technologies to methods from statistics and cheminformatics, including data aggregation, visualization, chemical identification, and property prediction. They demonstrate how this can be done using various existing RDF standards and cheminformatics libraries. For example, we show how IC50 and Ki values are modeled for a number of biological targets using data from the ChEMBL database. Conclusions We have shown that existing RDF standards can suitably be integrated into existing molecular chemometrics methods. Platforms that unite these technologies, like Bioclipse, makes this even simpler and more transparent. Being able to create and share workflows that integrate data aggregation and analysis (visual and statistical is beneficial to interoperability and reproducibility. The current work shows that RDF approaches are sufficiently powerful to support molecular chemometrics workflows.
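
    To make the RDF-to-statistics link concrete, here is a small hypothetical sketch using the Python rdflib package: a few activity values are stored as triples and then retrieved with a SPARQL query for ordinary numerical summarizing. The namespace, predicate names and values are invented for illustration and are not the vocabularies, tools or ChEMBL data used in the paper.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import XSD

EX = Namespace("http://example.org/assay/")   # hypothetical namespace

g = Graph()
# Hypothetical IC50 measurements (nM) for three compounds against one target
for compound, ic50 in [("cmpd1", 12.5), ("cmpd2", 340.0), ("cmpd3", 57.0)]:
    subject = EX[compound]
    g.add((subject, EX.target, Literal("targetA")))
    g.add((subject, EX.ic50_nM, Literal(ic50, datatype=XSD.double)))

# Aggregate the stored values with SPARQL, then summarize them numerically
query = """
    SELECT ?ic50 WHERE {
        ?compound <http://example.org/assay/target> "targetA" ;
                  <http://example.org/assay/ic50_nM> ?ic50 .
    }
"""
values = [float(row.ic50) for row in g.query(query)]
print(f"n = {len(values)}, mean IC50 = {sum(values) / len(values):.1f} nM")
```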

  7. From statistical mechanics out of equilibrium to transport equations

    International Nuclear Information System (INIS)

    Balian, R.

    1995-01-01

    These lecture notes give a synthetic view of the foundations of non-equilibrium statistical mechanics. The purpose is to establish the transport equations satisfied by the relevant variables, starting from the microscopic dynamics. The Liouville representation is introduced, and a projection associates with any density operator, for a given choice of relevant observables, a reduced density operator. An exact integro-differential equation for the relevant variables is thereby derived. A short-memory approximation then yields the transport equations. A relevant entropy which characterizes the coarseness of the description is associated with each level of description. As an illustration, the classical gas, with its three levels of description and with the Chapman-Enskog method, is discussed. (author). 3 figs., 5 refs

  8. Mathematics of statistical mechanics and the chaos theory

    International Nuclear Information System (INIS)

    Llave, R. de la; Haro, A.

    2000-01-01

    Statistical mechanics requires a language that unifies probabilistic and deterministic description of physical systems. We describe briefly some of the mathematical ideas needed for this unification. These ideas have also proved important in the study of chaotic systems. (Author) 17 refs

  9. Performing Inferential Statistics Prior to Data Collection

    Science.gov (United States)

    Trafimow, David; MacDonald, Justin A.

    2017-01-01

    Typically, in education and psychology research, the investigator collects data and subsequently performs descriptive and inferential statistics. For example, a researcher might compute group means and use the null hypothesis significance testing procedure to draw conclusions about the populations from which the groups were drawn. We propose an…

  10. Crop identification technology assessment for remote sensing. (CITARS) Volume 9: Statistical analysis of results

    Science.gov (United States)

    Davis, B. J.; Feiveson, A. H.

    1975-01-01

    Results are presented of CITARS data processing in raw form. Tables of descriptive statistics are given along with descriptions and results of inferential analyses. The inferential results are organized by questions which CITARS was designed to answer.

  11. Parametric description of the quantum measurement process

    Science.gov (United States)

    Liuzzo-Scorpo, P.; Cuccoli, A.; Verrucchi, P.

    2015-08-01

    We present a description of the measurement process based on the parametric representation with environmental coherent states. This representation is specifically tailored for studying quantum systems whose environment needs being considered through the quantum-to-classical crossover. Focusing upon projective measures, and exploiting the connection between large-N quantum theories and the classical limit of related ones, we manage to push our description beyond the pre-measurement step. This allows us to show that the outcome production follows from a global-symmetry breaking, entailing the observed system's state reduction, and that the statistical nature of the process is brought about, together with the Born's rule, by the macroscopic character of the measuring apparatus.

  12. CAN'T MISS--conquer any number task by making important statistics simple. Part 2. Probability, populations, samples, and normal distributions.

    Science.gov (United States)

    Hansen, John P

    2003-01-01

    Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
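
    As a quick companion to the concepts covered in this part of the series, the snippet below uses SciPy to evaluate a standard normal probability and to build a 95% confidence interval for a sample mean; the sample values are hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical sample, e.g. lengths of stay in days
sample = np.array([3.1, 4.0, 2.8, 5.2, 3.6, 4.4, 3.9, 4.8, 3.3, 4.1])
mean = sample.mean()
sem = stats.sem(sample)   # standard error of the mean

# Probability that a standard normal value lies below z = 1.96 (about 0.975)
print(f"P(Z < 1.96) = {stats.norm.cdf(1.96):.3f}")

# 95% confidence interval for the population mean (t distribution, small sample)
low, high = stats.t.interval(0.95, len(sample) - 1, loc=mean, scale=sem)
print(f"mean = {mean:.2f}, 95% CI = ({low:.2f}, {high:.2f})")
```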

  13. A Case Study on Teaching the Topic "Experimental Unit" and How It Is Presented in Advanced Placement Statistics Textbooks

    Science.gov (United States)

    Perrett, Jamis J.

    2012-01-01

    This article demonstrates how textbooks differ in their description of the term "experimental unit". Advanced Placement Statistics teachers and students are often limited in their statistical knowledge by the information presented in their classroom textbook. Definitions and descriptions differ among textbooks as well as among different…

  14. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  15. The history and library statistics of JAEA library activities

    Energy Technology Data Exchange (ETDEWEB)

    Itabashi, Keizo [Japan Atomic Energy Agency, Intellectual Resources Dept., Tokai, Ibaraki (Japan)]

    2012-03-15

    The history and library statistics of the Japan Atomic Energy Agency library activities are summarized. The former Japan Atomic Energy Research Institute and the former Japan Nuclear Cycle Development Institute merged in October 2005 to establish the Japan Atomic Energy Agency. Ideally, the library statistics of both former organizations would be summarized, but the statistics of the Japan Nuclear Cycle Development Institute have not yet been obtained. Therefore, although the report refers to the Japan Atomic Energy Agency library, the description is limited to the former Japan Atomic Energy Research Institute library before 2004. (author)

  16. The history and library statistics of JAEA library activities

    International Nuclear Information System (INIS)

    Itabashi, Keizo

    2012-03-01

    The history and library statistics of the Japan Atomic Energy Agency library activities are summarized. The former Japan Atomic Energy Research Institute and the former Japan Nuclear Cycle Development Institute merged in October 2005 to establish the Japan Atomic Energy Agency. Ideally, the library statistics of both former organizations would be summarized, but the statistics of the Japan Nuclear Cycle Development Institute have not yet been obtained. Therefore, although the report refers to the Japan Atomic Energy Agency library, the description is limited to the former Japan Atomic Energy Research Institute library before 2004. (author)

  17. Biometry: the principles and practice of statistics in biological research

    National Research Council Canada - National Science Library

    Sokal, R.R; Rohlf, F.J

    1969-01-01

    In this introductory textbook, with its companion volume of tables, the authors provide a balanced presentation of statistical methodology for the descriptive, experimental, and analytical study of biological phenomena...

  18. Teaching Statistics Using Classic Psychology Research: An Activities-Based Approach

    Science.gov (United States)

    Holmes, Karen Y.; Dodd, Brett A.

    2012-01-01

    In this article, we discuss a collection of active learning activities derived from classic psychology studies that illustrate the appropriate use of descriptive and inferential statistics. (Contains 2 tables.)

  19. Quantum versus classical statistical dynamics of an ultracold Bose gas

    International Nuclear Information System (INIS)

    Berges, Juergen; Gasenzer, Thomas

    2007-01-01

    We investigate the conditions under which quantum fluctuations are relevant for the quantitative interpretation of experiments with ultracold Bose gases. This requires going beyond the description in terms of the Gross-Pitaevskii and Hartree-Fock-Bogoliubov mean-field theories, which can be obtained as classical (statistical) field-theory approximations of the quantum many-body problem. We employ functional-integral techniques based on the two-particle irreducible (2PI) effective action. The role of quantum fluctuations is studied within the nonperturbative 2PI 1/N expansion to next-to-leading order. At this accuracy level, memory integrals enter the dynamic equations, which differ for quantum and classical statistical descriptions. This can be used to obtain a classicality condition for the many-body dynamics. We exemplify this condition by studying the nonequilibrium evolution of a one-dimensional Bose gas of sodium atoms, and discuss some distinctive properties of quantum versus classical statistical dynamics.

  20. A Stochastic Fractional Dynamics Model of Rainfall Statistics

    Science.gov (United States)

    Kundu, Prasun; Travis, James

    2013-04-01

    Rainfall varies in space and time in a highly irregular manner and is described naturally in terms of a stochastic process. A characteristic feature of rainfall statistics is that they depend strongly on the space-time scales over which rain data are averaged. A spectral model of precipitation has been developed based on a stochastic differential equation of fractional order for the point rain rate, that allows a concise description of the second moment statistics of rain at any prescribed space-time averaging scale. The model is designed to faithfully reflect the scale dependence and is thus capable of providing a unified description of the statistics of both radar and rain gauge data. The underlying dynamical equation can be expressed in terms of space-time derivatives of fractional orders that are adjusted together with other model parameters to fit the data. The form of the resulting spectrum gives the model adequate flexibility to capture the subtle interplay between the spatial and temporal scales of variability of rain but strongly constrains the predicted statistical behavior as a function of the averaging length and times scales. The main restriction is the assumption that the statistics of the precipitation field is spatially homogeneous and isotropic and stationary in time. We test the model with radar and gauge data collected contemporaneously at the NASA TRMM ground validation sites located near Melbourne, Florida and in Kwajalein Atoll, Marshall Islands in the tropical Pacific. We estimate the parameters by tuning them to the second moment statistics of the radar data. The model predictions are then found to fit the second moment statistics of the gauge data reasonably well without any further adjustment. Some data sets containing periods of non-stationary behavior that involves occasional anomalously correlated rain events, present a challenge for the model.

  1. A statistical background noise correction sensitive to the steadiness of background noise.

    Science.gov (United States)

    Oppenheimer, Charles H

    2016-10-01

    A statistical background noise correction is developed for removing background noise contributions from measured source levels, producing a background noise-corrected source level. Like the standard background noise corrections of ISO 3741, ISO 3744, ISO 3745, and ISO 11201, the statistical background correction increases as the background level approaches the measured source level, decreasing the background noise-corrected source level. Unlike the standard corrections, the statistical background correction increases with steadiness of the background and is excluded from use when background fluctuation could be responsible for measured differences between the source and background noise levels. The statistical background noise correction has several advantages over the standard correction: (1) enveloping the true source with known confidence, (2) assuring physical source descriptions when measuring sources in fluctuating backgrounds, (3) reducing background corrected source descriptions by 1 to 8 dB for sources in steady backgrounds, and (4) providing a means to replace standardized background correction caps that incentivize against high precision grade methods.
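
    For context, the standard corrections referred to above amount to an energy subtraction driven by the difference between the level measured with the source operating and the background level alone; a minimal sketch of that classical relation is given below. The capping and exclusion rules of the individual ISO standards, and the steadiness-dependent statistical correction proposed in the paper, are not reproduced here.

```python
import math

def corrected_source_level(measured_level_db, background_level_db):
    """Classical energy-subtraction background correction (sketch of the standard approach)."""
    delta = measured_level_db - background_level_db
    if delta <= 0:
        raise ValueError("Background level must be below the measured level.")
    correction = -10.0 * math.log10(1.0 - 10.0 ** (-0.1 * delta))   # dB to subtract
    return measured_level_db - correction

# Example: source plus background measured at 62.0 dB, background alone at 55.0 dB
print(f"{corrected_source_level(62.0, 55.0):.1f} dB")
```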

  2. Matrix algebra theory, computations and applications in statistics

    CERN Document Server

    Gentle, James E

    2017-01-01

    This textbook for graduate and advanced undergraduate students presents the theory of matrix algebra for statistical applications, explores various types of matrices encountered in statistics, and covers numerical linear algebra. Matrix algebra is one of the most important areas of mathematics in data science and in statistical theory, and the second edition of this very popular textbook provides essential updates and comprehensive coverage on critical topics in mathematics in data science and in statistical theory. Part I offers a self-contained description of relevant aspects of the theory of matrix algebra for applications in statistics. It begins with fundamental concepts of vectors and vector spaces; covers basic algebraic properties of matrices and analytic properties of vectors and matrices in multivariate calculus; and concludes with a discussion on operations on matrices in solutions of linear systems and in eigenanalysis. Part II considers various types of matrices encountered in statistics, such as...

  3. Statistical fluid mechanics

    CERN Document Server

    Monin, A S

    2007-01-01

    ""If ever a field needed a definitive book, it is the study of turbulence; if ever a book on turbulence could be called definitive, it is this book."" - ScienceWritten by two of Russia's most eminent and productive scientists in turbulence, oceanography, and atmospheric physics, this two-volume survey is renowned for its clarity as well as its comprehensive treatment. The first volume begins with an outline of laminar and turbulent flow. The remainder of the book treats a variety of aspects of turbulence: its statistical and Lagrangian descriptions, shear flows near surfaces and free turbulenc

  4. Statistical analysis of complex systems with nonclassical invariant measures

    KAUST Repository

    Fratalocchi, Andrea

    2011-01-01

    I investigate the problem of finding a statistical description of a complex many-body system whose invariant measure cannot be constructed stemming from classical thermodynamics ensembles. By taking solitons as a reference system and by employing a

  5. Quest for consistent modelling of statistical decay of the compound nucleus

    Science.gov (United States)

    Banerjee, Tathagata; Nath, S.; Pal, Santanu

    2018-01-01

    A statistical model description of heavy ion induced fusion-fission reactions is presented where shell effects, collective enhancement of level density, tilting away effect of compound nuclear spin and dissipation are included. It is shown that the inclusion of all these effects provides a consistent picture of fission where fission hindrance is required to explain the experimental values of both pre-scission neutron multiplicities and evaporation residue cross-sections in contrast to some of the earlier works where a fission hindrance is required for pre-scission neutrons but a fission enhancement for evaporation residue cross-sections.

  6. Textual information access statistical models

    CERN Document Server

    Gaussier, Eric

    2013-01-01

    This book presents statistical models that have recently been developed within several research communities to access information contained in text collections. The problems considered are linked to applications aiming at facilitating information access:- information extraction and retrieval;- text classification and clustering;- opinion mining;- comprehension aids (automatic summarization, machine translation, visualization).In order to give the reader as complete a description as possible, the focus is placed on the probability models used in the applications

  7. Fokker-Planck description for the queue dynamics of large tick stocks

    Science.gov (United States)

    Garèche, A.; Disdier, G.; Kockelkoren, J.; Bouchaud, J.-P.

    2013-09-01

    Motivated by empirical data, we develop a statistical description of the queue dynamics for large tick assets based on a two-dimensional Fokker-Planck (diffusion) equation. Our description explicitly includes state dependence, i.e., the fact that the drift and diffusion depend on the volume present on both sides of the spread. “Jump” events, corresponding to sudden changes of the best limit price, must also be included as birth-death terms in the Fokker-Planck equation. All quantities involved in the equation can be calibrated using high-frequency data on the best quotes. One of our central findings is that the dynamical process is approximately scale invariant, i.e., the only relevant variable is the ratio of the current volume in the queue to its average value. While the latter shows intraday seasonalities and strong variability across stocks and time periods, the dynamics of the rescaled volumes is universal. In terms of rescaled volumes, we found that the drift has a complex two-dimensional structure, which is a sum of a gradient contribution and a rotational contribution, both stable across stocks and time. This drift term is entirely responsible for the dynamical correlations between the ask queue and the bid queue.
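
    For orientation, the kind of two-dimensional Fokker-Planck equation with birth-death ("jump") terms described above can be written schematically as follows; the notation is generic and not copied from the paper.

```latex
\frac{\partial P(V_a, V_b, t)}{\partial t} =
  -\frac{\partial}{\partial V_a}\bigl[F_a(V_a, V_b)\,P\bigr]
  -\frac{\partial}{\partial V_b}\bigl[F_b(V_a, V_b)\,P\bigr]
  +\frac{\partial^2}{\partial V_a^2}\bigl[D_a(V_a, V_b)\,P\bigr]
  +\frac{\partial^2}{\partial V_b^2}\bigl[D_b(V_a, V_b)\,P\bigr]
  + J[P]
```

    Here V_a and V_b are the (rescaled) volumes at the best ask and bid, F and D are the state-dependent drift and diffusion coefficients calibrated from the quote data, and J[P] collects the birth-death terms associated with jumps of the best limit price.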

  8. Statistics in Matlab a primer

    CERN Document Server

    Cho, MoonJung

    2014-01-01

    List of Tables. Preface. MATLAB Basics: Desktop Environment; Getting Help and Other Documentation; Data Import and Export; Data I/O via the Command Line; The Import Wizard; Examples of Data I/O in MATLAB; Data I/O with the Statistics Toolbox; More Functions for Data I/O; Data in MATLAB; Data Objects in Base MATLAB; Accessing Data Elements; Examples of Joining Data Sets; Data Types in the Statistics Toolbox; Object-Oriented Programming; Miscellaneous Topics; File and Workspace Management; Punctuation in MATLAB; Arithmetic Operators; Functions in MATLAB; Summary and Further Reading. Visualizing Data: Basic Plot Functions; Plotting 2-D Data; Plotting 3-D Data; Examples; Scatter Plots; Basic 2-D and 3-D Scatter Plots; Scatter Plot Matrix; Examples; GUIs for Graphics; Simple Plot Editing; Plotting Tools Interface; PLOTS Tab; Summary and Further Reading. Descriptive Statistics: Measures of Location; Means, Medians, and Modes; Examples; Measures of Dispersion; Range; Variance and Standard Deviation; Covariance and Correlation; Examples; Describing the Distribution...

  9. The use of statistics in real and simulated investigations performed by undergraduate health sciences' students

    OpenAIRE

    Pimenta, Rui; Nascimento, Ana; Vieira, Margarida; Costa, Elísio

    2010-01-01

    In previous works, we evaluated the statistical reasoning ability acquired by health sciences students carrying out their final undergraduate project. We found that these students achieved a good level of statistical literacy and reasoning in descriptive statistics. However, concerning inferential statistics, the students did not reach a similar level. Statistics educators therefore call for more effective ways to learn statistics, such as project-based investigations. These can be simulat...

  10. Statistical Theory of the Vector Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune; Ibrahim, S. R.

    1999-01-01

    decays. Due to the speed and/or accuracy of the Vector Random Decrement technique, it was introduced as an attractive alternative to the Random Decrement technique. In this paper, the theory of the Vector Random Decrement technique is extended by applying a statistical description of the stochastic...

  11. Descriptive study of perioperative analgesic medications associated with general anesthesia for dental rehabilitation of children.

    Science.gov (United States)

    Carter, Laura; Wilson, Stephen; Tumer, Erwin G

    2010-01-01

    The purpose of this retrospective chart review was to document sedation and analgesic medications administered preoperatively, intraoperatively, and during postanesthesia care for children undergoing dental rehabilitation using general anesthesia (GA). Patient gender, age, procedure type performed, and ASA status were recorded from the medical charts of children undergoing GA for dental rehabilitation. The sedative and analgesic drugs administered pre-, intra-, and postoperatively were recorded. Statistical analysis included descriptive statistics and cross-tabulation. A sample of 115 patients with a mean age of 64 (+/-30) months was studied; 47% were females, and 71% were healthy. Over 80% of the patients were administered medications primarily during pre- and intraoperative phases, with fewer than 25% receiving medications postoperatively. Morphine and fentanyl were the most frequently administered agents intraoperatively. The procedure type, gender, and health status were not statistically associated with the number of agents administered. Younger patients, however, were statistically more likely to receive additional analgesic medications. Our study suggests that a minority of patients have postoperative discomfort in the postanesthesia care unit; mild to moderate analgesics were administered during intraoperative phases of dental rehabilitation.

  12. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown on the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  13. Statistical optics

    CERN Document Server

    Goodman, Joseph W

    2015-01-01

    This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems. This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i...

  14. Statistical physics

    CERN Document Server

    Sadovskii, Michael V

    2012-01-01

    This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

  15. Functional summary statistics for the Johnson-Mehl model

    DEFF Research Database (Denmark)

    Møller, Jesper; Ghorbani, Mohammad

    The Johnson-Mehl germination-growth model is a spatio-temporal point process model which, among other things, has been used for the description of neurotransmitter datasets. However, for such datasets parametric Johnson-Mehl models fitted by maximum likelihood have not yet been evaluated by means of functional summary statistics. This paper therefore invents four functional summary statistics adapted to the Johnson-Mehl model, with two of them based on the second-order properties and the other two on the nuclei-boundary distances for the associated Johnson-Mehl tessellation. The theoretical properties of the functional summary statistics are investigated, non-parametric estimators are suggested, and their usefulness for model checking is examined in a simulation study. The functional summary statistics are also used for checking fitted parametric Johnson-Mehl models for a neurotransmitter dataset.

  16. Statistical mechanics of complex networks

    CERN Document Server

    Rubi, Miguel; Diaz-Guilera, Albert

    2003-01-01

    Networks can provide a useful model and graphic image useful for the description of a wide variety of web-like structures in the physical and man-made realms, e.g. protein networks, food webs and the Internet. The contributions gathered in the present volume provide both an introduction to, and an overview of, the multifaceted phenomenology of complex networks. Statistical Mechanics of Complex Networks also provides a state-of-the-art picture of current theoretical methods and approaches.

  17. Direct vs statistical decay of nuclear giant multipole resonances

    International Nuclear Information System (INIS)

    Hussein, M.S.

    1986-07-01

    A theoretical framework for the description of the decay of giant multipole resonances is developed. Besides the direct decay, both the pre-equilibrium and statistical (compound) decays are taken into account in a consistent way. It is shown that the statistical decay of the GR is not necessarily correctly described by the Hauser-Feshbach theory owing to the presence of a mixing parameter, which measures the degree of fragmentation. Applications are made to several cases. (Author) [pt

  18. Direct vs statistical decay of nuclear giant multipole resonances

    International Nuclear Information System (INIS)

    Dias, H.; Hussein, M.S.; Carlson, B.V.; Merchant, A.C.; Adhikari, S.K.

    1986-01-01

    A theoretical framework for the description of the decay of giant multipole resonances is developed. Besides the direct decay, both the pre-equilibrium and statistical (compound) decays are taken into account in a consistent way. It is shown that the statistical decay of the giant resonance is not necessarily described by the Hauser-Feshbach theory owing to the presence of a mixing parameter, which measures the degree of fragmentation. Applications are made to several cases. (Author) [pt

  19. Statistical theory of field fluctuations in a reversed-field pinch

    International Nuclear Information System (INIS)

    Turner, L.

    1982-01-01

    A statistical description of three-dimensional, incompressible turbulence in an ideal, current-bearing, bounded magnetofluid is given both analytically and numerically. Our results are then compared with existing data taken from reversed-field pinch experiments

  20. Breakthroughs in statistics

    CERN Document Server

    Johnson, Norman

    This is the third volume of a collection of seminal papers in the statistical sciences written during the past 110 years. These papers have each had an outstanding influence on the development of statistical theory and practice over the last century. Each paper is preceded by an introduction written by an authority in the field providing background information and assessing its influence. Volume III concentrates on articles from the 1980s while including some earlier articles not included in Volumes I and II. Samuel Kotz is Professor of Statistics in the College of Business and Management at the University of Maryland. Norman L. Johnson is Professor Emeritus of Statistics at the University of North Carolina. Also available: Breakthroughs in Statistics Volume I: Foundations and Basic Theory Samuel Kotz and Norman L. Johnson, Editors 1993. 631 pp. Softcover. ISBN 0-387-94037-5 Breakthroughs in Statistics Volume II: Methodology and Distribution Samuel Kotz and Norman L. Johnson, Edi...

  1. 40 CFR 233.11 - Program description.

    Science.gov (United States)

    2010-07-01

    ... organization and structure of the State agency (agencies) which will have responsibility for administering the... under § 233.10 shall include: (a) A description of the scope and structure of the State's program. The... will coordinate its enforcement strategy with that of the Corps and EPA; (h) A description of the...

  2. Token Economy: A Systematic Review of Procedural Descriptions.

    Science.gov (United States)

    Ivy, Jonathan W; Meindl, James N; Overley, Eric; Robson, Kristen M

    2017-09-01

    The token economy is a well-established and widely used behavioral intervention. A token economy is comprised of six procedural components: the target response(s), a token that functions as a conditioned reinforcer, backup reinforcers, and three interconnected schedules of reinforcement. Despite decades of applied research, the extent to which the procedures of a token economy are described in complete and replicable detail has not been evaluated. Given the inherent complexity of a token economy, an analysis of the procedural descriptions may benefit future token economy research and practice. Articles published between 2000 and 2015 that included implementation of a token economy within an applied setting were identified and reviewed with a focus on evaluating the thoroughness of procedural descriptions. The results show that token economy components are regularly omitted or described in vague terms. Of the articles included in this analysis, only 19% (18 of 96 articles reviewed) included replicable and complete descriptions of all primary components. Missing or vague component descriptions could negatively affect future research or applied practice. Recommendations are provided to improve component descriptions.

  3. RESEARCH OF THE DATA BANK OF STATISTICAL ANALYSIS OF THE ADVERTISING MARKET

    Directory of Open Access Journals (Sweden)

    Ekaterina F. Devochkina

    2014-01-01

    Full Text Available The article describes the process of compiling statistical accounts of the Russian advertising market. The author reviews the forms of state statistical accounting used in different years, noting their distinctive features and shortcomings. The article also analyses alternative sources of numerical information on the Russian advertising market.

  4. Statistical distribution for generalized ideal gas of fractional-statistics particles

    International Nuclear Information System (INIS)

    Wu, Y.

    1994-01-01

    We derive the occupation-number distribution in a generalized ideal gas of particles obeying fractional statistics, including mutual statistics, by adopting a state-counting definition. When there is no mutual statistics, the statistical distribution interpolates between bosons and fermions, and respects a fractional exclusion principle (except for bosons). Anyons in a strong magnetic field at low temperatures constitute such a physical system. Applications to the thermodynamic properties of quasiparticle excitations in the Laughlin quantum Hall fluid are discussed
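
    The interpolating distribution referred to in this record is commonly quoted in the following form (a sketch of the standard Haldane-Wu exclusion-statistics result for a statistics parameter g, with ε, μ and T the single-particle energy, chemical potential and temperature; the notation here is illustrative, not taken verbatim from the paper):

        \[
          \bar{n}(\varepsilon) = \frac{1}{w(\varepsilon) + g}, \qquad
          w(\varepsilon)^{g}\,[1 + w(\varepsilon)]^{1-g} = e^{(\varepsilon - \mu)/k_B T},
        \]

    so that g = 0 recovers the Bose-Einstein and g = 1 the Fermi-Dirac occupation numbers, with intermediate g interpolating between the two.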

  5. Statistics concerning the Apollo command module water landing, including the probability of occurrence of various impact conditions, sucessful impact, and body X-axis loads

    Science.gov (United States)

    Whitnah, A. M.; Howes, D. B.

    1971-01-01

    Statistical information for the Apollo command module water landings is presented. This information includes the probability of occurrence of various impact conditions, a successful impact, and body X-axis loads of various magnitudes.

  6. Layer Construction of 3D Topological States and String Braiding Statistics

    Directory of Open Access Journals (Sweden)

    Chao-Ming Jian

    2014-12-01

    Full Text Available While the topological order in two dimensions has been studied extensively since the discovery of the integer and fractional quantum Hall systems, topological states in three spatial dimensions are much less understood. In this paper, we propose a general formalism for constructing a large class of three-dimensional topological states by stacking layers of 2D topological states and introducing coupling between them. Using this construction, different types of topological states can be obtained, including those with only surface topological order and no bulk topological quasiparticles, and those with topological order both in the bulk and at the surface. For both classes of states, we study its generic properties and present several explicit examples. As an interesting consequence of this construction, we obtain example systems with nontrivial braiding statistics between string excitations. In addition to studying the string-string braiding in the example system, we propose a topological field-theory description for the layer-constructed systems, which captures not only the string-particle braiding statistics but also the string-string braiding statistics when the coupling is twisted. Last, we provide a proof of a general identity for Abelian string statistics and discuss an example system with non-Abelian strings.

  7. Application of image recognition algorithms for statistical description of nano- and microstructured surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Mărăscu, V.; Dinescu, G. [National Institute for Lasers, Plasma and Radiation Physics, 409 Atomistilor Street, Bucharest– Magurele (Romania); Faculty of Physics, University of Bucharest, 405 Atomistilor Street, Bucharest-Magurele (Romania); Chiţescu, I. [Faculty of Mathematics and Computer Science, University of Bucharest, 14 Academiei Street, Bucharest (Romania); Barna, V. [Faculty of Physics, University of Bucharest, 405 Atomistilor Street, Bucharest-Magurele (Romania); Ioniţă, M. D.; Lazea-Stoyanova, A.; Mitu, B., E-mail: mitub@infim.ro [National Institute for Lasers, Plasma and Radiation Physics, 409 Atomistilor Street, Bucharest– Magurele (Romania)

    2016-03-25

    In this paper we propose a statistical approach for describing the self-assembling of sub-micronic polystyrene beads on silicon surfaces, as well as the evolution of surface topography due to plasma treatments. Algorithms for image recognition are used in conjunction with Scanning Electron Microscopy (SEM) imaging of surfaces. In a first step, greyscale images of the surface covered by the polystyrene beads are obtained. Further, an adaptive thresholding method was applied to obtain binary images. The next step consisted in the automatic identification of the polystyrene bead dimensions, using the Hough transform algorithm, according to bead radius. In order to analyze the uniformity of the self-assembled polystyrene beads, the squared modulus of the 2-dimensional Fast Fourier Transform (2-D FFT) was applied. By combining these algorithms we obtain a powerful and fast statistical tool for the analysis of micro- and nanomaterials with features regularly distributed on the surface, as revealed by SEM examination.
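
    A minimal sketch of such a pipeline is given below, assuming OpenCV and NumPy are available; the file name, threshold block size and radius range are illustrative placeholders rather than values from the paper.

        import cv2
        import numpy as np

        # Load a greyscale SEM image (file name is a placeholder).
        img = cv2.imread("sem_surface.png", cv2.IMREAD_GRAYSCALE)

        # Step 1: adaptive thresholding to obtain a binary image.
        binary = cv2.adaptiveThreshold(img, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                       cv2.THRESH_BINARY, 51, 0)

        # Step 2: Hough transform to identify bead positions and radii.
        circles = cv2.HoughCircles(binary, cv2.HOUGH_GRADIENT, dp=1, minDist=10,
                                   param1=100, param2=15, minRadius=5, maxRadius=40)

        # Step 3: squared modulus of the 2-D FFT to assess arrangement uniformity.
        power_spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img.astype(float)))) ** 2

        if circles is not None:
            radii = circles[0, :, 2]
            print("beads found:", len(radii), "mean radius (px):", radii.mean())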

  8. Application of image recognition algorithms for statistical description of nano- and microstructured surfaces

    International Nuclear Information System (INIS)

    Mărăscu, V.; Dinescu, G.; Chiţescu, I.; Barna, V.; Ioniţă, M. D.; Lazea-Stoyanova, A.; Mitu, B.

    2016-01-01

    In this paper we propose a statistical approach for describing the self-assembling of sub-micronic polystyrene beads on silicon surfaces, as well as the evolution of surface topography due to plasma treatments. Algorithms for image recognition are used in conjunction with Scanning Electron Microscopy (SEM) imaging of surfaces. In a first step, greyscale images of the surface covered by the polystyrene beads are obtained. Further, an adaptive thresholding method was applied to obtain binary images. The next step consisted in the automatic identification of the polystyrene bead dimensions, using the Hough transform algorithm, according to bead radius. In order to analyze the uniformity of the self-assembled polystyrene beads, the squared modulus of the 2-dimensional Fast Fourier Transform (2-D FFT) was applied. By combining these algorithms we obtain a powerful and fast statistical tool for the analysis of micro- and nanomaterials with features regularly distributed on the surface, as revealed by SEM examination.

  9. Statistical distributions applications and parameter estimates

    CERN Document Server

    Thomopoulos, Nick T

    2017-01-01

    This book gives a description of the group of statistical distributions that have ample application to studies in statistics and probability.  Understanding statistical distributions is fundamental for researchers in almost all disciplines.  The informed researcher will select the statistical distribution that best fits the data in the study at hand.  Some of the distributions are well known to the general researcher and are in use in a wide variety of ways.  Other useful distributions are less understood and are not in common use.  The book describes when and how to apply each of the distributions in research studies, with a goal to identify the distribution that best applies to the study.  The distributions are for continuous, discrete, and bivariate random variables.  In most studies, the parameter values are not known a priori, and sample data is needed to estimate parameter values.  In other scenarios, no sample data is available, and the researcher seeks some insight that allows the estimate of ...

  10. Analysis of photon statistics with Silicon Photomultiplier

    International Nuclear Information System (INIS)

    D'Ascenzo, N.; Saveliev, V.; Wang, L.; Xie, Q.

    2015-01-01

    The Silicon Photomultiplier (SiPM) is a novel silicon-based photodetector, which represents the modern perspective of low photon flux detection. The aim of this paper is to provide an introduction to the statistical analysis methods needed to understand and estimate, in a quantitative way, the correct features and description of the response of the SiPM to a coherent source of light

  11. CRAC2 model description

    International Nuclear Information System (INIS)

    Ritchie, L.T.; Alpert, D.J.; Burke, R.P.; Johnson, J.D.; Ostmeyer, R.M.; Aldrich, D.C.; Blond, R.M.

    1984-03-01

    The CRAC2 computer code is a revised version of CRAC (Calculation of Reactor Accident Consequences) which was developed for the Reactor Safety Study. This document provides an overview of the CRAC2 code and a description of each of the models used. Significant improvements incorporated into CRAC2 include an improved weather sequence sampling technique, a new evacuation model, and new output capabilities. In addition, refinements have been made to the atmospheric transport and deposition model. Details of the modeling differences between CRAC2 and CRAC are emphasized in the model descriptions

  12. Generalizing: The descriptive struggle

    Directory of Open Access Journals (Sweden)

    Barney G. Glaser, Ph.D.; Hon Ph.D.

    2006-11-01

    Full Text Available The literature is not kind to the use of descriptive generalizations. Authors struggle and struggle to find and rationalize a way to use them and then fail in spite of trying a myriad of work-arounds. And then we have Lincoln and Guba’s famous statement: “The only generalization is: there is no generalization” in referring to qualitative research. (op cit, p. 110 They are referring to routine QDA yielding extensive descriptions, but which tacitly include conceptual generalizations without any real thought of knowledge about them. In this chapter I wish to explore this struggle for the purpose of explaining that the various contra arguments to using descriptive generalizations DO NOT apply to the ease of using conceptual generalizations yielded in SGT and especially FGT. I will not argue for the use of descriptive generalization. I agree with Lincoln and Guba with respect to QDA, “the only generalization is: there is no generalization.” It is up to the QDA methodologists, of whom there are many; to continue the struggle and I wish them well.

  13. Long-term strategy for the statistical design of a forest health monitoring system

    Science.gov (United States)

    Hans T. Schreuder; Raymond L. Czaplewski

    1993-01-01

    A conceptual framework is given for a broad-scale survey of forest health that accomplishes three objectives: generate descriptive statistics; detect changes in such statistics; and simplify analytical inferences that identify, and possibly establish cause-effect relationships. Our paper discusses the development of sampling schemes to satisfy these three objectives,...

  14. CMS Program Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...

  15. Plasma Soliton Turbulence and Statistical Mechanics

    International Nuclear Information System (INIS)

    Treumann, R.A.; Pottelette, R.

    1999-01-01

    Collisionless kinetic plasma turbulence is described approximately in terms of a superposition of non-interacting solitary waves. We discuss the relevance of such a description under astrophysical conditions. Several types of solitary waves may be of interest in this relation as generators of turbulence and turbulent transport. A consistent theory of turbulence can be given only in a few particular cases when the description can be reduced to the Korteweg-de Vries equation or some other simple equation like the Kadomtsev-Petviashvili equation. It turns out that the soliton turbulence is usually energetically harder than the ordinary weakly turbulent plasma description. This implies that interaction of particles with such kinds of turbulence can lead to stronger acceleration than in ordinary turbulence. However, the description in our model is only classical and non-relativistic. Transport in solitary turbulence is most important for drift wave turbulence. Such waves form solitary drift wave vortices which may provide cross-field transport. A more general discussion is given on transport. In a model of Levy flight trapping of particles in solitons (or solitary turbulence) one finds that the residence time of particles in the region of turbulence may be described by a generalized Lorentzian probability distribution. It is shown that under collisionless equilibrium conditions far away from thermal equilibrium such distributions are natural equilibrium distributions. A consistent thermodynamic description of such media can be given in terms of a generalized Lorentzian statistical mechanics and thermodynamics. (author)

  16. Emergence of quantum mechanics from classical statistics

    International Nuclear Information System (INIS)

    Wetterich, C

    2009-01-01

    The conceptual setting of quantum mechanics is subject to an ongoing debate from its beginnings until now. The consequences of the apparent differences between quantum statistics and classical statistics range from philosophical interpretations to practical issues such as quantum computing. In this note we demonstrate how quantum mechanics can emerge from classical statistical systems. We discuss conditions and circumstances for this to happen. Quantum systems describe isolated subsystems of classical statistical systems with infinitely many states. While infinitely many classical observables 'measure' properties of the subsystem and its environment, the state of the subsystem can be characterized by the expectation values of only a few probabilistic observables. They define a density matrix, and all the usual laws of quantum mechanics follow. No concepts beyond classical statistics are needed for quantum physics - the differences are only apparent and result from the particularities of those classical statistical systems which admit a quantum mechanical description. In particular, we show how the non-commuting properties of quantum operators are associated to the use of conditional probabilities within the classical system, and how a unitary time evolution reflects the isolation of the subsystem.

  17. Features of statistical dynamics in a finite system

    International Nuclear Information System (INIS)

    Yan, Shiwei; Sakata, Fumihiko; Zhuo Yizhong

    2002-01-01

    We study features of statistical dynamics in a finite Hamiltonian system composed of a relevant one degree of freedom coupled to an irrelevant multidegree of freedom system through a weak interaction. Special attention is paid to how the statistical dynamics changes depending on the number of degrees of freedom in the irrelevant system. It is found that the macrolevel statistical aspects are strongly related to an appearance of the microlevel chaotic motion, and a dissipation of the relevant motion is realized passing through three distinct stages: dephasing, statistical relaxation, and equilibrium regimes. It is clarified that the dynamical description and the conventional transport approach provide us with almost the same macrolevel and microlevel mechanisms only for the system with a very large number of irrelevant degrees of freedom. It is also shown that the statistical relaxation in the finite system is an anomalous diffusion and the fluctuation effects have a finite correlation time

  18. Statistical calculation of complete events in medium-energy nuclear collisions

    International Nuclear Information System (INIS)

    Randrup, J.

    1984-01-01

    Several heavy-ion accelerators throughout the world are presently able to deliver beams of heavy nuclei with kinetic energies in the range from tens to hundreds of MeV per nucleon, the so-called medium or intermediate energy range. At such energies a large number of final channels are open, each consisting of many nuclear fragments. The disassembly of the collision system is expected to be a very complicated process and a detailed dynamical description is beyond present capability. However, by virtue of the complexity of the process, statistical considerations may be useful. A statistical description of the disassembly yields the least biased expectations about the outcome of a collision process and provides a meaningful reference against which more specific dynamical models, as well as the data, can be discussed. This lecture presents the essential tools for formulating a statistical model for the nuclear disassembly process. The authors consider the quick disassembly (explosion) of a hot nuclear system, a so-called source, into multifragment final states, which compete according to their statistical weight. First, some useful notation is introduced. Then the expressions for exclusive and inclusive distributions are given and the factorization of an exclusive distribution into inclusive ones is carried out. In turn, the grand canonical approximation for one-fragment inclusive distributions is introduced. Finally, it is outlined how to generate a statistical sample of complete final states. On this basis, a model for statistical simulation of complete events in medium-energy nuclear collisions has been developed

  19. Football fever: goal distributions and non-Gaussian statistics

    Science.gov (United States)

    Bittner, E.; Nußbaumer, A.; Janke, W.; Weigel, M.

    2009-02-01

    Analyzing football score data with statistical techniques, we investigate how the not purely random, but highly co-operative nature of the game is reflected in averaged properties such as the probability distributions of scored goals for the home and away teams. As it turns out, especially the tails of the distributions are not well described by the Poissonian or binomial model resulting from the assumption of uncorrelated random events. Instead, a good effective description of the data is provided by less basic distributions such as the negative binomial one or the probability densities of extreme value statistics. To understand this behavior from a microscopic point of view, however, no waiting time problem or extremal process need be invoked. Instead, modifying the Bernoulli random process underlying the Poissonian model to include a simple component of self-affirmation seems to describe the data surprisingly well and allows one to understand the observed deviation from Gaussian statistics. The phenomenological distributions used before can be understood as special cases within this framework. We analyzed historical football score data from many leagues in Europe as well as from international tournaments, including data from all past tournaments of the “FIFA World Cup” series, and found the proposed models to be applicable rather universally. In particular, here we analyze the results of the German women’s premier football league and consider the two separate German men’s premier leagues in the East and West during the cold war times as well as the unified league after 1990 to see how scoring in football and the component of self-affirmation depend on cultural and political circumstances.
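
    A toy version of that modified process can be simulated in a few lines (a sketch only, not the authors' fitting code; the per-minute scoring probability and the self-affirmation increment are invented values):

        import numpy as np

        rng = np.random.default_rng(0)

        def simulate_goals(n_matches=10000, minutes=90, p0=0.015, boost=0.005):
            """Per-minute Bernoulli scoring where each goal already scored
            slightly raises the probability of scoring again (self-affirmation)."""
            goals = np.zeros(n_matches, dtype=int)
            for m in range(n_matches):
                g = 0
                for _ in range(minutes):
                    if rng.random() < p0 + boost * g:
                        g += 1
                goals[m] = g
            return goals

        poisson_ref = rng.poisson(lam=0.015 * 90, size=10000)  # uncorrelated reference
        affirmed = simulate_goals()

        # Self-affirmation broadens the goal distribution: the variance exceeds the
        # mean, in the direction of the negative binomial fits mentioned above.
        print("Poisson reference mean/var:", poisson_ref.mean(), poisson_ref.var())
        print("Self-affirmation  mean/var:", affirmed.mean(), affirmed.var())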

  20. Microscopical descriptions of the fission fragmentation developed at CEA Bruyeres (France)

    International Nuclear Information System (INIS)

    Sida, J. L.

    2007-01-01

    The fission process has been studied since 1939, but there is no full theoretical description of the process. Two approaches have been developed at CEA Bruyeres le Chatel (France) on the basis of microscopic calculations with the Gogny force. The first one is based on mean field calculations of the fission parameters (potential energy landscape, inertial parameters). The evolution of the wave function of the system is followed from the saddle point to the scission line in an adiabatic dynamical approach in order to determine the fission fragment distributions [GOU04]. The second one uses the theoretical nuclear database AMEDEE (http://www-phynu.cea.fr/science_en_ligne/carte_potentiels_microscopiques/carte_potentiel_nucleaire.htm) which includes the mean field potential of more than 7000 nuclei. A precise energy balance is done at the scission point in order to define the available energy for each possible fragmentation. A statistical model is then used to determine the fragment distributions [HEI06]. This work is an improvement of the statistical scission point model of Wilkins et al [WIL76]. The free parameters of the previous description have been reduced to the minimum, and there remains one parameter value defining the scission configuration which is not used as a free parameter but has been fixed for the systematics that will be presented. These two microscopic models will be presented and the results will be discussed and compared to experiments. We will also point out their possible use for data evaluation for the burn-up of minor actinides, the wastes of nuclear plants. (Author)

  1. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  2. Statistical limitations in functional neuroimaging. I. Non-inferential methods and statistical models.

    Science.gov (United States)

    Petersson, K M; Nichols, T E; Poline, J B; Holmes, A P

    1999-01-01

    Functional neuroimaging (FNI) provides experimental access to the intact living brain making it possible to study higher cognitive functions in humans. In this review and in a companion paper in this issue, we discuss some common methods used to analyse FNI data. The emphasis in both papers is on assumptions and limitations of the methods reviewed. There are several methods available to analyse FNI data indicating that none is optimal for all purposes. In order to make optimal use of the methods available it is important to know the limits of applicability. For the interpretation of FNI results it is also important to take into account the assumptions, approximations and inherent limitations of the methods used. This paper gives a brief overview over some non-inferential descriptive methods and common statistical models used in FNI. Issues relating to the complex problem of model selection are discussed. In general, proper model selection is a necessary prerequisite for the validity of the subsequent statistical inference. The non-inferential section describes methods that, combined with inspection of parameter estimates and other simple measures, can aid in the process of model selection and verification of assumptions. The section on statistical models covers approaches to global normalization and some aspects of univariate, multivariate, and Bayesian models. Finally, approaches to functional connectivity and effective connectivity are discussed. In the companion paper we review issues related to signal detection and statistical inference. PMID:10466149

  3. Difficulties in Learning and Teaching Statistics: Teacher Views

    Science.gov (United States)

    Koparan, Timur

    2015-01-01

    The purpose of this study is to define teacher views about the difficulties in learning and teaching middle school statistics subjects. To serve this aim, a number of interviews were conducted with 10 middle school maths teachers in 2011-2012 school year in the province of Trabzon. Of the qualitative descriptive research methods, the…

  4. Intuitive introductory statistics

    CERN Document Server

    Wolfe, Douglas A

    2017-01-01

    This textbook is designed to give an engaging introduction to statistics and the art of data analysis. The unique scope includes, but also goes beyond, classical methodology associated with the normal distribution. What if the normal model is not valid for a particular data set? This cutting-edge approach provides the alternatives. It is an introduction to the world and possibilities of statistics that uses exercises, computer analyses, and simulations throughout the core lessons. These elementary statistical methods are intuitive. Counting and ranking features prominently in the text. Nonparametric methods, for instance, are often based on counts and ranks and are very easy to integrate into an introductory course. The ease of computation with advanced calculators and statistical software, both of which factor into this text, allows important techniques to be introduced earlier in the study of statistics. This book's novel scope also includes measuring symmetry with Walsh averages, finding a nonp...

  5. A consistent description of kinetics and hydrodynamics of quantum Bose-systems

    Directory of Open Access Journals (Sweden)

    P.A.Hlushak

    2004-01-01

    Full Text Available A consistent approach to the description of kinetics and hydrodynamics of many-Boson systems is proposed. The generalized transport equations for strongly and weakly nonequilibrium Bose systems are obtained. Here we use the method of nonequilibrium statistical operator by D.N. Zubarev. New equations for the time distribution function of the quantum Bose system with a separate contribution from both the kinetic and potential energies of particle interactions are obtained. The generalized transport coefficients are determined accounting for the consistent description of kinetic and hydrodynamic processes.

  6. Statistical methods for forecasting

    CERN Document Server

    Abraham, Bovas

    2009-01-01

    The Wiley-Interscience Paperback Series consists of selected books that have been made more accessible to consumers in an effort to increase global appeal and general circulation. With these new unabridged softcover volumes, Wiley hopes to extend the lives of these works by making them available to future generations of statisticians, mathematicians, and scientists."This book, it must be said, lives up to the words on its advertising cover: ''Bridging the gap between introductory, descriptive approaches and highly advanced theoretical treatises, it provides a practical, intermediate level discussion of a variety of forecasting tools, and explains how they relate to one another, both in theory and practice.'' It does just that!"-Journal of the Royal Statistical Society"A well-written work that deals with statistical methods and models that can be used to produce short-term forecasts, this book has wide-ranging applications. It could be used in the context of a study of regression, forecasting, and time series ...

  7. Attitude towards statistics and performance among post-graduate students

    Science.gov (United States)

    Rosli, Mira Khalisa; Maat, Siti Mistima

    2017-05-01

    Mastering Statistics is a necessity for students, especially for post-graduates involved in research. The purpose of this study was to identify the attitude towards Statistics among post-graduates and to determine the relationship between that attitude and the performance of post-graduates of the Faculty of Education, UKM, Bangi. A total of 173 post-graduate students were chosen randomly to participate in the study. These students were registered in the Research Methodology II course offered by the faculty. A survey of attitude towards Statistics using a 5-point Likert scale was used for data collection. The instrument consists of four components: affective, cognitive competency, value and difficulty. The data were analyzed using SPSS version 22 to produce descriptive and inferential statistics. The results showed a moderate, positive relationship between attitude towards statistics and students' performance. In conclusion, educators need to assess students' attitude towards the course to accomplish the learning outcomes.
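
    For illustration, the core of such an analysis can be reproduced with a few lines of Python (a sketch with invented numbers, not the study's data; SciPy is assumed to be available):

        import numpy as np
        from scipy import stats

        # Hypothetical composite attitude scores (mean of Likert items) and course marks.
        attitude = np.array([3.2, 4.1, 2.8, 3.9, 4.5, 3.0, 3.7, 4.0])
        performance = np.array([62, 78, 55, 74, 85, 60, 70, 72])

        # Descriptive summary followed by the inferential step (Pearson's r).
        print("mean attitude:", attitude.mean(), "sd:", attitude.std(ddof=1))
        r, p = stats.pearsonr(attitude, performance)
        print(f"Pearson r = {r:.2f}, p = {p:.3f}")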

  8. Findings From a Nursing Care Audit Based on the Nursing Process: A Descriptive Study.

    Science.gov (United States)

    Poortaghi, Sarieh; Salsali, Mahvash; Ebadi, Abbas; Rahnavard, Zahra; Maleki, Farzaneh

    2015-09-01

    Although using the nursing process improves nursing care quality, few studies have evaluated nursing performance in accordance with nursing process steps either nationally or internationally. This study aimed to audit nursing care based on a nursing process model. This was a cross-sectional descriptive study in which a nursing audit checklist was designed and validated for assessing nurses' compliance with the nursing process. A total of 300 nurses from various clinical settings of Tehran University of Medical Sciences were selected. Data were analyzed using descriptive and inferential statistics, including frequencies, Pearson correlation coefficient and independent samples t-tests. The compliance rate of nursing process indicators was 79.71 ± 0.87. Mean compliance scores did not significantly differ by education level and gender. However, overall compliance scores were correlated with nurses' age (r = 0.26, P = 0.001) and work experience (r = 0.273, P = 0.001). Nursing process indicators can be used to audit nursing care. Such audits can be used as quality assurance tools.

  9. Statistical density of nuclear excited states

    Directory of Open Access Journals (Sweden)

    V. M. Kolomietz

    2015-10-01

    Full Text Available A semi-classical approximation is applied to the calculations of single-particle and statistical level densities in excited nuclei. Landau's conception of quasi-particles with the nucleon effective mass m* < m is used. The approach provides the correct description of the continuum contribution to the level density for realistic finite-depth potentials. It is shown that the continuum states do not significantly affect the thermodynamic calculations for sufficiently small temperatures T ≤ 1 MeV but strongly reduce the results for the excitation energy at high temperatures. By use of a standard Woods-Saxon potential and a nucleon effective mass m* = 0.7m, the A-dependence of the statistical level density parameter K was evaluated in good qualitative agreement with experimental data.

  10. The κ parameter and κ-distribution in κ-deformed statistics for the systems in an external field

    International Nuclear Information System (INIS)

    Guo, Lina; Du, Jiulin

    2007-01-01

    It is a naturally important question to ask under what physical situation the κ-deformed statistics should be suitable for the statistical description of a system and what the κ parameter should stand for. In this Letter, a formula expression for the κ parameter is derived on the basis of the κ-H theorem, the κ-velocity distribution and the generalized Boltzmann equation in the framework of κ-deformed statistics. We thus obtain a physical interpretation of the parameter κ ≠ 0 with regard to the temperature gradient and the external force field. We show that, like the q-statistics based on Tsallis entropy, the κ-deformed statistics may also be a candidate suitable for the statistical description of systems in external fields in a nonequilibrium stationary state, but with different physical characteristics. Namely, the κ-distribution is found to describe the nonequilibrium stationary state of a system where the external force is perpendicular to the temperature gradient

  11. Illinois' Forests, 2005: Statistics, Methods, and Quality Assurance

    Science.gov (United States)

    Susan J. Crocker; Charles J. Barnett; Mark A. Hatfield

    2013-01-01

    The first full annual inventory of Illinois' forests was completed in 2005. This report contains 1) descriptive information on methods, statistics, and quality assurance of data collection, 2) a glossary of terms, 3) tables that summarize quality assurance, and 4) a core set of tabular estimates for a variety of forest resources. A detailed analysis of inventory...

  12. The transportation operations system: A description

    International Nuclear Information System (INIS)

    Best, R.E.; Danese, F.L.; Dixon, L.D.; Peterson, R.W.; Pope, R.B.

    1990-01-01

    This paper presents a description of the system for transporting radioactive waste that may be deployed to accomplish the assigned system mission, which includes accepting spent nuclear fuel (SNF) and high-level radioactive waste (HLW) from waste generator sites and transporting them to the FWMS destination facilities. The system description presented here contains, in part, irradiated fuel and waste casks, ancillary equipment, truck, rail, and barge transporters, cask and vehicle traffic management organizations, maintenance facilities, and other operations elements. The description is for a fully implemented system, which is not expected to be achieved, however, until several years after initial operations. 6 figs

  13. Informing Evidence Based Decisions: Usage Statistics for Online Journal Databases

    Directory of Open Access Journals (Sweden)

    Alexei Botchkarev

    2017-06-01

    Full Text Available Abstract Objective – The primary objective was to examine online journal database usage statistics for a provincial ministry of health in the context of evidence based decision-making. In addition, the study highlights implementation of the Journal Access Centre (JAC that is housed and powered by the Ontario Ministry of Health and Long-Term Care (MOHLTC to inform health systems policy-making. Methods – This was a prospective case study using descriptive analysis of the JAC usage statistics of journal articles from January 2009 to September 2013. Results – JAC enables ministry employees to access approximately 12,000 journals with full-text articles. JAC usage statistics for the 2011-2012 calendar years demonstrate a steady level of activity in terms of searches, with monthly averages of 5,129. In 2009-2013, a total of 4,759 journal titles were accessed including 1,675 journals with full-text. Usage statistics demonstrate that the actual consumption was over 12,790 full-text downloaded articles or approximately 2,700 articles annually. Conclusion – JAC’s steady level of activities, revealed by the study, reflects continuous demand for JAC services and products. It testifies that access to online journal databases has become part of routine government knowledge management processes. MOHLTC’s broad area of responsibilities with dynamically changing priorities translates into the diverse information needs of its employees and a large set of required journals. Usage statistics indicate that MOHLTC information needs cannot be mapped to a reasonably compact set of “core” journals with a subsequent subscription to those.

  14. Statistical methodology for discrete fracture model - including fracture size, orientation uncertainty together with intensity uncertainty and variability

    International Nuclear Information System (INIS)

    Darcel, C.; Davy, P.; Le Goc, R.; Dreuzy, J.R. de; Bour, O.

    2009-11-01

    starting point we built Statistical Fracture Domains whose significance relies exclusively on fracturing statistics, without explicitly including the current Fracture Domains or the closeness between one borehole section and another. Theoretical developments are proposed in order to incorporate the orientation uncertainty and the fracturing variability into a resulting parent distribution density uncertainty. When applied to both sites, it emerges that variability prevails over uncertainty, thus validating the good level of data accuracy. Moreover, this allows us to define a possible range of variation around the mean values of densities. Finally, a sorting algorithm is developed for providing, from the initial elementary bricks mentioned above, a division of a site into Statistical Fracture Domains whose internal variability is reduced. The systematic comparison is based on the division of the datasets according to several densities referring to a division of the orientations into 13 subsets (pole zones). The first application of the methodology shows that some main trends can be defined for the orientation/density distributions throughout the site, combined with a high level of overlapping. Moreover, the final Statistical Fracture Domain definition differs from the Fracture Domains existing at the site. The SFDs provide an objective comparison of statistical fracturing properties. Several perspectives are proposed in order to bridge the gap between the constraints brought by a relevant statistical modeling, the modeling specificities of the SKB sites and, more generally, the conditions inherent to geological models

  15. Waste Management Systems Requirements and Descriptions (SRD)

    International Nuclear Information System (INIS)

    Conner, C.W.

    1986-01-01

    The Department of Energy (DOE), Office of Civilian Radioactive Waste Management (OCRWM) is responsible for the development of a system for the management of high-level radioactive waste and spent fuel in accordance with the Nuclear Waste Policy Act of 1982. The Waste Management system requirements and description document is the program-level technical baseline document. The requirements include the functions that must be performed in order to achieve the system mission and performance criteria for those functions. This document covers only the functional requirements of the system; it does not cover programmatic or procedural requirements pertaining to the processes of designing, siting and licensing. The requirements are largely based on the Nuclear Waste Policy Act of 1982, Environmental Protection Agency standards, Nuclear Regulatory Commission regulations, and DOE orders and guidance. However, nothing in this document should be construed as to relieve the DOE or its contractors from their responsibilities to comply with applicable statutes, regulations, and standards. This document also provides a brief description of the system being developed to meet the requirements. In addition to the described ''authorized system,'' a system description is provided for an ''improved-performance system'' which would include a monitored retrievable storage (MRS) facility. In the event that an MRS facility is approved by Congress, the improved-performance system will become the reference system. Neither system description includes Federal Interim Storage (FIS) capabilities. Should the need for FIS be identified, it will be included as an additional system element. The descriptions are focused on the interfaces between the system elements, rather than on the detail of the system elements themselves

  16. A DESCRIPTION OF BUFO PARDALIS TADPOLES (ANURA ...

    African Journals Online (AJOL)

    Tadpoles of Bufo pardalis Hewitt from Kei Road, Cape Province, are described. INTRODUCTION. Although tadpoles of B. pardalis have been included in Van Dijk's (1971) key to the genus Bufo, no adequate description of this taxon has yet been published. Further studies on variability depend upon a complete description ...

  17. System Design Description for the TMAD Code

    International Nuclear Information System (INIS)

    Finfrock, S.H.

    1995-01-01

    This document serves as the System Design Description (SDD) for the TMAD Code System, which includes the TMAD code and the LIBMAKR code. The SDD provides a detailed description of the theory behind the code, and the implementation of that theory. It is essential for anyone who is attempting to review or modify the code or who otherwise needs to understand the internal workings of the code. In addition, this document includes, in Appendix A, the System Requirements Specification for the TMAD System

  18. Capturing rogue waves by multi-point statistics

    International Nuclear Information System (INIS)

    Hadjihosseini, A; Wächter, Matthias; Peinke, J; Hoffmann, N P

    2016-01-01

    As an example of a complex system with extreme events, we investigate ocean wave states exhibiting rogue waves. We present a statistical method of data analysis based on multi-point statistics which for the first time allows the grasping of extreme rogue wave events in a highly satisfactory statistical manner. The key to the success of the approach is mapping the complexity of multi-point data onto the statistics of hierarchically ordered height increments for different time scales, for which we can show that a stochastic cascade process with Markov properties is governed by a Fokker–Planck equation. Conditional probabilities as well as the Fokker–Planck equation itself can be estimated directly from the available observational data. With this stochastic description surrogate data sets can in turn be generated, which makes it possible to work out arbitrary statistical features of the complex sea state in general, and extreme rogue wave events in particular. The results also open up new perspectives for forecasting the occurrence probability of extreme rogue wave events, and even for forecasting the occurrence of individual rogue waves based on precursory dynamics. (paper)
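
    The first step of such a multi-point analysis, forming height increments over a hierarchy of time scales and examining their statistics, can be sketched as follows (synthetic data and arbitrary scales, purely for illustration):

        import numpy as np

        rng = np.random.default_rng(1)
        # Stand-in for a measured surface-elevation record (synthetic, illustrative only).
        eta = np.cumsum(rng.normal(size=100_000)) * 0.01

        def increments(x, tau):
            """Height increments x(t + tau) - x(t) for one time scale tau."""
            return x[tau:] - x[:-tau]

        # Statistics of the increments across scales; conditional histograms of these
        # increments are the raw material for the Markov / Fokker-Planck analysis.
        for tau in (1, 4, 16, 64):
            inc = increments(eta, tau)
            kurt = np.mean((inc - inc.mean()) ** 4) / inc.var() ** 2 - 3.0
            print(f"tau={tau:3d}  std={inc.std():.3f}  excess kurtosis={kurt:.2f}")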

  19. Quantum formalism for classical statistics

    Science.gov (United States)

    Wetterich, C.

    2018-06-01

    In static classical statistical systems the problem of information transport from a boundary to the bulk finds a simple description in terms of wave functions or density matrices. While the transfer matrix formalism is a type of Heisenberg picture for this problem, we develop here the associated Schrödinger picture that keeps track of the local probabilistic information. The transport of the probabilistic information between neighboring hypersurfaces obeys a linear evolution equation, and therefore the superposition principle for the possible solutions. Operators are associated to local observables, with rules for the computation of expectation values similar to quantum mechanics. We discuss how non-commutativity naturally arises in this setting. Also other features characteristic of quantum mechanics, such as complex structure, change of basis or symmetry transformations, can be found in classical statistics once formulated in terms of wave functions or density matrices. We construct for every quantum system an equivalent classical statistical system, such that time in quantum mechanics corresponds to the location of hypersurfaces in the classical probabilistic ensemble. For suitable choices of local observables in the classical statistical system one can, in principle, compute all expectation values and correlations of observables in the quantum system from the local probabilistic information of the associated classical statistical system. Realizing a static memory material as a quantum simulator for a given quantum system is not a matter of principle, but rather of practical simplicity.

  20. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the year 1998 and the year 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  1. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  2. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  3. A First Assignment to Create Student Buy-In in an Introductory Business Statistics Course

    Science.gov (United States)

    Newfeld, Daria

    2016-01-01

    This paper presents a sample assignment to be administered after the first two weeks of an introductory, business-focused statistics course in order to promote student buy-in. This assignment integrates graphical displays of data, descriptive statistics and cross-tabulation analysis through the lens of a marketing analysis study. A marketing sample…
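
    The three ingredients named here (descriptive statistics, a graphical display and a cross-tabulation) fit in a few lines of pandas; the survey columns below are invented placeholders, not the assignment's actual data:

        import pandas as pd

        # Invented marketing-survey responses for illustration.
        df = pd.DataFrame({
            "age_group": ["18-25", "26-40", "18-25", "41-60", "26-40", "41-60"],
            "bought":    ["yes",   "no",    "yes",   "no",    "yes",   "yes"],
            "spend":     [23.5,    0.0,     41.0,    0.0,     15.0,    60.0],
        })

        print(df["spend"].describe())                      # descriptive statistics
        print(pd.crosstab(df["age_group"], df["bought"]))  # cross-tabulation analysis

        # Graphical display: bar chart of mean spend by age group (requires matplotlib).
        ax = df.groupby("age_group")["spend"].mean().plot(kind="bar")
        ax.set_ylabel("mean spend")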

  4. A Descriptive, Cross-sectional Survey of Turkish Nurses' Knowledge of Pressure Ulcer Risk, Prevention, and Staging.

    Science.gov (United States)

    Gul, Asiye; Andsoy, Isil Isik; Ozkaya, Birgul; Zeydan, Ayten

    2017-06-01

    Nurses' knowledge of pressure ulcer (PU) prevention and management is an important first step in the provision of optimal care. To evaluate PU prevention/risk, staging, and wound description knowledge, a descriptive, cross-sectional survey was conducted among nurses working in an acute care Turkish hospital. The survey instrument was a modified and translated version of the Pieper Pressure Ulcer Knowledge Test (PUKT), and its validity and reliability were established. Nurses completed a Personal Characteristics Form, including sociodemographic information and exposure to educational presentations and information about and experience with PUs, followed by the 49-item modified PUKT which includes 33 prevention/risk items, 9 staging items, and 7 wound description items. All items are true/false questions with an I don't know option (scoring: minimum 0, maximum 49). Correct answers received 1 point and incorrect/unknown answers received 0 points. The paper-pencil questionnaires were distributed by 2 researchers to all nurses in the participating hospital and completed by those willing to be included. Responses were analyzed using descriptive statistics. Pearson's correlation test was used to examine the relationship between quantitative variables, and mean scores were compared using the Mann-Whitney U and Kruskal-Wallis tests. Among the 308 participating nurses (mean age 29.5 ± 8.1 [range 19-56] years) most were women (257, 83.4%) with 7.3 ± 7.8 (range 1-36) years of experience. The mean knowledge score for the entire sample was 29.7 ± 6.7 (range 8-42). The overall percentage of correct answers was 60.6% to 61.8% for PU prevention/risk assessment, 60% for wound description, and 56.6% for PU staging. Knowledge scores were significantly (P pressure on the heels" (22, 7.1%). The results of this study suggest education and experience caring for patients who are at risk for or have a PU affect nurses' knowledge. This study, and additional research examining nurse

  5. Statistical equilibrium and symplectic geometry in general relativity

    International Nuclear Information System (INIS)

    Iglesias, P.

    1981-09-01

    A geometrical construction is given of the statistical equilibrium states of a system of particles in the gravitational field in general relativity. By a method of localization variables, the expression of thermodynamic values is given and the compatibility of this description is shown with a macroscopic model of a relativistic continuous medium for a given value of the free-energy function [fr

  6. Global description of (n,p) - and (n,2n) - activation cross sections within statistical multistep theory

    International Nuclear Information System (INIS)

    Kalka, H.; Torjman, M.; Seeliger, D.; Lopez, R.

    1989-07-01

    A unique description of (n,p) and (n,2n) activation cross sections as well as emission spectra is proposed within a pure multistep approach. Calculations are presented for 8 nuclei (A=47...65) in the incident energy range from zero up to 20 MeV. (author). 42 refs, 5 figs, 1 tab

  7. Principles of classical statistical mechanics: A perspective from the notion of complementarity

    International Nuclear Information System (INIS)

    Velazquez Abad, Luisberis

    2012-01-01

    Quantum mechanics and classical statistical mechanics are two physical theories that share several analogies in their mathematical apparatus and physical foundations. In particular, classical statistical mechanics is hallmarked by the complementarity between two descriptions that are unified in thermodynamics: (i) the parametrization of the system macrostate in terms of mechanical macroscopic observables I = (I_i), and (ii) the dynamical description that explains the evolution of a system towards the thermodynamic equilibrium. As expected, such a complementarity is related to the uncertainty relations of classical statistical mechanics ΔI_i Δη_i ≥ k. Here, k is the Boltzmann constant, η_i = ∂S(I|θ)/∂I_i are the restituting generalized forces derived from the entropy S(I|θ) of a closed system, which is found in an equilibrium situation driven by certain control parameters θ = (θ_α). These arguments constitute the central ingredients of a reformulation of classical statistical mechanics from the notion of complementarity. In this new framework, Einstein's postulate of classical fluctuation theory dp(I|θ) ∼ exp[S(I|θ)/k] dI appears as the correspondence principle between classical statistical mechanics and thermodynamics in the limit k → 0, while the existence of uncertainty relations can be associated with the non-commuting character of certain operators. - Highlights: ► There exists a direct analogy between quantum and classical statistical mechanics. ► The statistical form of the Le Chatelier principle leads to the uncertainty principle. ► Einstein's postulate is simply the correspondence principle. ► Complementary quantities are associated with non-commuting operators.

  8. Emended description of Campylobacter sputorum and revision of its infrasubspecific (biovar) divisions, including C-sputorum biovar paraureolyticus, a urease-producing variant from cattle and humans

    DEFF Research Database (Denmark)

    On, S.L.W.; Atabay, H.I.; Corry, J.E.L.

    1998-01-01

    A polyphasic taxonomic study of 15 bovine and human strains assigned to the catalase-negative, urease-positive campylobacter (CNUPC) group identified these bacteria as a novel, ureolytic biovar of Campylobacter sputorum for which we propose the name C. sputorum bv. paraureolyticus: suitable...... should be revised to include bv. sputorum for catalase-negative strains; bv. fecalis for catalase-positive strains; and bv. paraureolyticus for urease-positive strains. Strains classified previously as bv. bubulus should be reclassified as bv. sputorum. The species description of C. sputorum is revised...

  9. Statistics as Unbiased Estimators: Exploring the Teaching of Standard Deviation

    Science.gov (United States)

    Wasserman, Nicholas H.; Casey, Stephanie; Champion, Joe; Huey, Maryann

    2017-01-01

    This manuscript presents findings from a study about the knowledge for and planned teaching of standard deviation. We investigate how understanding variance as an unbiased (inferential) estimator--not just a descriptive statistic for the variation (spread) in data--is related to teachers' instruction regarding standard deviation, particularly…
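
    A quick simulation makes the distinction that the study builds on concrete: the sample variance with an n-1 denominator (NumPy's ddof=1) is an unbiased estimator of the population variance, while the n denominator underestimates it on average. The numbers below are purely illustrative.

      # Illustrative simulation (not from the study): bias of the two variance estimators.
      import numpy as np

      rng = np.random.default_rng(1)
      samples = rng.normal(0.0, 2.0, size=(100_000, 5))   # many samples of size 5, true variance 4

      print(samples.var(axis=1, ddof=1).mean())   # close to 4.0 (unbiased)
      print(samples.var(axis=1, ddof=0).mean())   # close to 3.2, i.e. (n-1)/n * 4 (biased low)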

  10. Synthetic environmental indicators: A conceptual approach from the multivariate statistics

    International Nuclear Information System (INIS)

    Escobar J, Luis A

    2008-01-01

    This paper presents a general description of multivariate statistical analysis and shows two methodologies: analysis of principal components and analysis of distance, DP2. Both methods use techniques of multivariate analysis to define the true dimension of data, which is useful to estimate indicators of environmental quality.
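
    As an illustration of the first of the two methodologies, the sketch below builds a composite indicator as the first principal component of standardized variables; the site count, variable count and data are invented, and this is not the paper's own implementation.

      # Hedged sketch of a principal-component composite indicator (synthetic data).
      import numpy as np

      rng = np.random.default_rng(2)
      X = rng.normal(size=(50, 4))                       # 50 sites x 4 environmental variables

      Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)   # standardize each variable
      eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
      pc1 = eigvecs[:, -1]                               # loadings of the first component (sign arbitrary)
      indicator = Z @ pc1                                # composite score for each site
      print("share of variance explained:", eigvals[-1] / eigvals.sum())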

  11. Statistical analysis of the description accuracy of dependence of flow stresses upon the deformation rate in the state of superplasticity by phenomenological equations

    International Nuclear Information System (INIS)

    Bojtsov, V.V.; Tsepin, M.A.; Karpilyanskij, N.N.; Ershov, A.N.

    1982-01-01

    Results are given of a statistical analysis of how accurately the superplasticity S-shaped curve is described by different analytic expressions suggested on the basis of phenomenological and metallophysical concepts about the nature of superplastic deformation. Experimental investigations of the dependence of flow stresses on the deformation rate were conducted on the VT3-1 two-phase titanium alloy. Test samples were cut out of a rod, 30 mm in diameter, produced by lengthwise rolling in the α+β region. The optimal temperature for the manifestation of superplasticity was determined by the method of stress relaxation, from the value of the relaxation time to a given stress. It was established that the Smirnov phenomenological equation best describes the rate dependence of the flow stress of a superplastic material. This equation can be used for studying the mechanism and physical nature of superplastic deformation, and for analysing the stress-strain state and the structure of the deformation zone during pressure shaping of superplastic materials, where the deformation rate varies over a considerably wide range (within 7-8 orders of magnitude).

  12. Statistical analysis of dynamic parameters of the core

    International Nuclear Information System (INIS)

    Ionov, V.S.

    2007-01-01

    Transients of various types were investigated for the cores of zero-power critical facilities at RRC KI and at NPPs. The dynamic parameters of the neutron transients were explored by means of statistical analysis. The records have sufficient duration and comprise a few channels for chamber currents and reactivity, as well as some channels for technological parameters. From these values the inverse period, reactivity, neutron lifetime, reactivity coefficients and some reactivity effects were determined, and the values of the measured dynamic parameters were reconstructed as a result of the analysis. The following mathematical means of statistical analysis were used: approximation (A), filtration (F), rejection (R), estimation of descriptive statistics parameters (DSP), correlation characteristics (kk), regression analysis (KP), prognosis (P) and statistical criteria (SC). The calculation procedures were implemented in the MATLAB language. The sources of methodical and statistical errors are presented: inadequacy of the model, precision of the neutron-physical parameters, features of the registered processes, the mathematical model used in reactivity meters, the technique used for processing the registered data, etc. Examples of the results of the statistical analysis are given. Problems concerning the validity of the methods used for the definition and certification of the values of statistical parameters and dynamic characteristics are considered (Authors)

  13. Statistics of Extremes

    KAUST Repository

    Davison, Anthony C.

    2015-04-10

    Statistics of extremes concerns inference for rare events. Often the events have never yet been observed, and their probabilities must therefore be estimated by extrapolation of tail models fitted to available data. Because data concerning the event of interest may be very limited, efficient methods of inference play an important role. This article reviews this domain, emphasizing current research topics. We first sketch the classical theory of extremes for maxima and threshold exceedances of stationary series. We then review multivariate theory, distinguishing asymptotic independence and dependence models, followed by a description of models for spatial and spatiotemporal extreme events. Finally, we discuss inference and describe two applications. Animations illustrate some of the main ideas. © 2015 by Annual Reviews. All rights reserved.
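
    As a minimal illustration of the threshold-exceedance modelling sketched above, the following fits a generalized Pareto distribution to exceedances of a high threshold and extrapolates a far-tail quantile; the data are synthetic and the threshold and quantile level are arbitrary choices.

      # Peaks-over-threshold sketch with synthetic heavy-tailed data.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      x = rng.standard_t(df=4, size=10_000)              # stand-in "observations"

      u = np.quantile(x, 0.95)                           # threshold
      exceedances = x[x > u] - u
      shape, _, scale = stats.genpareto.fit(exceedances, floc=0)

      p_u = (x > u).mean()                               # probability of exceeding the threshold
      q = u + stats.genpareto.ppf(1 - 1e-4 / p_u, shape, loc=0, scale=scale)
      print(f"estimated 1-in-10,000 quantile: {q:.2f}")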

  14. Communication About Sexual Matters With Women Attending a Danish Fertility Clinic: A Descriptive Study

    DEFF Research Database (Denmark)

    Fiil Eldridge, Katrine; Giraldi, Annamaria

    2017-01-01

    Introduction: Several studies have shown that sexuality is an important aspect of life. Nevertheless, sexual matters are only rarely discussed between patients and doctors. Other studies have suggested that women undergoing fertility treatment compose a group of patients with low satisfaction...... in their sexual life. Aim: To investigate how women at a fertility clinic desire and experience communication about sexual matters with doctors and to investigate the sexual function of these women. Methods: A cross-sectional self-administered questionnaire survey of women attending a Danish fertility clinic over...... 4 months was performed. Descriptive statistics were calculated and presented as frequencies. Main Outcome Measure: Communication about sexual matters with doctors included the women’s comfort, preferred and actual frequency of discussion, and initiation of the conversation. Sexual function included...

  15. Statistical approach for collaborative tests, reference material certification procedures

    International Nuclear Information System (INIS)

    Fangmeyer, H.; Haemers, L.; Larisse, J.

    1977-01-01

    The first part introduces the different aspects of organizing and executing intercomparison tests of chemical or physical quantities. This is followed by a description of a statistical procedure for handling the data collected in a circular analysis. Finally, an example demonstrates how the tool can be applied and which conclusions can be drawn from the results obtained

  16. Trends in statistical methods in articles published in Archives of Plastic Surgery between 2012 and 2017.

    Science.gov (United States)

    Han, Kyunghwa; Jung, Inkyung

    2018-05-01

    This review article presents an assessment of trends in statistical methods and an evaluation of their appropriateness in articles published in the Archives of Plastic Surgery (APS) from 2012 to 2017. We reviewed 388 original articles published in APS between 2012 and 2017. We categorized the articles that used statistical methods according to the type of statistical method, the number of statistical methods, and the type of statistical software used. We checked whether there were errors in the description of statistical methods and results. A total of 230 articles (59.3%) published in APS between 2012 and 2017 used one or more statistical method. Within these articles, there were 261 applications of statistical methods with continuous or ordinal outcomes, and 139 applications of statistical methods with categorical outcome. The Pearson chi-square test (17.4%) and the Mann-Whitney U test (14.4%) were the most frequently used methods. Errors in describing statistical methods and results were found in 133 of the 230 articles (57.8%). Inadequate description of P-values was the most common error (39.1%). Among the 230 articles that used statistical methods, 71.7% provided details about the statistical software programs used for the analyses. SPSS was predominantly used in the articles that presented statistical analyses. We found that the use of statistical methods in APS has increased over the last 6 years. It seems that researchers have been paying more attention to the proper use of statistics in recent years. It is expected that these positive trends will continue in APS.
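
    Purely for illustration, the two tests the review found most common can be run as follows; the contingency table and group values are invented, and exact P-values are printed since inadequate P-value description was the most frequent error identified.

      # Hedged sketch of the two most frequent tests reported in the review.
      import numpy as np
      from scipy import stats

      # Pearson chi-square test on a 2x2 table (e.g. complication yes/no by technique)
      table = np.array([[12, 38],
                        [25, 25]])
      chi2, p, dof, expected = stats.chi2_contingency(table)
      print(f"chi-square({dof}) = {chi2:.2f}, P = {p:.3f}")

      # Mann-Whitney U test for a non-normally distributed continuous outcome
      a = [12, 15, 14, 10, 19, 22, 13]
      b = [18, 21, 25, 20, 17, 24, 26]
      u, p = stats.mannwhitneyu(a, b, alternative="two-sided")
      print(f"U = {u:.1f}, P = {p:.3f}")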

  17. Methods in pharmacoepidemiology: a review of statistical analyses and data reporting in pediatric drug utilization studies.

    Science.gov (United States)

    Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio

    2013-03-01

    To evaluate the quality of data reporting and statistical methods performed in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of the drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four studies received a score of statistical methods and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.

  18. A theoretical description of inhomogeneous turbulence

    International Nuclear Information System (INIS)

    Turner, L.

    2000-01-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). In this LDRD, we have developed a highly compact and descriptive formalism that allows us to broach the theoretically formidable morass of inhomogeneous turbulence. Our formalism has two novel aspects: (a) an adaptation of helicity basis functions to represent an arbitrary incompressible channel flow and (b) the invocation of a hypothesis of random phase. A result of this compact formalism is that the mathematical description of inhomogeneous turbulence looks much like that of homogeneous turbulence--at the moment, the most rigorously explored terrain in turbulence research. As a result, we can explore the effect of boundaries on such important quantities as the gradients of mean flow, mean pressure, triple-velocity correlations and pressure velocity correlations, all of which vanish under the conventional, but artificial, assumption that the turbulence is statistically spatially uniform. Under suitable conditions, we have predicted that a mean flow gradient can develop even when none is initially present

  19. Equilibrium statistical mechanics

    CERN Document Server

    Jackson, E Atlee

    2000-01-01

    Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t

  20. A Case for Including Atmospheric Thermodynamic Variables in Wind Turbine Fatigue Loading Parameter Identification

    International Nuclear Information System (INIS)

    Kelley, Neil D.

    1999-01-01

    This paper makes the case for establishing efficient predictor variables for atmospheric thermodynamics that can be used to statistically correlate the fatigue accumulation seen on wind turbines. Recently, two approaches to this issue have been reported. One uses multiple linear-regression analysis to establish the relative causality between a number of predictors related to the turbulent inflow and turbine loads. The other approach, using many of the same predictors, applies the technique of principal component analysis. An examination of the ensemble of predictor variables revealed that they were all kinematic in nature; i.e., they were only related to the description of the velocity field. Boundary-layer turbulence dynamics depends upon a description of the thermal field and its interaction with the velocity distribution. We used a series of measurements taken within a multi-row wind farm to demonstrate the need to include atmospheric thermodynamic variables as well as velocity-related ones in the search for efficient turbulence loading predictors in various turbine-operating environments. Our results show that a combination of vertical stability and hub-height mean shearing stress variables meet this need over a period of 10 minutes
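
    A minimal sketch of the multiple linear-regression idea cited above, with one thermodynamic predictor (a vertical stability measure) added to the kinematic ones; every predictor column, coefficient and noise level below is a synthetic stand-in, not the paper's data.

      # Hedged regression sketch with synthetic predictors.
      import numpy as np

      rng = np.random.default_rng(8)
      n = 200
      turbulence_intensity = rng.uniform(0.05, 0.25, n)   # kinematic predictor
      shear_stress = rng.uniform(0.0, 1.0, n)             # hub-height mean shearing stress
      stability = rng.uniform(-0.2, 0.3, n)               # vertical stability (thermodynamic)
      fatigue = (2.0 + 8.0 * turbulence_intensity + 1.5 * shear_stress
                 + 3.0 * stability + rng.normal(0.0, 0.3, n))   # synthetic fatigue load

      X = np.column_stack([np.ones(n), turbulence_intensity, shear_stress, stability])
      coef, *_ = np.linalg.lstsq(X, fatigue, rcond=None)
      print("intercept and coefficients:", np.round(coef, 2))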

  1. 40 CFR 239.4 - Narrative description of state permit program.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Narrative description of state permit program. 239.4 Section 239.4 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID... Narrative description of state permit program. The description of a state's program must include: (a) An...

  2. Statistical modelling of transcript profiles of differentially regulated genes

    Directory of Open Access Journals (Sweden)

    Sergeant Martin J

    2008-07-01

    Full Text Available Abstract Background The vast quantities of gene expression profiling data produced in microarray studies, and the more precise quantitative PCR, are often not statistically analysed to their full potential. Previous studies have summarised gene expression profiles using simple descriptive statistics, basic analysis of variance (ANOVA) and the clustering of genes based on simple models fitted to their expression profiles over time. We report the novel application of statistical non-linear regression modelling techniques to describe the shapes of expression profiles for the fungus Agaricus bisporus, quantified by PCR, and for E. coli and Rattus norvegicus, using microarray technology. The use of parametric non-linear regression models provides a more precise description of expression profiles, reducing the "noise" of the raw data to produce a clear "signal" given by the fitted curve, and describing each profile with a small number of biologically interpretable parameters. This approach then allows the direct comparison and clustering of the shapes of response patterns between genes and potentially enables a greater exploration and interpretation of the biological processes driving gene expression. Results Quantitative reverse transcriptase PCR-derived time-course data of genes were modelled. "Split-line" or "broken-stick" regression identified the initial time of gene up-regulation, enabling the classification of genes into those with primary and secondary responses. Five-day profiles were modelled using the biologically-oriented, critical exponential curve, y(t) = A + (B + Ct)R^t + ε. This non-linear regression approach allowed the expression patterns for different genes to be compared in terms of curve shape, time of maximal transcript level and the decline and asymptotic response levels. Three distinct regulatory patterns were identified for the five genes studied. Applying the regression modelling approach to microarray-derived time course data
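
    The critical exponential curve quoted above, y(t) = A + (B + Ct)R^t, is straightforward to fit with standard tools; the sketch below uses SciPy on invented time points, transcript values and starting values, and is not the authors' code.

      # Hedged curve-fitting sketch for the critical exponential profile.
      import numpy as np
      from scipy.optimize import curve_fit

      def critical_exponential(t, A, B, C, R):
          return A + (B + C * t) * R ** t

      t = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])            # days
      y = np.array([0.1, 1.8, 3.9, 4.1, 3.2, 2.5])            # transcript level (arbitrary units)

      popt, _ = curve_fit(critical_exponential, t, y, p0=[2.0, -2.0, 3.0, 0.6], maxfev=20000)
      print("A, B, C, R =", np.round(popt, 2))

      grid = np.linspace(0.0, 5.0, 501)
      t_max = grid[np.argmax(critical_exponential(grid, *popt))]
      print("time of maximal transcript level (days):", round(t_max, 2))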

  3. Statistics Clinic

    Science.gov (United States)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  4. Between Certainty and Uncertainty Statistics and Probability in Five Units with Notes on Historical Origins and Illustrative Numerical Examples

    CERN Document Server

    Laudański, Ludomir M

    2013-01-01

    „Between Certainty & Uncertainty” is a one-of-a-kind short course on statistics for students, engineers and researchers. It is a fascinating introduction to statistics and probability with notes on historical origins and 80 illustrative numerical examples organized in the five units:
    · Chapter 1 Descriptive Statistics: Compressing small samples, basic averages - mean and variance, their main properties including God's proof; linear transformations and z-scored statistics.
    · Chapter 2 Grouped data: Udny Yule's concept of qualitative and quantitative variables. Grouping these two kinds of data. Graphical tools. Combinatorial rules and qualitative variables. Designing frequency histograms. Direct and coded evaluation of quantitative data. Significance of percentiles.
    · Chapter 3 Regression and correlation: Geometrical distance and equivalent distances in two orthogonal directions as a prerequisite to the concept of two regressi...

  5. Thermal and statistical properties of nuclei and nuclear systems

    International Nuclear Information System (INIS)

    Moretto, L.G.; Wozniak, G.J.

    1989-07-01

    The terms statistical decay, statistical or thermodynamic equilibrium, thermalization, temperature, etc., have been used in nuclear physics since the introduction of the compound nucleus (CN) concept, and they are still used, perhaps even more frequently, in the context of intermediate- and high-energy heavy-ion reactions. Unfortunately, the increased popularity of these terms has not made them any clearer, and more often than not one encounters sweeping statements about the alleged statisticity of a nuclear process where the ''statistical'' connotation is a more apt description of the state of the speaker's mind than of the nuclear reaction. It is our goal, in this short set of lectures, to set at least some ideas straight on this broad and beautiful subject, on the one hand by clarifying some fundamental concepts, on the other by presenting some interesting applications to actual physical cases. 74 refs., 38 figs

  6. National transportation statistics 2011

    Science.gov (United States)

    2011-04-01

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics : (BTS), National Transportation Statistics presents information on the U.S. transportation system, including : its physical components, safety reco...

  7. National Transportation Statistics 2008

    Science.gov (United States)

    2009-01-08

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record...

  8. National Transportation Statistics 2009

    Science.gov (United States)

    2010-01-21

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...

  9. Applying contemporary statistical techniques

    CERN Document Server

    Wilcox, Rand R

    2003-01-01

    Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.* Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods* Covers the latest developments on multiple comparisons * Includes recent advanc

  10. Advances in statistics

    Science.gov (United States)

    Howard Stauffer; Nadav Nur

    2005-01-01

    The papers included in the Advances in Statistics section of the Partners in Flight (PIF) 2002 Proceedings represent a small sample of statistical topics of current importance to Partners In Flight research scientists: hierarchical modeling, estimation of detection probabilities, and Bayesian applications. Sauer et al. (this volume) examines a hierarchical model...

  11. Description, prescription and the choice of discount rates

    International Nuclear Information System (INIS)

    Baum, Seth D.

    2009-01-01

    The choice of discount rates is a key issue in the analysis of long-term societal issues, in particular environmental issues such as climate change. Approaches to choosing discount rates are generally placed into two categories: the descriptive approach and the prescriptive approach. The descriptive approach is often justified on grounds that it uses a description of how society discounts instead of having analysts impose their own discounting views on society. This paper analyzes the common forms of the descriptive and prescriptive approaches and finds that, in contrast with customary thinking, both forms are equally descriptive and prescriptive. The prescriptions concern who has standing (i.e. who is included) in society, how the views of these individuals are measured, and how the measurements are aggregated. Such prescriptions are necessary to choose from among the many possible descriptions of how society discounts. The descriptions are the measurements made given a choice of measurement technique. Thus, the labels 'descriptive approach' and 'prescriptive approach' are deeply misleading, as analysts cannot avoid imposing their own views on society. (author)

  12. National transportation statistics 2010

    Science.gov (United States)

    2010-01-01

    National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...

  13. Statistics for economics

    CERN Document Server

    Naghshpour, Shahdad

    2012-01-01

    Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...

  14. Experimental Mathematics and Computational Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David H.; Borwein, Jonathan M.

    2009-04-30

    The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include applications of experimental mathematics in statistics as well as statistical methods applied to computational mathematics.

  15. Understanding statistical concepts using S-PLUS

    CERN Document Server

    Schumacker, Randall E

    2001-01-01

    Written as a supplemental text for an introductory or intermediate statistics course, this book is organized along the lines of many popular statistics texts. The chapters provide a good conceptual understanding of basic statistics and include exercises that use S-PLUS simulation programs. Each chapter lists a set of objectives and a summary.The book offers a rich insight into how probability has shaped statistical procedures in the behavioral sciences, as well as a brief history behind the creation of various statistics. Computational skills are kept to a minimum by including S-PLUS programs

  16. Findings From a Nursing Care Audit Based on the Nursing Process: A Descriptive Study

    Science.gov (United States)

    Poortaghi, Sarieh; Salsali, Mahvash; Ebadi, Abbas; Rahnavard, Zahra; Maleki, Farzaneh

    2015-01-01

    Background: Although using the nursing process improves nursing care quality, few studies have evaluated nursing performance in accordance with nursing process steps either nationally or internationally. Objectives: This study aimed to audit nursing care based on a nursing process model. Patients and Methods: This was a cross-sectional descriptive study in which a nursing audit checklist was designed and validated for assessing nurses' compliance with the nursing process. A total of 300 nurses from various clinical settings of Tehran University of Medical Sciences were selected. Data were analyzed using descriptive and inferential statistics, including frequencies, the Pearson correlation coefficient and independent-samples t-tests. Results: The compliance rate of nursing process indicators was 79.71 ± 0.87. Mean compliance scores did not significantly differ by education level or gender. However, overall compliance scores were correlated with nurses' age (r = 0.26, P = 0.001) and work experience (r = 0.273, P = 0.001). Conclusions: Nursing process indicators can be used to audit nursing care. Such audits can be used as quality assurance tools. PMID:26576448

  17. Meta-analyses and Forest plots using a microsoft excel spreadsheet: step-by-step guide focusing on descriptive data analysis.

    Science.gov (United States)

    Neyeloff, Jeruza L; Fuchs, Sandra C; Moreira, Leila B

    2012-01-20

    Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but there was no previous guide available. We constructed a step-by-step guide to perform a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. It is possible to conduct a meta-analysis using only Microsoft Excel. More importantly, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software.
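
    The guide itself is spreadsheet-based; purely as a cross-check of the arithmetic involved, the sketch below shows the usual fixed-effect (inverse-variance) pooling of prevalence estimates that such a spreadsheet implements, with invented study numbers (a forest plot could then be drawn from the same quantities).

      # Hedged fixed-effect pooling sketch (invented studies).
      import numpy as np

      events = np.array([12, 30, 8, 22])
      totals = np.array([100, 250, 60, 180])

      p = events / totals                         # prevalence per study
      var = p * (1 - p) / totals                  # variance of each estimate
      w = 1.0 / var                               # inverse-variance weights

      pooled = np.sum(w * p) / np.sum(w)
      se = np.sqrt(1.0 / np.sum(w))
      lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
      print(f"pooled prevalence = {pooled:.3f} (95% CI {lo:.3f} to {hi:.3f})")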

  18. Meta-analyses and Forest plots using a microsoft excel spreadsheet: step-by-step guide focusing on descriptive data analysis

    Directory of Open Access Journals (Sweden)

    Neyeloff Jeruza L

    2012-01-01

    Full Text Available Abstract Background Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but there was no previous guide available. Findings We constructed a step-by-step guide to perform a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. Conclusions It is possible to conduct a meta-analysis using only Microsoft Excel. More importantly, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software.

  19. Mathematical problem solving ability of sport students in the statistical study

    Science.gov (United States)

    Sari, E. F. P.; Zulkardi; Putri, R. I. I.

    2017-12-01

    This study aims to determine the mathematical problem-solving ability of semester V sport students of PGRI Palembang in the statistics course. The subjects were 31 semester V sport students of PGRI Palembang. The research method used was a quasi-experimental one-shot case study. Data were collected through a test and analyzed using quantitative descriptive statistics. The study concludes that the mathematical problem-solving ability of the semester V PGRI Palembang sport students in the statistics course is categorized as good, with an average final test score of 80.3.

  20. Statistical data analysis using SAS intermediate statistical methods

    CERN Document Server

    Marasinghe, Mervyn G

    2018-01-01

    The aim of this textbook (previously titled SAS for Data Analytics) is to teach the use of SAS for statistical analysis of data for advanced undergraduate and graduate students in statistics, data science, and disciplines involving analyzing data. The book begins with an introduction beyond the basics of SAS, illustrated with non-trivial, real-world, worked examples. It proceeds to SAS programming and applications, SAS graphics, statistical analysis of regression models, analysis of variance models, analysis of variance with random and mixed effects models, and then takes the discussion beyond regression and analysis of variance to conclude. Pedagogically, the authors introduce theory and methodological basis topic by topic, present a problem as an application, followed by a SAS analysis of the data provided and a discussion of results. The text focuses on applied statistical problems and methods. Key features include: end of chapter exercises, downloadable SAS code and data sets, and advanced material suitab...

  1. 2017 Annual Disability Statistics Supplement

    Science.gov (United States)

    Lauer, E. A; Houtenville, A. J.

    2018-01-01

    The "Annual Disability Statistics Supplement" is a companion report to the "Annual Disability Statistics Compendium." The "Supplement" presents statistics on the same topics as the "Compendium," with additional categorizations by demographic characteristics including age, gender and race/ethnicity. In…

  2. Nuclear medicine statistics

    International Nuclear Information System (INIS)

    Martin, P.M.

    1977-01-01

    Numerical description of medical and biologic phenomena is proliferating. Laboratory studies on patients now yield measurements of at least a dozen indices, each with its own normal limits. Within nuclear medicine, numerical analysis as well as numerical measurement and the use of computers are becoming more common. While the digital computer has proved to be a valuable tool for measurement and analysis of imaging and radioimmunoassay data, it has created more work in that users now ask for more detailed calculations and for indices that measure the reliability of quantified observations. The following material is presented with the intention of providing a straightforward methodology to determine values for some useful parameters and to estimate the errors involved. The process used is that of asking relevant questions and then providing answers by illustrations. It is hoped that this will help the reader avoid an error of the third kind, that is, the error of statistical misrepresentation or inadvertent deception. This occurs most frequently in cases where the right answer is found to the wrong question. The purposes of this chapter are: (1) to provide some relevant statistical theory, using a terminology suitable for the nuclear medicine field; (2) to demonstrate the application of a number of statistical methods to the kinds of data commonly encountered in nuclear medicine; (3) to provide a framework to assist the experimenter in choosing the method and the questions most suitable for the experiment at hand; and (4) to present a simple approach for a quantitative quality control program for scintillation cameras and other radiation detectors

  3. Australian black coal statistics 1991

    Energy Technology Data Exchange (ETDEWEB)

    1992-01-01

    This third edition of Australian black coal statistics covers anthracite, bituminous and subbituminous coals. It includes maps and figures on resources and coal fields and statistics (mainly based on the calendar year 1991) on coal demand and supply, production, employment and productivity in Australian coal mines, exports, prices and ports, and domestic consumption. A listing of coal producers by state is included. A final section presents key statistics on international world trade in 1991. 54 tabs.

  4. Management control system description

    Energy Technology Data Exchange (ETDEWEB)

    Bence, P. J.

    1990-10-01

    This Management Control System (MCS) description describes the processes used to manage the cost and schedule of work performed by Westinghouse Hanford Company (Westinghouse Hanford) for the US Department of Energy, Richland Operations Office (DOE-RL), Richland, Washington. Westinghouse Hanford will maintain and use formal cost and schedule management control systems, as presented in this document, in performing work for the DOE-RL. This MCS description is a controlled document and will be modified or updated as required. This document must be approved by the DOE-RL; thereafter, any significant change will require DOE-RL concurrence. Westinghouse Hanford is the DOE-RL operations and engineering contractor at the Hanford Site. Activities associated with this contract (DE-AC06-87RL10930) include operating existing plant facilities, managing defined projects and programs, and planning future enhancements. This document is designed to comply with Section I-13 of the contract by providing a description of Westinghouse Hanford's cost and schedule control systems used in managing the above activities. 5 refs., 22 figs., 1 tab.

  5. Statistical planning of experiments applied in zeolite 4A synthesis

    International Nuclear Information System (INIS)

    Santos, Armindo; Santos, Liessi Luiz; Oliveira, Maria Lucia M. de; Pinto, Joao Mario Andrade

    1995-01-01

    Zeolite, an aluminum silicate that can be used in high-level radioactive waste immobilization, is presented. A brief description of various aspects of 4A zeolite is given, emphasizing the results of a two-level fractional factorial statistical design, without replication, applied to the synthesis of this compound. (author). 7 refs., 3 figs
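
    For readers unfamiliar with the design mentioned, the sketch below generates a small two-level fractional factorial plan (a 2^(3-1) half fraction with generator C = AB); the factors are placeholders, and the actual design used in the study is not reproduced here.

      # Hedged sketch of a two-level half-fraction design (placeholder factors).
      import itertools

      runs = []
      for a, b in itertools.product([-1, 1], repeat=2):
          c = a * b                       # defining relation I = ABC
          runs.append((a, b, c))

      for run in runs:
          print(run)                      # coded levels for factors A, B, C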

  6. Trends in violent crime: a comparison between police statistics and victimization surveys

    NARCIS (Netherlands)

    Wittebrood, Karin; Junger, Marianne

    2002-01-01

    Usually, two measures are used to describe trends in violent crime: police statistics and victimization surveys. Both are available in the Netherlands. In this contribution, we will first provide a description of the trends in violent crime. It appears that both types of statistics reflect a different

  7. Experimental investigation and physical description of stratified flow in horizontal channels

    International Nuclear Information System (INIS)

    Staebler, T.

    2007-05-01

    The interaction between a liquid film and turbulent gas flows plays an important role in many technical applications (e.g. in hydraulic engineering, process engineering and nuclear engineering). The local kinematic and turbulent time-averaged flow quantities for counter-current stratified flows (supercritical and subcritical flows with and without flow reversal) have been measured for the first time. To this end, the method of Particle Image Velocimetry was applied. By using fluorescent particles in combination with an optical filter it was possible to determine the flow quantities of the liquid phase up to the free surface. Additionally, the gaseous phase was investigated by using the light scattered by conventional particles. A further measurement technique was used to determine the void fraction distribution over the channel height. For this purpose, a single-tip conductivity probe was developed. Furthermore, water delivery rates and pressure losses along the test section were measured over a wide range of parameters. The measurements also revealed new details on the hysteresis effect after the occurrence of flow reversal. The experimental findings were used to develop and validate a statistical model in which the liquid phase is considered to be an agglomeration of interacting particles. The statistical consideration of the particle interactions delivers a differential equation which can be used to predict the local void fraction distribution from the local turbulent kinetic energies of the liquid phase. Beyond that, an additional statistical description is presented in which the probability density functions of the local void fraction are described by beta-functions. Both theoretical approaches can be used for numerical modelling, whereas the statistical model can be used to describe the phase interactions and the statistical description to describe the turbulent fluctuations of the local void fraction. Thus, this work has made available all necessary
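
    The second statistical description mentioned above represents the probability density of the local void fraction with beta functions; as a simple illustration of that idea (and not of the author's model), the sketch below fits a beta PDF to synthetic void-fraction samples lying in [0, 1].

      # Hedged beta-PDF fit to synthetic void-fraction data.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      void_fraction = rng.beta(2.0, 5.0, size=2000)          # stand-in for probe measurements

      a, b, _, _ = stats.beta.fit(void_fraction, floc=0, fscale=1)
      print(f"fitted beta parameters: a = {a:.2f}, b = {b:.2f}")

      x = np.linspace(0.0, 1.0, 101)
      pdf = stats.beta.pdf(x, a, b)                          # fitted density, e.g. for plotting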

  8. An invariant approach to statistical analysis of shapes

    CERN Document Server

    Lele, Subhash R

    2001-01-01

    INTRODUCTION: A Brief History of Morphometrics; Foundations for the Study of Biological Forms; Description of the Data Sets. MORPHOMETRIC DATA: Types of Morphometric Data; Landmark Homology and Correspondence; Collection of Landmark Coordinates; Reliability of Landmark Coordinate Data; Summary. STATISTICAL MODELS FOR LANDMARK COORDINATE DATA: Statistical Models in General; Models for Intra-Group Variability; Effect of Nuisance Parameters; Invariance and Elimination of Nuisance Parameters; A Definition of Form; Coordinate System Free Representation of Form; Est...

  9. Are medical articles highlighting detailed statistics more cited?

    Directory of Open Access Journals (Sweden)

    Mike Thelwall

    2015-06-01

    Full Text Available When conducting a literature review, it is natural to search for articles and read their abstracts in order to select papers to read fully. Hence, informative abstracts are important to ensure that research is read. The description of a paper's methods may help to give confidence that a study is of high quality. This article assesses whether medical articles that mention three statistical methods, each of which is arguably indicative of a more detailed statistical analysis than average, are more highly cited. The results show that medical articles mentioning Bonferroni corrections, bootstrapping and effect size tend to be 7%, 8% and 15% more highly ranked for citations than average, respectively. Although this is consistent with the hypothesis that mentioning more detailed statistical techniques generate more highly cited research, these techniques may also tend to be used in more highly cited areas of Medicine.
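
    The three marker techniques are easy to illustrate; the sketch below (with invented data) shows a Bonferroni adjustment, a bootstrap confidence interval for a difference in means, and Cohen's d as an effect size.

      # Hedged illustration of the three techniques (invented data).
      import numpy as np

      rng = np.random.default_rng(5)
      a = rng.normal(0.0, 1.0, 40)
      b = rng.normal(0.5, 1.0, 40)

      # Bonferroni correction: multiply each P-value by the number of comparisons
      pvals = np.array([0.01, 0.04, 0.20])
      print("Bonferroni-adjusted:", np.minimum(pvals * len(pvals), 1.0))

      # Bootstrap 95% CI for the difference in means (resampling with replacement)
      diffs = [rng.choice(b, b.size).mean() - rng.choice(a, a.size).mean()
               for _ in range(5000)]
      print("bootstrap 95% CI:", np.percentile(diffs, [2.5, 97.5]))

      # Effect size (Cohen's d) for two independent groups
      pooled_sd = np.sqrt(((a.size - 1) * a.var(ddof=1) + (b.size - 1) * b.var(ddof=1))
                          / (a.size + b.size - 2))
      print("Cohen's d:", (b.mean() - a.mean()) / pooled_sd)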

  10. A Divergence Statistics Extension to VTK for Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bennett, Janine Camille [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    This report follows the series of previous documents ([PT08, BPRT09b, PT09, BPT09, PT10, PB13]), where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
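
    The engine itself is C++/VTK code; as a language-neutral illustration of the underlying idea, the sketch below computes a Kullback-Leibler divergence between a binned empirical sample and a theoretical distribution. Treating KL divergence as a representative discrepancy measure is an assumption made here for illustration, not a statement about the engine's exact definition.

      # Hedged sketch: discrepancy between an empirical and an "ideal" distribution.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(6)
      observed = rng.normal(0.2, 1.1, 5000)                  # empirical sample

      edges = np.linspace(-4.0, 4.0, 41)
      emp, _ = np.histogram(observed, bins=edges)
      emp = emp / emp.sum()                                  # empirical bin probabilities

      centers = 0.5 * (edges[:-1] + edges[1:])
      theo = stats.norm.pdf(centers, 0.0, 1.0)
      theo = theo / theo.sum()                               # theoretical bin probabilities

      print("KL divergence:", stats.entropy(emp, theo))      # D_KL(empirical || theoretical)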

  11. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g., Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products

  12. Analysis of Statistical Methods Currently used in Toxicology Journals.

    Science.gov (United States)

    Na, Jihye; Yang, Hyeri; Bae, SeungJin; Lim, Kyung-Min

    2014-09-01

    Statistical methods are frequently used in toxicology, yet it is not clear whether the methods employed by the studies are used consistently and conducted on sound statistical grounds. The purpose of this paper is to describe the statistical methods used in top toxicology journals. More specifically, we sampled 30 papers published in 2014 from Toxicology and Applied Pharmacology, Archives of Toxicology, and Toxicological Science and described the methodologies used to provide descriptive and inferential statistics. One hundred thirteen endpoints were observed in those 30 papers, and most studies had sample sizes of less than 10, with the median and the mode being 6 and 3 & 6, respectively. The mean (105/113, 93%) was predominantly used to measure central tendency, and the standard error of the mean (64/113, 57%) and the standard deviation (39/113, 34%) were used to measure dispersion, while few studies provided justification for why these methods were selected. Inferential statistics were frequently conducted (93/113, 82%), with one-way ANOVA being the most popular (52/93, 56%), yet few studies conducted either normality or equal-variance tests. These results suggest that more consistent and appropriate use of statistical methods is necessary, which may enhance the role of toxicology in public health.

  13. Graphic Description: The Mystery of Ibn Khafaja's Success in Description

    Directory of Open Access Journals (Sweden)

    جواد رنجبر

    2009-12-01

    Full Text Available Ibn Khafaja is one of the poets and men of letters of Spain, known as the Sanobari of Spain, and one of the masters of description. Hence, an analysis of the successful techniques he used in the descriptive art could illuminate the way for others. Al-Taswir al-harfi (graphic description) is a term which denotes the highest and most detailed poems; on this basis, the best descriptive poem is one which is closer to a painting. Ibn Khafaja used some elements, called the conforming elements of description, which comprise imagination, feeling, faculty and dialogue, as well as three other elements: an inborn gift for description, an enchanting nature and a convenient life. This article analyses the reasons for Ibn Khafaja's success in description and portrait making. Key words: Ibn Khafaja, poetry, description, portrait

  14. Converting Taxonomic Descriptions to New Digital Formats

    Directory of Open Access Journals (Sweden)

    Hong Cui

    2008-01-01

    Full Text Available Abstract.--The majority of taxonomic descriptions are currently in print format. Most digital descriptions are in formats such as DOC, HTML, or PDF and are intended for human readers. These formats do not convey the rich semantics of taxonomic descriptions for computer-aided processing. Newer digital formats such as XML and RDF accommodate semantic annotations that allow computers to process the rich semantics on humans' behalf, thus opening up opportunities for a wide range of innovative uses of taxonomic descriptions, such as searching in more precise and flexible ways, integrating with genomic and geographic information, generating taxonomic keys automatically, and text data mining and information visualization. This paper discusses the challenges in the automated conversion of multiple collections of descriptions to XML format and reports an automated system, MARTT. MARTT is a machine-learning system that makes use of training examples to tag new descriptions into XML format. A number of utilities are implemented as solutions to the challenges. The utilities are used to reduce the effort of training example preparation, to facilitate the creation of a comprehensive schema, and to predict system performance on a new collection of descriptions. The system has been tested with several plant and alga taxonomic publications, including Flora of China and Flora of North America.

  15. Employing Picture Description to Assess the Students' Descriptive Paragraph Writing

    Directory of Open Access Journals (Sweden)

    Ida Ayu Mega Cahyani

    2018-03-01

    Full Text Available Writing is considered an important skill in the learning process that students need to master. However, in teaching and learning at schools and universities, the assessment of writing skill is often not the focus of the learning process and is administered inappropriately. In the present study, the researcher assessed the descriptive paragraph writing ability of students through picture description, employing an ex post facto research design. The study was intended to answer the research problem concerning the extent of the students' achievement in descriptive paragraph writing as assessed through picture description. The sample comprised 40 students selected by means of a random sampling technique with a lottery system. The data were collected by administering a picture description task as the research instrument and were analyzed using a norm-referenced measure with five standard values. The results showed that 67.50% of the sample were successful in writing a descriptive paragraph, while 32.50% were unsuccessful, as assessed by the picture description test.

  16. The AutoBayes Program Synthesis System: System Description

    Science.gov (United States)

    Fischer, Bernd; Pressburger, Thomas; Rosu, Grigore; Schumann, Johann; Norvog, Peter (Technical Monitor)

    2001-01-01

    AUTOBAYES is a fully automatic program synthesis system for the statistical data analysis domain. Its input is a concise description of a data analysis problem in the form of a statistical model; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. AUTOBAYES synthesizes code by a schema-guided deductive process. Schemas (i.e., code templates with associated semantic constraints) are applied to the original problem and recursively to emerging subproblems. AUTOBAYES complements this approach by symbolic computation to derive closed-form solutions whenever possible. In this paper, we concentrate on the interaction between the symbolic computations and the deductive synthesis process. A statistical model specifies for each problem variable (i.e., data or parameter) its properties and dependencies in the form of a probability distribution. A typical data analysis task is to estimate the best possible parameter values from the given observations or measurements. The following example models normally distributed data but takes prior information (e.g., from previous experiments) on the data's mean value and variance into account.

  17. Statistical methods for quantitative mass spectrometry proteomic experiments with labeling

    Directory of Open Access Journals (Sweden)

    Oberg Ann L

    2012-11-01

    Full Text Available Abstract Mass Spectrometry utilizing labeling allows multiple specimens to be subjected to mass spectrometry simultaneously. As a result, between-experiment variability is reduced. Here we describe use of fundamental concepts of statistical experimental design in the labeling framework in order to minimize variability and avoid biases. We demonstrate how to export data in the format that is most efficient for statistical analysis. We demonstrate how to assess the need for normalization, perform normalization, and check whether it worked. We describe how to build a model explaining the observed values and test for differential protein abundance along with descriptive statistics and measures of reliability of the findings. Concepts are illustrated through the use of three case studies utilizing the iTRAQ 4-plex labeling protocol.

  18. Statistical methods for quantitative mass spectrometry proteomic experiments with labeling.

    Science.gov (United States)

    Oberg, Ann L; Mahoney, Douglas W

    2012-01-01

    Mass Spectrometry utilizing labeling allows multiple specimens to be subjected to mass spectrometry simultaneously. As a result, between-experiment variability is reduced. Here we describe use of fundamental concepts of statistical experimental design in the labeling framework in order to minimize variability and avoid biases. We demonstrate how to export data in the format that is most efficient for statistical analysis. We demonstrate how to assess the need for normalization, perform normalization, and check whether it worked. We describe how to build a model explaining the observed values and test for differential protein abundance along with descriptive statistics and measures of reliability of the findings. Concepts are illustrated through the use of three case studies utilizing the iTRAQ 4-plex labeling protocol.

  19. Emended description of Pasteuria nishizawae.

    Science.gov (United States)

    Noel, Gregory R; Atibalentja, N; Domier, Leslie L

    2005-07-01

    The description of the Gram-positive, obligately parasitic, mycelial and endospore-forming bacterium, Pasteuria nishizawae, is emended to include additional observations on the life cycle, host specificity and endospore morphology. The nucleotide sequence of the 16S rRNA gene is also provided.

  20. New advances in the statistical parton distributions approach*

    Directory of Open Access Journals (Sweden)

    Soffer Jacques

    2016-01-01

    Full Text Available The quantum statistical parton distributions approach proposed more than one decade ago is revisited by considering a larger set of recent and accurate Deep Inelastic Scattering experimental results. It enables us to improve the description of the data by means of a new determination of the parton distributions. This global next-to-leading order QCD analysis leads to a good description of several structure functions, involving unpolarized parton distributions and helicity distributions, in terms of a rather small number of free parameters. There are many serious challenging issues. The predictions of this theoretical approach will be tested for single-jet production and charge asymmetry in W± production in p̄p and pp collisions up to LHC energies, using recent data and also for forthcoming experimental results.

  1. Handbook of Spatial Statistics

    CERN Document Server

    Gelfand, Alan E

    2010-01-01

    Offers an introduction detailing the evolution of the field of spatial statistics. This title focuses on the three main branches of spatial statistics: continuous spatial variation (point referenced data); discrete spatial variation, including lattice and areal unit data; and, spatial point patterns.

  2. Strategies for improving utilization of computerized statistical data by the social science community.

    OpenAIRE

    Robbin, Alice

    1981-01-01

    In recent decades there has been a notable expansion of statistical data produced by the public and private sectors for administrative, research, policy and evaluation programs. This is due to advances in relatively inexpensive and efficient data collection and management of computer-readable statistical data. Corresponding changes have not occurred in the management of data collection, preservation, description and dissemination. As a result, the process by which data become accessible to so...

  3. A statistical approach to evaluate hydrocarbon remediation in the unsaturated zone

    International Nuclear Information System (INIS)

    Hajali, P.; Marshall, T.; Overman, S.

    1991-01-01

    This paper presents an evaluation of performance and cleanup effectiveness of a vapor extraction system (VES) in extracting chlorinated hydrocarbons and petroleum-based hydrocarbons (mineral spirits) from the unsaturated zone. The statistical analysis of soil concentration data to evaluate the VES remediation success is described. The site is a former electronics refurbishing facility in southern California; soil contamination from organic solvents was found mainly in five areas (Area A through E) beneath two buildings. The evaluation begins with a brief description of the site background, discusses the statistical approach, and presents conclusions

  4. Statistical characterization of the standard map

    Science.gov (United States)

    Ruiz, Guiomar; Tirnakli, Ugur; Borges, Ernesto P.; Tsallis, Constantino

    2017-06-01

    The standard map, a paradigmatic conservative system in the (x, p) phase space, has recently been shown (Tirnakli and Borges (2016 Sci. Rep. 6 23644)) to exhibit interesting statistical behaviors directly related to the value of the standard map external parameter K. A comprehensive statistical numerical description is achieved in the present paper. More precisely, for large values of K (e.g. K = 10), where the Lyapunov exponents are neatly positive over virtually the entire phase space, consistently with Boltzmann-Gibbs (BG) statistics, we verify that the q-generalized indices related to the entropy production (q_ent), the sensitivity to initial conditions (q_sen), the distribution of a time-averaged (over successive iterations) phase-space coordinate (q_stat), and the relaxation to the equilibrium final state (q_rel) collapse onto a fixed point, i.e. q_ent = q_sen = q_stat = q_rel = 1. In remarkable contrast, for small values of K (e.g. K = 0.2), where the Lyapunov exponents are virtually zero over the entire phase space, we verify q_ent = q_sen = 0, q_stat ≃ 1.935, and q_rel ≃ 1.4. The situation corresponding to intermediate values of K, where both stable orbits and a chaotic sea are present, is discussed as well. The present results transparently illustrate when BG behavior and/or q-statistical behavior are observed.
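
    For readers unfamiliar with the system, the standard map referenced here is the area-preserving map p(n+1) = p(n) + K sin(x(n)), x(n+1) = x(n) + p(n+1), taken modulo 2π. The short Python sketch below is illustrative only and does not reproduce the paper's q-index analysis; it simply iterates the map and estimates the largest Lyapunov exponent from the divergence of two nearby orbits, showing the qualitative contrast between K = 10 and K = 0.2.

    ```python
    # Illustrative sketch: largest Lyapunov exponent of the standard map via
    # two-orbit divergence with per-step renormalization. Not the paper's code.
    import numpy as np

    def standard_map(x, p, K):
        p = (p + K * np.sin(x)) % (2 * np.pi)
        x = (x + p) % (2 * np.pi)
        return x, p

    def lyapunov(K, n_iter=20000, d0=1e-8, seed=1):
        rng = np.random.default_rng(seed)
        x, p = rng.uniform(0, 2 * np.pi, 2)
        x2, p2 = x + d0, p
        s = 0.0
        for _ in range(n_iter):
            x, p = standard_map(x, p, K)
            x2, p2 = standard_map(x2, p2, K)
            # wrapped differences so the torus topology does not inflate the distance
            dx = (x2 - x + np.pi) % (2 * np.pi) - np.pi
            dp = (p2 - p + np.pi) % (2 * np.pi) - np.pi
            d = np.hypot(dx, dp)
            s += np.log(d / d0)
            # renormalize the separation back to d0 along the current direction
            x2, p2 = x + dx * d0 / d, p + dp * d0 / d
        return s / n_iter

    for K in (10.0, 0.2):
        print(f"K = {K}: estimated largest Lyapunov exponent ~ {lyapunov(K):.3f}")
    ```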

  5. Statistical Data Editing in Scientific Articles.

    Science.gov (United States)

    Habibzadeh, Farrokh

    2017-07-01

    Scientific journals are important scholarly forums for sharing research findings. Editors have important roles in safeguarding standards of scientific publication and should be familiar with correct presentation of results, among other core competencies. Editors do not have access to the raw data and should thus rely on clues in the submitted manuscripts. To identify probable errors, they should look for inconsistencies in presented results. Common statistical problems that can be picked up by a knowledgeable manuscript editor are discussed in this article. Manuscripts should contain a detailed section on statistical analyses of the data. Numbers should be reported with appropriate precision. Standard error of the mean (SEM) should not be reported as an index of data dispersion. Mean (standard deviation [SD]) and median (interquartile range [IQR]) should be used for the description of normally and non-normally distributed data, respectively. If possible, it is better to report 95% confidence intervals (CI) for statistics, at least for the main outcome variables. P values should be presented, and interpreted with caution, if there is a hypothesis. To advance the knowledge and skills of their members, associations of journal editors would do well to develop training courses on basic statistics and research methodology for non-experts. This would in turn improve research reporting and safeguard the body of scientific evidence. © 2017 The Korean Academy of Medical Sciences.
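
    The reporting advice in this abstract (mean with SD for roughly normal data, median with IQR otherwise, and SEM not used as a dispersion index) can be made concrete with a small sketch. The Python example below is illustrative only: it uses a simulated skewed variable, and the Shapiro–Wilk 0.05 cut-off is just one common convention, not a rule from the article.

    ```python
    # Illustrative choice of descriptive statistics based on a normality check.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    values = rng.lognormal(mean=3.0, sigma=0.6, size=80)   # simulated skewed data

    w, p = stats.shapiro(values)
    if p > 0.05:
        # roughly compatible with normality: report mean (SD)
        print(f"mean (SD): {values.mean():.1f} ({values.std(ddof=1):.1f})")
    else:
        # clearly non-normal: report median (IQR)
        q1, med, q3 = np.percentile(values, [25, 50, 75])
        print(f"median (IQR): {med:.1f} ({q1:.1f}-{q3:.1f})")

    # SEM describes the precision of the mean, not the spread of the data.
    print(f"SEM (precision of the mean, not dispersion): {stats.sem(values):.2f}")
    ```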

  6. Statistical mechanics of lattice systems a concrete mathematical introduction

    CERN Document Server

    Friedli, Sacha

    2017-01-01

    This motivating textbook gives a friendly, rigorous introduction to fundamental concepts in equilibrium statistical mechanics, covering a selection of specific models, including the Curie–Weiss and Ising models, the Gaussian free field, O(n) models, and models with Kac interactions. Using classical concepts such as Gibbs measures, pressure, free energy, and entropy, the book exposes the main features of the classical description of large systems in equilibrium, in particular the central problem of phase transitions. It treats such important topics as the Peierls argument, the Dobrushin uniqueness theorem, and the Mermin–Wagner and Lee–Yang theorems, and develops from scratch such workhorses as correlation inequalities, the cluster expansion, Pirogov–Sinai theory, and reflection positivity. Written as a self-contained course for advanced undergraduate or beginning graduate students, the detailed explanations, large collection of exercises (with solutions), and appendix of mathematical results and concepts also make i...

  7. Probabilistic assessment of fatigue life including statistical uncertainties in the S-N curve

    International Nuclear Information System (INIS)

    Sudret, B.; Hornet, P.; Stephan, J.-M.; Guede, Z.; Lemaire, M.

    2003-01-01

    A probabilistic framework is set up to assess the fatigue life of components of nuclear power plants. It intends to incorporate all kinds of uncertainties such as those appearing in the specimen fatigue life, design sub-factor, mechanical model and applied loading. This paper details the first step, which corresponds to the statistical treatment of the fatigue specimen test data. The specimen fatigue life at stress amplitude S is represented by a lognormal random variable whose mean and standard deviation depend on S. This characterization is then used to compute the random fatigue life of a component submitted to a single kind of cycles. Precisely the mean and coefficient of variation of this quantity are studied, as well as the reliability associated with the (deterministic) design value. (author)
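
    As a rough numerical illustration of the kind of probabilistic treatment described in this record (not the authors' actual model or data), the sketch below draws specimen fatigue lives from a lognormal distribution whose parameters depend on the stress amplitude S, then reports the mean, the coefficient of variation, and the probability of falling below a hypothetical deterministic design value. The S-N relation, scatter, and design margin are made-up placeholders.

    ```python
    # Illustrative Monte Carlo sketch of a lognormal fatigue-life model N(S).
    # The S-N relation and design margin below are hypothetical placeholders.
    import numpy as np

    rng = np.random.default_rng(7)

    def fatigue_life_samples(S, n=100_000):
        # hypothetical S-N relation: median life decreases as a power law of S
        mu_log = np.log(1e12 * S**-3.0)   # mean of log(N) at stress amplitude S
        sigma_log = 0.4                   # scatter of log(N), assumed constant
        return rng.lognormal(mu_log, sigma_log, size=n)

    S = 200.0                                  # stress amplitude (MPa), illustrative
    N = fatigue_life_samples(S)
    design_life = np.exp(np.log(1e12 * S**-3.0) - 2 * 0.4)   # e.g. median reduced by 2 sigma

    mean_life = N.mean()
    cov = N.std(ddof=1) / mean_life
    p_fail = np.mean(N < design_life)          # complement of reliability at the design value

    print(f"mean life {mean_life:.3e} cycles, CoV {cov:.2f}, P(N < design) {p_fail:.4f}")
    ```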

  8. Thiele. Pioneer in statistics

    DEFF Research Database (Denmark)

    Lauritzen, Steffen Lilholt

    This book studies the brilliant Danish 19th Century astronomer, T.N. Thiele who made important contributions to statistics, actuarial science, astronomy and mathematics. The most important of these contributions in statistics are translated into English for the first time, and the text includes...

  9. The spin-statistics connection in quantum gravity

    International Nuclear Information System (INIS)

    Balachandran, A.P.; Batista, E.; Costa e Silva, I.P.; Teotonio-Sobrinho, P.

    2000-01-01

    It is well known that in spite of sharing some properties with conventional particles, topological geons in general violate the spin-statistics theorem. On the other hand, it is generally believed that in quantum gravity theories allowing for topology change, using pair creation and annihilation of geons, one should be able to recover this theorem. In this paper, we take an alternative route, and use an algebraic formalism developed in previous work. We give a description of topological geons where an algebra of 'observables' is identified and quantized. Different irreducible representations of this algebra correspond to different kinds of geons, and are labeled by a non-abelian 'charge' and 'magnetic flux'. We then find that the usual spin-statistics theorem is indeed violated, but a new spin-statistics relation arises, when we assume that the fluxes are superselected. This assumption can be proved if all observables are local, as is generally the case in physical theories. Finally, we also discuss how our approach fits into conventional formulations of quantum gravity

  10. Distinguishing Features and Similarities Between Descriptive Phenomenological and Qualitative Description Research.

    Science.gov (United States)

    Willis, Danny G; Sullivan-Bolyai, Susan; Knafl, Kathleen; Cohen, Marlene Z

    2016-09-01

    Scholars who research phenomena of concern to the discipline of nursing are challenged with making wise choices about different qualitative research approaches. Ultimately, they want to choose an approach that is best suited to answer their research questions. Such choices are predicated on having made distinctions between qualitative methodology, methods, and analytic frames. In this article, we distinguish two qualitative research approaches widely used for descriptive studies: descriptive phenomenological and qualitative description. Providing a clear basis that highlights the distinguishing features and similarities between descriptive phenomenological and qualitative description research will help students and researchers make more informed choices in deciding upon the most appropriate methodology in qualitative research. We orient the reader to distinguishing features and similarities associated with each approach and the kinds of research questions descriptive phenomenological and qualitative description research address. © The Author(s) 2016.

  11. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

    For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees

  12. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

    For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees on energy products

  13. The system for statistical analysis of logistic information

    Directory of Open Access Journals (Sweden)

    Khayrullin Rustam Zinnatullovich

    2015-05-01

    Full Text Available The current problem for managers in logistics and trading companies is the task of improving the operational business performance and developing the logistics support of sales. The development of logistics support of sales presupposes the development and implementation of a set of works for the development of the existing warehouse facilities, including both a detailed description of the work performed and the timing of its implementation. Logistics engineering of a warehouse complex includes such tasks as: determining the number and types of technological zones, calculation of the required number of loading-unloading places, development of storage structures, development of pre-sales preparation zones, development of specifications of storage types, selection of loading-unloading equipment, detailed planning of the warehouse logistics system, creation of architectural-planning decisions, selection of information-processing equipment, etc. The currently used ERP and WMS systems do not allow solving the full list of logistics engineering problems. In this regard, the development of specialized software products taking into account the specifics of warehouse logistics, and the subsequent integration of this software with ERP and WMS systems, seems to be a current task. In this paper we suggest a system of statistical analysis of logistics information, designed to meet the challenges of logistics engineering and planning. The system is based on the methods of statistical data processing. The proposed specialized software is designed to improve the efficiency of the operating business and the development of logistics support of sales. The system is based on the methods of statistical data processing, the methods of assessment and prediction of logistics performance, the methods for the determination and calculation of the data required for registration, storage and processing of metal products, as well as the methods for planning the reconstruction and development

  14. A retrospective, descriptive study of shoulder outcomes in outpatient physical therapy.

    Science.gov (United States)

    Millar, A Lynn; Lasheway, Philip A; Eaton, Wendy; Christensen, Frances

    2006-06-01

    A retrospective, descriptive study of clients with shoulder dysfunction referred to physical therapy. To (1) describe the clinical and functional outcomes of clients with shoulder dysfunction following outpatient physical therapy, and (2) compare the outcomes by type of shoulder dysfunction. Although individuals with shoulder dysfunction are commonly referred to physical therapy, few large descriptive studies regarding outcomes following physical therapy are available. Data for 878 clients (468 female, 410 male) were retrieved and analyzed. This database was developed between 1997 and 2000 and included 4 outpatient facilities from 1 healthcare system in the southwest corner of Michigan. Clients were classified by type of shoulder dysfunction, and standardized tests were performed upon admittance and discharge to physical therapy. Descriptive and inferential statistics were calculated for all data. Of all clients, 55.1% had shoulder impingement, while 18.3% had postoperative repair, 8.9% had a frozen shoulder, 7.6% had a rotator cuff tear, 3.0% had shoulder instability, 2.1% were post fracture, and the remaining 4.9% had miscellaneous diagnoses. The average (±SD) age of the patients was 53.6 ± 16.4 years, with an average (±SD) number of treatment sessions of 13.7 ± 11.0. All groups showed significant changes following physical therapy intervention. Clients with diverse types of shoulder dysfunction demonstrated improvement in both clinical and functional measures at the conclusion of physical therapy, although it is not possible to determine whether these changes were due to the interventions or due to time. The type of shoulder dysfunction appears to affect the prognosis, thus expected outcomes should be based upon initial diagnosis and specific measures.
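
    The "descriptive and inferential statistics" calculated on admission and discharge measures in studies like this one typically amount to summarising each time point and comparing paired scores. The Python sketch below is a generic, hedged illustration using simulated scores, not the study's data or its exact analysis.

    ```python
    # Illustrative paired pre/post analysis of a functional outcome score.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    admit = rng.normal(45, 12, size=60)                  # simulated admission scores
    discharge = admit + rng.normal(20, 10, size=60)      # simulated improvement

    print(f"admission:  mean {admit.mean():.1f} (SD {admit.std(ddof=1):.1f})")
    print(f"discharge:  mean {discharge.mean():.1f} (SD {discharge.std(ddof=1):.1f})")

    t, p = stats.ttest_rel(discharge, admit)             # paired t-test on the change
    diff = discharge - admit
    d = diff.mean() / diff.std(ddof=1)                   # within-subject effect size
    print(f"paired t = {t:.2f}, p = {p:.2g}, effect size d = {d:.2f}")
    ```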

  15. Statistical methods for astronomical data analysis

    CERN Document Server

    Chattopadhyay, Asis Kumar

    2014-01-01

    This book introduces “Astrostatistics” as a subject in its own right with rewarding examples, including work by the authors with galaxy and Gamma Ray Burst data, to engage the reader. This includes a comprehensive blending of Astrophysics and Statistics. The first chapter’s coverage of preliminary concepts and terminologies for astronomical phenomena will appeal to both Statistics and Astrophysics readers as helpful context. Statistics concepts covered in the book provide a methodological framework. A unique feature is the inclusion of different possible sources of astronomical data, as well as software packages for converting the raw data into appropriate forms for data analysis. Readers can then use the appropriate statistical packages for their particular data analysis needs. The ideas of statistical inference discussed in the book help readers determine how to apply statistical tests. The authors cover different applications of statistical techniques already developed or specifically introduced for ...

  16. An Analysis of Research Methods and Statistical Techniques Used by Doctoral Dissertation at the Education Sciences in Turkey

    Science.gov (United States)

    Karadag, Engin

    2010-01-01

    To assess the research methods and statistical analysis techniques employed by educational researchers, this study surveyed unpublished doctoral dissertations from 2003 to 2007. Frequently used research methods consisted of experimental research; a survey; a correlational study; and a case study. Descriptive statistics, t-test, ANOVA, factor…

  17. World offshore energy loss statistics

    International Nuclear Information System (INIS)

    Kaiser, Mark J.

    2007-01-01

    Offshore operations present a unique set of environmental conditions and adverse exposures not observed in a land environment: work takes place in a confined space, in a hostile environment, under the constant danger of catastrophe and loss. It is possible to engineer some risks to a very low threshold of probability, but losses and unforeseen events can never be entirely eliminated because of cost considerations, the human factor, and environmental uncertainty. Risk events occur infrequently but have the potential of generating large losses, as evidenced by the 2005 hurricane season in the Gulf of Mexico, which was the most destructive and costliest natural disaster in the history of offshore production. The purpose of this paper is to provide a statistical assessment of energy losses in offshore basins using the Willis Energy Loss database. A description of the loss categories and causes of property damage is provided, followed by a statistical assessment of damage and loss broken out by region, cause, and loss category for the time horizon 1970-2004. The impact of the 2004-2005 hurricane season in the Gulf of Mexico is summarized

  18. On the statistical-mechanical meaning of the Bousso bound

    International Nuclear Information System (INIS)

    Pesci, Alessandro

    2008-01-01

    The Bousso entropy bound, in its generalized form, is investigated for the case of perfect fluids at local thermodynamic equilibrium, and evidence is found that the bound is satisfied if and only if a certain local thermodynamic property holds, emerging when the attempt is made to apply the bound to thin layers of matter. This property consists of the existence of an ultimate lower limit l* to the thickness of the slices for which a statistical-mechanical description is viable, with l* depending on the thermodynamic variables which define the state of the system locally. This limiting scale, found to be in general much larger than the Planck scale (so that no Planck-scale physics need be invoked to justify it), appears unrelated to gravity, and this suggests that the generalized entropy bound is likely to be rooted in conventional flat-spacetime statistical mechanics, with the maximum admitted entropy being, however, actually determined also by gravity. Some examples of ideal fluids are considered in order to identify the mechanisms which can set a lower limit to the statistical-mechanical description, and these systems are found to respect the lower limiting scale l*. The photon gas, in particular, appears to saturate this limiting scale, and the consequence is drawn that for systems consisting of a single slice of a photon gas with thickness l*, the generalized Bousso bound is saturated. It is argued that this seems to open the way to a peculiar understanding of black hole entropy: if an entropy can meaningfully (i.e. with a second law) be assigned to a black hole, the value A/4 for it (where A is the area of the black hole) is required simply by (conventional) statistical mechanics coupled to general relativity

  19. Testing statistical hypotheses

    CERN Document Server

    Lehmann, E L

    2005-01-01

    The third edition of Testing Statistical Hypotheses updates and expands upon the classic graduate text, emphasizing optimality theory for hypothesis testing and confidence sets. The principal additions include a rigorous treatment of large sample optimality, together with the requisite tools. In addition, an introduction to the theory of resampling methods such as the bootstrap is developed. The sections on multiple testing and goodness of fit testing are expanded. The text is suitable for Ph.D. students in statistics and includes over 300 new problems out of a total of more than 760. E.L. Lehmann is Professor of Statistics Emeritus at the University of California, Berkeley. He is a member of the National Academy of Sciences and the American Academy of Arts and Sciences, and the recipient of honorary degrees from the University of Leiden, The Netherlands and the University of Chicago. He is the author of Elements of Large-Sample Theory and (with George Casella) he is also the author of Theory of Point Estimat...

  20. Statistics for lawyers

    CERN Document Server

    Finkelstein, Michael O

    2015-01-01

    This classic text, first published in 1990, is designed to introduce law students, law teachers, practitioners, and judges to the basic ideas of mathematical probability and statistics as they have been applied in the law. The third edition includes over twenty new sections, including the addition of timely topics, like New York City police stops, exonerations in death-sentence cases, projecting airline costs, and new material on various statistical techniques such as the randomized response survey technique, rare-events meta-analysis, competing risks, and negative binomial regression. The book consists of sections of exposition followed by real-world cases and case studies in which statistical data have played a role. The reader is asked to apply the theory to the facts, to calculate results (a hand calculator is sufficient), and to explore legal issues raised by quantitative findings. The authors' calculations and comments are given in the back of the book. As with previous editions, the cases and case stu...

  1. GALEX-SDSS CATALOGS FOR STATISTICAL STUDIES

    International Nuclear Information System (INIS)

    Budavari, Tamas; Heinis, Sebastien; Szalay, Alexander S.; Nieto-Santisteban, Maria; Bianchi, Luciana; Gupchup, Jayant; Shiao, Bernie; Smith, Myron; Chang Ruixiang; Kauffmann, Guinevere; Morrissey, Patrick; Wyder, Ted K.; Martin, D. Christopher; Barlow, Tom A.; Forster, Karl; Friedman, Peter G.; Schiminovich, David; Milliard, Bruno; Donas, Jose; Seibert, Mark

    2009-01-01

    We present a detailed study of the Galaxy Evolution Explorer's (GALEX) photometric catalogs with special focus on the statistical properties of the All-sky and Medium Imaging Surveys. We introduce the concept of primaries to resolve the issue of multiple detections and follow a geometric approach to define clean catalogs with well understood selection functions. We cross-identify the GALEX sources (GR2+3) with Sloan Digital Sky Survey (SDSS; DR6) observations, which indirectly provides an invaluable insight into the astrometric model of the UV sources and allows us to revise the band merging strategy. We derive the formal description of the GALEX footprints as well as their intersections with the SDSS coverage along with analytic calculations of their areal coverage. The crossmatch catalogs are made available for the public. We conclude by illustrating the implementation of typical selection criteria in SQL for catalog subsets geared toward statistical analyses, e.g., correlation and luminosity function studies.

  2. Progressive statistics for studies in sports medicine and exercise science.

    Science.gov (United States)

    Hopkins, William G; Marshall, Stephen W; Batterham, Alan M; Hanin, Juri

    2009-01-01

    Statistical guidelines and expert statements are now available to assist in the analysis and reporting of studies in some biomedical disciplines. We present here a more progressive resource for sample-based studies, meta-analyses, and case studies in sports medicine and exercise science. We offer forthright advice on the following controversial or novel issues: using precision of estimation for inferences about population effects in preference to null-hypothesis testing, which is inadequate for assessing clinical or practical importance; justifying sample size via acceptable precision or confidence for clinical decisions rather than via adequate power for statistical significance; showing SD rather than SEM, to better communicate the magnitude of differences in means and nonuniformity of error; avoiding purely nonparametric analyses, which cannot provide inferences about magnitude and are unnecessary; using regression statistics in validity studies, in preference to the impractical and biased limits of agreement; making greater use of qualitative methods to enrich sample-based quantitative projects; and seeking ethics approval for public access to the depersonalized raw data of a study, to address the need for more scrutiny of research and better meta-analyses. Advice on less contentious issues includes the following: using covariates in linear models to adjust for confounders, to account for individual differences, and to identify potential mechanisms of an effect; using log transformation to deal with nonuniformity of effects and error; identifying and deleting outliers; presenting descriptive, effect, and inferential statistics in appropriate formats; and contending with bias arising from problems with sampling, assignment, blinding, measurement error, and researchers' prejudices. This article should advance the field by stimulating debate, promoting innovative approaches, and serving as a useful checklist for authors, reviewers, and editors.
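
    Two of the recommendations in this abstract, showing SD rather than SEM and using precision of estimation (confidence intervals) for inference, can be illustrated with a short sketch. The Python example below uses simulated sprint-time changes; the numbers and the 90% confidence level are illustrative assumptions, not values from the article.

    ```python
    # Illustrative contrast between SD, SEM and a confidence interval for a mean change.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(11)
    change = rng.normal(-0.08, 0.15, size=20)   # simulated change in sprint time (s)

    n = change.size
    mean = change.mean()
    sd = change.std(ddof=1)                     # spread of individual responses
    sem = sd / np.sqrt(n)                       # precision of the mean, not spread

    # 90% confidence interval for the mean change (t distribution)
    half_width = stats.t.ppf(0.95, df=n - 1) * sem
    print(f"mean change {mean:.3f} s, SD {sd:.3f} s (individual variability)")
    print(f"90% CI for the mean change: {mean - half_width:.3f} to {mean + half_width:.3f} s")
    ```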

  3. Statistical bootstrap approach to hadronic matter and multiparticle reactions

    International Nuclear Information System (INIS)

    Ilgenfritz, E.M.; Kripfganz, J.; Moehring, H.J.

    1977-01-01

    The authors present the main ideas behind the statistical bootstrap model and recent developments within this model related to the description of fireball cascade decay. Mathematical methods developed in this model might be useful in other phenomenological schemes of strong interaction physics; they are described in detail. The present status of applications of the model to various hadronic reactions is discussed. When discussing the relations of the statistical bootstrap model to other models of hadron physics, the authors point out possibly fruitful analogies and dynamical mechanisms which are modelled by the bootstrap dynamics under definite conditions. This offers interpretations for the critical temperature typical for the model and indicates further fields of application. (author)

  4. Statistical yearbook 2001. Data available as of 15 December 2003. 48 ed

    International Nuclear Information System (INIS)

    2004-01-01

    This is the forty-eighth issue of the United Nations Statistical Yearbook, prepared by the Statistics Division, Department of Economic and Social Affairs of the United Nations Secretariat. It contains series covering, in general, 1990-1999 or 1991-2000, based on statistics available to the Statistics Division up to 15 December 2003. The major purpose of the Statistical Yearbook is to provide in a single volume a comprehensive compilation of internationally available statistics on social and economic conditions and activities, at world, regional and national levels, covering roughly a ten-year period. Most of the statistics presented in the Yearbook are extracted from more detailed, specialized publications prepared by the Statistics Division and by many other international statistical services. Thus, while the specialized publications concentrate on monitoring topics and trends in particular social and economic fields, the Statistical Yearbook tables provide data for a more comprehensive, overall description of social and economic structures, conditions, changes and activities. The objective has been to collect, systematize and coordinate the most essential components of comparable statistical information which can give a broad and, to the extent feasible, a consistent picture of social and economic processes at world, regional and national levels. More specifically, the Statistical Yearbook provides systematic information on a wide range of social and economic issues which are of concern in the United Nations system and among the governments and peoples of the world. A particular value of the Yearbook, but also its greatest challenge, is that these issues are extensively interrelated. Meaningful analysis of these issues requires systematization and coordination of the data across many fields. These issues include: General economic growth and related economic conditions; economic situation in developing countries and progress towards the objectives adopted for the

  5. METHODOLOGICAL PRINCIPLES AND METHODS OF TERMS OF TRADE STATISTICAL EVALUATION

    Directory of Open Access Journals (Sweden)

    N. Kovtun

    2014-09-01

    Full Text Available The paper studies the methodological principles and guidance for the statistical evaluation of terms of trade under the United Nations classification model – the Harmonized Commodity Description and Coding System (HS). The practical implementation of the proposed three-stage model of index analysis and estimation of terms of trade is carried out for Ukraine's commodity-members for the period 2011-2012.
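
    The central quantity behind any such evaluation is the (net barter) terms-of-trade index: the ratio of an export price index to an import price index. The Python sketch below computes Laspeyres-type price indices over a toy commodity breakdown and their ratio; all numbers and commodity groupings are made up, and this is not the three-stage model of the paper.

    ```python
    # Illustrative net barter terms-of-trade calculation from toy commodity data.
    import numpy as np

    # base-period (p0, q0) and current-period (p1) data for hypothetical commodity groups
    export_p0 = np.array([100.0, 250.0, 40.0])
    export_p1 = np.array([110.0, 240.0, 46.0])
    export_q0 = np.array([5.0, 2.0, 20.0])
    import_p0 = np.array([80.0, 300.0, 15.0])
    import_p1 = np.array([92.0, 330.0, 15.5])
    import_q0 = np.array([6.0, 1.5, 40.0])

    def laspeyres(p0, p1, q0):
        # Laspeyres price index: current prices weighted by base-period quantities
        return 100.0 * np.sum(p1 * q0) / np.sum(p0 * q0)

    export_index = laspeyres(export_p0, export_p1, export_q0)
    import_index = laspeyres(import_p0, import_p1, import_q0)
    terms_of_trade = 100.0 * export_index / import_index   # >100 means improving terms

    print(f"export price index {export_index:.1f}, import price index {import_index:.1f}")
    print(f"net barter terms of trade: {terms_of_trade:.1f}")
    ```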

  6. Statistical characteristics of trajectories of diamagnetic unicellular organisms in a magnetic field.

    Science.gov (United States)

    Gorobets, Yu I; Gorobets, O Yu

    2015-01-01

    A statistical model is proposed in this paper for the description of the orientation of trajectories of unicellular diamagnetic organisms in a magnetic field. A statistical parameter, the effective energy, is calculated on the basis of this model. The resulting effective energy is a statistical characteristic of the trajectories of diamagnetic microorganisms in a magnetic field connected with their metabolism. The statistical model is applicable to the case when the energy of the thermal motion of the bacteria is negligible in comparison with their energy in a magnetic field and the bacteria manifest significant "active random movement", i.e. there is randomizing motion of the bacteria of a non-thermal nature, for example, movement of the bacteria by means of flagella. The energy of the randomizing active self-motion of the bacteria is characterized by a new statistical parameter for biological objects. This parameter replaces the energy of the randomizing thermal motion in the calculation of the statistical distribution. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Mineral industry statistics 1975

    Energy Technology Data Exchange (ETDEWEB)

    1978-01-01

    Production, consumption and marketing statistics are given for solid fuels (coal, peat), liquid fuels and gases (oil, natural gas), iron ore, bauxite and other minerals quarried in France, in 1975. Also accident statistics are included. Production statistics are presented of the Overseas Departments and territories (French Guiana, New Caledonia, New Hebrides). An account of modifications in the mining field in 1975 is given. Concessions, exploitation permits, and permits solely for prospecting for mineral products are discussed. (In French)

  8. Common pitfalls in statistical analysis: "P" values, statistical significance and confidence intervals

    Directory of Open Access Journals (Sweden)

    Priya Ranganathan

    2015-01-01

    Full Text Available In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the 'P' value, explain the importance of 'confidence intervals' and clarify the importance of including both values in a paper.
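
    The point of the article, that a confidence interval should be reported alongside the P value because the interval conveys the size and precision of the effect, can be shown in a few lines. The Python sketch below uses simulated two-group data and a simple pooled degrees-of-freedom approximation; it is only an illustration of the reporting style, not an analysis from the article.

    ```python
    # Illustrative report of both a P value and a 95% CI for a difference in means.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    group_a = rng.normal(120, 15, size=40)   # simulated outcome, treatment group
    group_b = rng.normal(127, 15, size=40)   # simulated outcome, control group

    diff = group_a.mean() - group_b.mean()
    se = np.sqrt(group_a.var(ddof=1) / group_a.size + group_b.var(ddof=1) / group_b.size)
    df = group_a.size + group_b.size - 2                 # simple pooled-df approximation
    ci = (diff - stats.t.ppf(0.975, df) * se, diff + stats.t.ppf(0.975, df) * se)

    t, p = stats.ttest_ind(group_a, group_b)
    print(f"difference in means {diff:.1f}, 95% CI {ci[0]:.1f} to {ci[1]:.1f}, P = {p:.3f}")
    ```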

  9. DATA ON YOUTH, 1967, A STATISTICAL DOCUMENT.

    Science.gov (United States)

    Scheider, George

    The data in this report are statistics on youth throughout the United States and in New York State. Included are data on population, school statistics, employment, family income, juvenile delinquency and youth crime (including New York City figures), and traffic accidents. The statistics are presented in the text and in tables and charts. (NH)

  10. Proceedings of the Pacific Rim Statistical Conference for Production Engineering : Big Data, Production Engineering and Statistics

    CERN Document Server

    Jang, Daeheung; Lai, Tze; Lee, Youngjo; Lu, Ying; Ni, Jun; Qian, Peter; Qiu, Peihua; Tiao, George

    2018-01-01

    This book presents the proceedings of the 2nd Pacific Rim Statistical Conference for Production Engineering: Production Engineering, Big Data and Statistics, which took place at Seoul National University in Seoul, Korea in December, 2016. The papers included discuss a wide range of statistical challenges, methods and applications for big data in production engineering, and introduce recent advances in relevant statistical methods.

  11. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...

  12. Use of statistical procedures in Brazilian and international dental journals.

    Science.gov (United States)

    Ambrosano, Gláucia Maria Bovi; Reis, André Figueiredo; Giannini, Marcelo; Pereira, Antônio Carlos

    2004-01-01

    A descriptive survey was performed in order to assess the statistical content and quality of Brazilian and international dental journals, and compare their evolution throughout the last decades. The authors identified the reporting and accuracy of statistical techniques in 1000 papers published from 1970 to 2000 in seven dental journals: three Brazilian (Brazilian Dental Journal, Revista de Odontologia da Universidade de Sao Paulo and Revista de Odontologia da UNESP) and four international journals (Journal of the American Dental Association, Journal of Dental Research, Caries Research and Journal of Periodontology). Papers were divided into two time periods: from 1970 to 1989, and from 1990 to 2000. A slight increase in the number of articles that presented some form of statistical technique was noticed for Brazilian journals (from 61.0 to 66.7%), whereas for international journals, a significant increase was observed (65.8 to 92.6%). In addition, a decrease in the number of statistical errors was verified. The most commonly used statistical tests as well as the most frequent errors found in dental journals were assessed. Hopefully, this investigation will encourage dental educators to better plan the teaching of biostatistics, and to improve the statistical quality of submitted manuscripts.

  13. Jacobson generators, Fock representations and statistics of sl(n + 1)

    International Nuclear Information System (INIS)

    Palev, T.D.; Jeugt, J. van der

    2000-10-01

    The properties of A-statistics, related to the class of simple Lie algebras sl(n + 1), n ∈ Z_+ (Palev, T.D.: Preprint JINR E17-10550 (1977); hep-th/9705032), are further investigated. The description of each sl(n + 1) is carried out via generators and their relations (see eq. (2.5)), first introduced by Jacobson. The related Fock spaces W_p, p ∈ N, are finite-dimensional irreducible sl(n + 1)-modules. The Pauli principle of the underlying statistics is formulated. In addition the paper contains the following new results: (a) the A-statistics are interpreted as exclusion statistics; (b) within each W_p, operators B(p)_1^±, ..., B(p)_n^±, proportional to the Jacobson generators, are introduced. It is proved that in an appropriate topology (Definition 2) lim_{p→∞} B(p)_i^± = B_i^±, where B_i^± are Bose creation and annihilation operators; (c) it is shown that the local statistics of the degenerate hard-core Bose models and of the related Heisenberg spin models is p = 1 A-statistics. (author)

  14. The use and misuse of statistical methodologies in pharmacology research.

    Science.gov (United States)

    Marino, Michael J

    2014-01-01

    Descriptive, exploratory, and inferential statistics are necessary components of hypothesis-driven biomedical research. Despite the ubiquitous need for these tools, the emphasis on statistical methods in pharmacology has become dominated by inferential methods often chosen more by the availability of user-friendly software than by any understanding of the data set or the critical assumptions of the statistical tests. Such frank misuse of statistical methodology and the quest to reach the mystical α < 0.05 criterion reflect, at least in part, inadequate statistical training. Perhaps more critically, a poor understanding of statistical tools limits the conclusions that may be drawn from a study by divorcing the investigator from their own data. The net result is a decrease in quality and confidence in research findings, fueling recent controversies over the reproducibility of high profile findings and effects that appear to diminish over time. The recent development of "omics" approaches leading to the production of massive higher dimensional data sets has amplified these issues, making it clear that new approaches are needed to appropriately and effectively mine this type of data. Unfortunately, statistical education in the field has not kept pace. This commentary provides a foundation for an intuitive understanding of statistics that fosters an exploratory approach and an appreciation for the assumptions of various statistical tests that hopefully will increase the correct use of statistics, the application of exploratory data analysis, and the use of statistical study design, with the goal of increasing reproducibility and confidence in the literature. Copyright © 2013. Published by Elsevier Inc.

  15. Statistics for non-statisticians

    CERN Document Server

    Madsen, Birger Stjernholm

    2016-01-01

    This book was written for those who need to know how to collect, analyze and present data. It is meant to be a first course for practitioners, a book for private study or brush-up on statistics, and supplementary reading for general statistics classes. The book is untraditional, both with respect to the choice of topics and the presentation: topics were determined by what is most useful for practical statistical work, and the presentation is as non-mathematical as possible. The book contains many examples using statistical functions in spreadsheets. In this second edition, new topics have been included, e.g. within the area of statistical quality control, in order to make the book even more useful for practitioners working in industry.

  16. The Concise Encyclopedia of Statistics

    CERN Document Server

    Dodge, Yadolah

    2008-01-01

    The Concise Encyclopedia of Statistics presents the essential information about statistical tests, concepts, and analytical methods in language that is accessible to practitioners and students of the vast community using statistics in medicine, engineering, physical science, life science, social science, and business/economics. The reference is alphabetically arranged to provide quick access to the fundamental tools of statistical methodology and biographies of famous statisticians. The more than 500 entries include definitions, history, mathematical details, limitations, examples, references,

  17. Evaluation of Solid Rocket Motor Component Data Using a Commercially Available Statistical Software Package

    Science.gov (United States)

    Stefanski, Philip L.

    2015-01-01

    Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draw conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results not only derive from the inherent power of the software package, but also from the skill and understanding of the data analyst.

  18. Applied statistics for economists

    CERN Document Server

    Lewis, Margaret

    2012-01-01

    This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.

  19. Justification of the density functional method in classical and quantum statistical mechanics

    International Nuclear Information System (INIS)

    Dinariev, O.Yu.

    2000-01-01

    The relation between the phenomenological description of a multi-component mixture based on an entropy functional with terms quadratic in the component density and temperature gradients, on the one hand, and the description in the framework of classical and quantum statistical mechanics, on the other hand, was investigated. Explicit expressions for the entropy functional in the classical and quantum theory were derived. Then a quadratic approximation for the case of small disturbances of a uniform state was calculated. In this approximation the terms quadratic in the gradients were singled out. This permits calculation of the relevant phenomenological coefficients from first principles.

  20. Aspects of statistical consulting not taught by academia

    DEFF Research Database (Denmark)

    Kenett, R.; Thyregod, Poul

    2006-01-01

    Education in statistics is preparing for statistical analysis but not necessarily for statistical consulting. The objective of this paper is to explore the phases that precede and follow statistical analysis. Specifically these include: problem elicitation, data collection and, following statistical data analysis, formulation of findings, presentation of findings, and recommendations. Some insights derived from a literature review and real-life case studies are provided. Areas for joint research by statisticians and cognitive scientists are outlined.

  1. Statistical methodology for discrete fracture model - including fracture size, orientation uncertainty together with intensity uncertainty and variability

    Energy Technology Data Exchange (ETDEWEB)

    Darcel, C. (Itasca Consultants SAS (France)); Davy, P.; Le Goc, R.; Dreuzy, J.R. de; Bour, O. (Geosciences Rennes, UMR 6118 CNRS, Univ. def Rennes, Rennes (France))

    2009-11-15

    and fracturing properties main characteristics. From that starting point we built Statistical Fracture Domains whose significance relies exclusively on fracturing statistics, without explicitly including the current Fracture Domains or the closeness between one borehole section and another. Theoretical developments are proposed in order to incorporate the orientation uncertainty and the fracturing variability into a resulting uncertainty on the parent distribution density. When applied to both sites, it turns out that variability prevails over uncertainty, thus validating the good level of data accuracy. Moreover, this allows defining a possible range of variation around the mean values of the densities. Finally a sorting algorithm is developed for providing, from the initial elementary bricks mentioned above, a division of a site into Statistical Fracture Domains whose internal variability is reduced.

  2. Construction of the descriptive system for the Assessment of Quality of Life AQoL-6D utility instrument.

    Science.gov (United States)

    Richardson, Jeffrey R J; Peacock, Stuart J; Hawthorne, Graeme; Iezzi, Angelo; Elsworth, Gerald; Day, Neil A

    2012-04-17

    Multi attribute utility (MAU) instruments are used to include the health related quality of life (HRQoL) in economic evaluations of health programs. Comparative studies suggest different MAU instruments measure related but different constructs. The objective of this paper is to describe the methods employed to achieve content validity in the descriptive system of the Assessment of Quality of Life (AQoL)-6D, MAU instrument. The AQoL program introduced the use of psychometric methods in the construction of health related MAU instruments. To develop the AQoL-6D we selected 112 items from previous research, focus groups and expert judgment and administered them to 316 members of the public and 302 hospital patients. The search for content validity across a broad spectrum of health states required both formative and reflective modelling. We employed Exploratory Factor Analysis and Structural Equation Modelling (SEM) to meet these dual requirements. The resulting instrument employs 20 items in a multi-tier descriptive system. Latent dimension variables achieve sensitive descriptions of 6 dimensions which, in turn, combine to form a single latent QoL variable. Diagnostic statistics from the SEM analysis are exceptionally good and confirm the hypothesised structure of the model. The AQoL-6D descriptive system has good psychometric properties. They imply that the instrument has achieved construct validity and provides a sensitive description of HRQoL. This means that it may be used with confidence for measuring health related quality of life and that it is a suitable basis for modelling utilities for inclusion in the economic evaluation of health programs.

  3. Magnus forces and statistics in 2 + 1 dimensions

    International Nuclear Information System (INIS)

    Davis, R.L.

    1990-01-01

    Spinning vortex solutions to the abelian Higgs model, not Nielsen-Olesen solutions, are appropriate to a Ginzburg-Landau description of superconductivity. The main physical distinction is that spinning vortices experience the Magnus force while Nielsen-Olesen vortices do not. In 2 + 1 dimensional superconductivity without a Chern-Simons interaction, the effect of the Magnus force is equivalent to that of a background fictitious magnetic field. Moreover, the phase obtained on interchanging two quasi-particles is always path-dependent. When a Chern-Simons term is added there is an additional localized Magnus flux at the vortex. For point-like vortices, the Chern-Simons interaction can be seen as defining their intrinsic statistics, but in realistic cases of vortices with finite size in strong Magnus fields the quasi-particle statistics are not well-defined

  4. Practical statistics in pain research.

    Science.gov (United States)

    Kim, Tae Kyun

    2017-10-01

    Pain is subjective, while statistics related to pain research are objective. This review was written to help researchers involved in pain research make statistical decisions. The main issues are related to the levels of the scales that are often used in pain research, the choice between parametric and nonparametric statistical methods, and problems which arise from repeated measurements. In the field of pain research, parametric statistics used to be applied in an erroneous way. This is closely related to the scales of the data and to repeated measurements. The levels of scales include nominal, ordinal, interval, and ratio scales. The level of scale affects the choice between parametric and non-parametric methods. In the field of pain research, the most frequently used pain assessment scale is the ordinal scale, which would include the visual analogue scale (VAS). There used to be another view, however, which considered the VAS to be an interval or ratio scale, so that the use of parametric statistics would be accepted practically in some cases. Repeated measurements on the same subjects always complicate statistical analysis. This means that the measurements inevitably have correlations between each other, which precludes the application of one-way ANOVA, in which independence between the measurements is necessary. Repeated-measures ANOVA (RM ANOVA), however, permits the comparison between the correlated measurements as long as the condition of the sphericity assumption is satisfied. In conclusion, parametric statistical methods should be used only when the assumptions of parametric statistics, such as normality and sphericity, are established.
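
    The practical upshot of this discussion, treating VAS pain scores as ordinal unless an interval interpretation can be defended and respecting the correlation created by repeated measurements, can be illustrated briefly. The Python sketch below compares baseline and post-treatment VAS scores in the same patients with both a paired t-test and its nonparametric counterpart; the data are simulated and the example is ours, not the review's.

    ```python
    # Illustrative paired analysis of repeated VAS pain scores (simulated data).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(9)
    baseline = rng.integers(4, 10, size=30).astype(float)          # VAS 0-10 at baseline
    post = np.clip(baseline - rng.integers(0, 5, size=30), 0, 10)  # after treatment

    # parametric view (treats VAS as interval data)
    t, p_t = stats.ttest_rel(baseline, post)

    # nonparametric view (treats VAS as ordinal data)
    w, p_w = stats.wilcoxon(baseline, post)

    print(f"paired t-test: t = {t:.2f}, p = {p_t:.3g}")
    print(f"Wilcoxon signed-rank: W = {w:.1f}, p = {p_w:.3g}")
    ```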

  5. Severe postpartum haemorrhage after vaginal delivery: a statistical process control chart to report seven years of continuous quality improvement.

    Science.gov (United States)

    Dupont, Corinne; Occelli, Pauline; Deneux-Tharaux, Catherine; Touzet, Sandrine; Duclos, Antoine; Bouvier-Colle, Marie-Hélène; Rudigoz, René-Charles; Huissoud, Cyril

    2014-07-01

    To use statistical process control charts to describe trends in the prevalence of severe postpartum haemorrhage after vaginal delivery. This assessment was performed 7 years after we initiated a continuous quality improvement programme that began with regular criteria-based audits. Observational descriptive study, in a French maternity unit in the Rhône-Alpes region. Quarterly clinical audit meetings to analyse all cases of severe postpartum haemorrhage after vaginal delivery and provide feedback on quality of care with statistical process control tools. The primary outcomes were the prevalence of severe PPH after vaginal delivery and its quarterly monitoring with a control chart. The secondary outcomes included the global quality of care for women with severe postpartum haemorrhage, including the performance rate of each recommended procedure. Differences in these variables between 2005 and 2012 were tested. From 2005 to 2012, the prevalence of severe postpartum haemorrhage declined significantly, from 1.2% to 0.6% of vaginal deliveries; the quarterly prevalence never exceeded the control limits, that is, was never out of statistical control. The proportion of cases that were managed consistently with the guidelines increased for all of their main components. Implementation of continuous quality improvement efforts began seven years ago and used, among other tools, statistical process control charts. During this period, the prevalence of severe postpartum haemorrhage after vaginal delivery has been reduced by 50%. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
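
    A statistical process control chart of the kind used in this study is simple to construct: for each quarter the observed proportion of severe PPH is plotted against a centre line and 3-sigma control limits derived from the binomial model. The Python sketch below is a generic p-chart illustration with made-up quarterly counts, not the study's data.

    ```python
    # Illustrative p-chart for a quarterly proportion (e.g. severe PPH per vaginal delivery).
    import numpy as np

    # made-up quarterly data: number of cases and number of deliveries
    cases      = np.array([12, 10, 11,  9,  8,  9,  7,  6])
    deliveries = np.array([980, 1010, 995, 1005, 990, 1000, 1020, 985])

    p_hat = cases / deliveries                    # observed quarterly proportions
    p_bar = cases.sum() / deliveries.sum()        # centre line (overall proportion)

    # 3-sigma control limits vary with each quarter's denominator
    sigma = np.sqrt(p_bar * (1 - p_bar) / deliveries)
    ucl = p_bar + 3 * sigma
    lcl = np.clip(p_bar - 3 * sigma, 0, None)

    for q, (p, u, l) in enumerate(zip(p_hat, ucl, lcl), start=1):
        flag = "OUT OF CONTROL" if (p > u or p < l) else "in control"
        print(f"quarter {q}: p = {p:.4f}  (LCL {l:.4f}, UCL {u:.4f})  {flag}")
    ```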

  6. The PHMC algorithm for simulations of dynamical fermions; 1, description and properties

    CERN Document Server

    Frezzotti, R

    1999-01-01

    We give a detailed description of the so-called Polynomial Hybrid Monte Carlo (PHMC) algorithm. The effects of the correction factor, which is introduced to render the algorithm exact, are discussed, stressing their relevance for the statistical fluctuations and (almost) zero mode contributions to physical observables. We also investigate rounding-error effects and propose several ways to reduce memory requirements.

  7. Electromagnetic phenomena in matter statistical and quantum approaches

    CERN Document Server

    Toptygin, Igor N

    2015-01-01

    Modern electrodynamics in different media is a wide branch of electrodynamics which combines the exact theory of electromagnetic fields in the presence of electric charges and currents with statistical description of these fields in gases, plasmas, liquids and solids; dielectrics, conductors and superconductors. It is widely used in physics and in other natural sciences (such as astrophysics and geophysics, biophysics, ecology and evolution of terrestrial climate), and in various technological applications (radio electronics, technology of artificial materials, laser-based technological proces

  8. Topics in theoretical and applied statistics

    CERN Document Server

    Giommi, Andrea

    2016-01-01

    This book highlights the latest research findings from the 46th International Meeting of the Italian Statistical Society (SIS) in Rome, during which both methodological and applied statistical research was discussed. This selection of fully peer-reviewed papers, originally presented at the meeting, addresses a broad range of topics, including the theory of statistical inference; data mining and multivariate statistical analysis; survey methodologies; analysis of social, demographic and health data; and economic statistics and econometrics.

  9. Modern applied U-statistics

    CERN Document Server

    Kowalski, Jeanne

    2008-01-01

    A timely and applied approach to the newly discovered methods and applications of U-statistics. Built on years of collaborative research and academic experience, Modern Applied U-Statistics successfully presents a thorough introduction to the theory of U-statistics using in-depth examples and applications that address contemporary areas of study including biomedical and psychosocial research. Utilizing a "learn by example" approach, this book provides an accessible, yet in-depth, treatment of U-statistics, and addresses key concepts in asymptotic theory by integrating translational and cross-disciplinary research. The authors begin with an introduction to the essential theoretical foundations of U-statistics, such as the notion of convergence in probability and distribution, basic convergence results, stochastic O notation, inference theory, generalized estimating equations, and the definition and asymptotic properties of U-statistics. With an emphasis on nonparametric applications when and where applic...

  10. Mineral statistics yearbook 1994

    International Nuclear Information System (INIS)

    1994-01-01

    A summary of mineral production in Saskatchewan was compiled and presented as a reference manual. Statistical information on fuel minerals, such as crude oil, natural gas, liquefied petroleum gas and coal, and on industrial and metallic minerals, such as potash, sodium sulphate, salt and uranium, was provided in a wide variety of tables. Production statistics, disposition and value of sales of industrial and metallic minerals were also made available. Statistical data on drilling of oil and gas reservoirs and Crown land disposition were also included. figs., tabs

  11. Karyotype characterization of Mugil incilis Hancock, 1830 (Mugiliformes: Mugilidae), including a description of an unusual co-localization of major and minor ribosomal genes in the family

    Directory of Open Access Journals (Sweden)

    Anne Kathrin Hett

    This study reports the description of the karyotype of Mugil incilis from Venezuela. The chromosome complement is composed of 48 acrocentric chromosomes, which uniformly decrease in size. The homologues therefore cannot be clearly identified, with the exception of one of the largest chromosome pairs, classified as number 1, whose homologues may show a subcentromeric secondary constriction, and of chromosome pair number 24, which is considerably smaller than the others. C-banding showed heterochromatic blocks at the centromeric/pericentromeric regions of all chromosomes, which were more conspicuous on chromosome pair 1, since the C-positive signals include the secondary constrictions. AgNO3 staining and fluorescent in situ hybridization (FISH) with 45S rDNA demonstrated that the nucleolus organizer regions are indeed located on the secondary constrictions of chromosome pair number 1. FISH with 5S rDNA revealed that the minor ribosomal genes are located on this same chromosome pair, near the NORs, though the signals are closer to the centromeres and smaller in size than those of the major ribosomal gene clusters. This is the first description of co-localization of major and minor ribosomal genes in the family. Data are discussed from a cytotaxonomic and phylogenetic perspective.

  12. Advances in ultrasonic testing of austenitic stainless steel welds. Towards a 3D description of the material including attenuation and optimisation by inversion

    Science.gov (United States)

    Moysan, J.; Gueudré, C.; Ploix, M.-A.; Corneloup, G.; Guy, Ph.; Guerjouma, R. El; Chassignole, B.

    In the case of multi-pass welds, the material is very difficult to describe due to its anisotropic and heterogeneous properties. Anisotropy results from the metal solidification and is correlated with the grain orientation. A precise description of the material is one of the key points in obtaining reliable results with wave propagation codes. A first advance is the MINA model, which predicts the grain orientations in multi-pass 316-L steel welds. For flat-position welding, good predictions of the grain orientations were obtained using 2D modelling. In the case of positional welding, the resulting grain structure may be oriented in 3D. We indicate how the MINA model can be improved for a 3D description. A second advance is a good quantification of the attenuation. Precise measurements are obtained using the plane-wave angular spectrum method together with the computation of the transmission coefficients for triclinic material. With these first two advances, a third is now possible: developing an inverse method to obtain the material description through ultrasonic measurements at different positions.

  13. Evaluation of undergraduate nursing students' attitudes towards statistics courses, before and after a course in applied statistics.

    Science.gov (United States)

    Hagen, Brad; Awosoga, Olu; Kellett, Peter; Dei, Samuel Ofori

    2013-09-01

    Undergraduate nursing students must often take a course in statistics, yet there is scant research to inform teaching pedagogy. The objectives of this study were to assess nursing students' overall attitudes towards statistics courses - including (among other things) overall fear and anxiety, preferred learning and teaching styles, and the perceived utility and benefit of taking a statistics course - before and after taking a mandatory course in applied statistics. The authors used a pre-experimental research design (a one-group pre-test/post-test research design), by administering a survey to nursing students at the beginning and end of the course. The study was conducted at a University in Western Canada that offers an undergraduate Bachelor of Nursing degree. Participants included 104 nursing students, in the third year of a four-year nursing program, taking a course in statistics. Although students only reported moderate anxiety towards statistics, student anxiety about statistics had dropped by approximately 40% by the end of the course. Students also reported a considerable and positive change in their attitudes towards learning in groups by the end of the course, a potential reflection of the team-based learning that was used. Students identified preferred learning and teaching approaches, including the use of real-life examples, visual teaching aids, clear explanations, timely feedback, and a well-paced course. Students also identified preferred instructor characteristics, such as patience, approachability, in-depth knowledge of statistics, and a sense of humor. Unfortunately, students only indicated moderate agreement with the idea that statistics would be useful and relevant to their careers, even by the end of the course. Our findings validate anecdotal reports on statistics teaching pedagogy, although more research is clearly needed, particularly on how to increase students' perceptions of the benefit and utility of statistics courses for their nursing

  14. Quantum-statistical kinetic equations

    International Nuclear Information System (INIS)

    Loss, D.; Schoeller, H.

    1989-01-01

    Considering a homogeneous normal quantum fluid consisting of identical interacting fermions or bosons, the authors derive an exact quantum-statistical generalized kinetic equation with a collision operator given as explicit cluster series where exchange effects are included through renormalized Liouville operators. This new result is obtained by applying a recently developed superoperator formalism (Liouville operators, cluster expansions, symmetrized projectors, P_q-rule, etc.) to nonequilibrium systems described by a density operator ρ(t) which obeys the von Neumann equation. By means of this formalism a factorization theorem is proven (being essential for obtaining closed equations), and partial resummations (leading to renormalized quantities) are performed. As an illustrative application, the quantum-statistical versions (including exchange effects due to Fermi-Dirac or Bose-Einstein statistics) of the homogeneous Boltzmann (binary collisions) and Choh-Uhlenbeck (triple collisions) equations are derived

  15. Statistical optics

    Science.gov (United States)

    Goodman, J. W.

    This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.

  16. Do we need statistics when we have linguistics?

    Directory of Open Access Journals (Sweden)

    Cantos Gómez Pascual

    2002-01-01

    Statistics is known to be a quantitative approach to research. However, most of the research done in the fields of language and linguistics is of a different kind, namely qualitative. Succinctly, qualitative analysis differs from quantitative analysis in that in the former no attempt is made to assign frequencies, percentages and the like to the linguistic features found or identified in the data. In quantitative research, linguistic features are classified and counted, and even more complex statistical models are constructed in order to explain these observed facts. In qualitative research, however, we use the data only for identifying and describing features of language usage and for providing real occurrences/examples of particular phenomena. In this paper, we shall try to show how quantitative methods and statistical techniques can supplement qualitative analyses of language. We shall attempt to present some mathematical and statistical properties of natural languages, and introduce some of the quantitative methods which are of the most value in working empirically with texts and corpora, illustrating the various issues with numerous examples and moving from the most basic descriptive techniques (frequency counts and percentages) to decision-taking techniques (chi-square and z-score) and to more sophisticated statistical language models (type-token/lemma-token/lemma-type formulae, cluster analysis and discriminant function analysis).
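
    The sketch below illustrates, on toy data, three of the basic techniques mentioned in this abstract: frequency counts, the type-token ratio, and a chi-square test comparing the frequency of one word across two corpora. The corpora and the chosen word are invented for the example and are not taken from the paper.

```python
# Illustrative corpus statistics: frequency counts, type-token ratio, and a
# chi-square test on a 2x2 contingency table (toy corpora, not the paper's data).
from collections import Counter
from scipy.stats import chi2_contingency

corpus_a = "the cat sat on the mat and the dog sat on the rug".split()
corpus_b = "a dog and a cat ran across the yard chasing a ball".split()

freq_a, freq_b = Counter(corpus_a), Counter(corpus_b)
ttr_a = len(freq_a) / len(corpus_a)     # type-token ratio of corpus A
ttr_b = len(freq_b) / len(corpus_b)     # type-token ratio of corpus B

# 2x2 table: occurrences of "the" vs. all other tokens in each corpus
word = "the"
table = [[freq_a[word], len(corpus_a) - freq_a[word]],
         [freq_b[word], len(corpus_b) - freq_b[word]]]
chi2, p, dof, _ = chi2_contingency(table)
print(f"TTR A = {ttr_a:.2f}, TTR B = {ttr_b:.2f}, chi2 = {chi2:.2f}, p = {p:.3f}")
```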

  17. Evaluating and Reporting Statistical Power in Counseling Research

    Science.gov (United States)

    Balkin, Richard S.; Sheperis, Carl J.

    2011-01-01

    Despite recommendations from the "Publication Manual of the American Psychological Association" (6th ed.) to include information on statistical power when publishing quantitative results, authors seldom include analysis or discussion of statistical power. The rationale for discussing statistical power is addressed, approaches to using "G*Power" to…
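
    As a rough Python analogue of the kind of a priori power analysis discussed here (using statsmodels rather than G*Power, and with an illustrative effect size and alpha rather than values from the article), the required sample size per group for an independent-samples t-test can be computed as follows.

```python
# A priori power analysis for an independent-samples t-test (illustrative inputs).
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.80,
                                    alternative='two-sided')
print(f"Required sample size per group: {n_per_group:.1f}")
```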

  18. Quasi-homogenous approximation for description of the properties of dispersed systems. The basic approaches to model hardening processes in nanodispersed silica systems. Part 2. The hardening processes from the standpoint of statistical physics

    Directory of Open Access Journals (Sweden)

    KUDRYAVTSEV Pavel Gennadievich

    2015-04-01

    The paper deals with possibilities to use the quasi-homogenous approximation for the description of the properties of dispersed systems. The authors applied the statistical polymer method, based on the consideration of averaged structures of all possible macromolecules of the same weight. Equations which allow evaluation of many additive parameters of macromolecules, and of the systems containing them, were deduced. The statistical polymer method makes it possible to model branched, cross-linked macromolecules and the systems containing them, in equilibrium or non-equilibrium states. Fractal analysis of the statistical polymer allows modelling of different types of random fractals and other objects examined with the methods of fractal theory. The fractal polymer method can also be applied not only to polymers but also to composites, gels, associates in polar liquids and other packaged systems. There is also a description of the states of colloid solutions of silica from the point of view of statistical physics. This approach is based on the idea that a colloid solution of silicon dioxide - a silica sol - consists of an enormous number of interacting particles which are in constant motion. The paper is devoted to the research of an ideal system of colliding but not interacting particles of the sol. The behaviour of the silica sol was analysed according to the Maxwell-Boltzmann distribution and the free path length was calculated. Using these data, the number of particles which can overcome the potential barrier in a collision was calculated. To model the kinetics of the sol-gel transition, different approaches were studied.
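
    The following back-of-the-envelope sketch mirrors the two calculations mentioned above for an ideal gas of colliding particles: the mean free path and the Boltzmann fraction of collisions energetic enough to overcome a barrier. The particle diameter, number density, temperature and barrier height are illustrative assumptions, not values from the paper.

```python
# Mean free path and Boltzmann barrier-crossing fraction for an ideal system of
# colliding particles (all numerical inputs are assumed, illustrative values).
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 298.0                   # temperature, K
d = 5e-9                    # effective particle diameter, m (assumed)
n = 1e24                    # number density, particles per m^3 (assumed)
E_a = 25e3 / 6.022e23       # barrier height per particle, J (assumed 25 kJ/mol)

mean_free_path = 1.0 / (math.sqrt(2.0) * math.pi * d**2 * n)
barrier_fraction = math.exp(-E_a / (k_B * T))   # fraction of collisions above E_a

print(f"mean free path ~ {mean_free_path:.2e} m")
print(f"fraction of collisions overcoming the barrier ~ {barrier_fraction:.2e}")
```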

  19. Strengthening the Role of Part-Time Faculty in Community Colleges. Example Job Description for Part-Time Faculty: Valencia College--Job Description and Essential Competencies

    Science.gov (United States)

    Center for Community College Student Engagement, 2013

    2013-01-01

    In an effort to support college conversations regarding strengthening the role of part-time faculty, this brief document presents the job description for a Valencia College part-time/adjunct professor (revised as of July 19, 2013). The description includes essential functions, qualifications, and knowledge, skills, and abilities. This is followed…

  20. Outcomes Definitions and Statistical Tests in Oncology Studies: A Systematic Review of the Reporting Consistency.

    Science.gov (United States)

    Rivoirard, Romain; Duplay, Vianney; Oriol, Mathieu; Tinquaut, Fabien; Chauvin, Franck; Magne, Nicolas; Bourmaud, Aurelie

    2016-01-01

    Quality of reporting for Randomized Clinical Trials (RCTs) in oncology was analyzed in several systematic reviews, but, in this setting, there is a paucity of data on outcome definitions and on the consistency of reporting of statistical tests in RCTs and Observational Studies (OBS). The objective of this review was to describe those two reporting aspects, for OBS and RCTs in oncology. From a list of 19 medical journals, three were retained for analysis, after a random selection: British Medical Journal (BMJ), Annals of Oncology (AoO) and British Journal of Cancer (BJC). All original articles published between March 2009 and March 2014 were screened. Only studies whose main outcome was accompanied by a corresponding statistical test were included in the analysis. Studies based on censored data were excluded. The primary outcome was to assess the quality of reporting of the description of the primary outcome measure in RCTs and of the variables of interest in OBS. A logistic regression was performed to identify covariates of studies potentially associated with concordance of tests between the Methods and Results sections. 826 studies were included in the review, and 698 were OBS. Variables were described in the Methods section for all OBS studies, and the primary endpoint was clearly detailed in the Methods section for 109 RCTs (85.2%). 295 OBS (42.2%) and 43 RCTs (33.6%) had perfect agreement between the statistical tests reported in the Methods and Results sections. In multivariable analysis, the variable "number of included patients in the study" was associated with test consistency: the adjusted odds ratio (aOR) for the third group compared with the first group was aOR Grp3 = 0.52 [0.31-0.89] (P value = 0.009). Variables in OBS and the primary endpoint in RCTs are reported and described with a high frequency. However, consistency of statistical tests between the Methods and Results sections of OBS is not always achieved. Therefore, we encourage authors and peer reviewers to verify consistency of statistical tests in oncology studies.

  1. Statistics in action a Canadian outlook

    CERN Document Server

    Lawless, Jerald F

    2014-01-01

    Commissioned by the Statistical Society of Canada (SSC), Statistics in Action: A Canadian Outlook helps both general readers and users of statistics better appreciate the scope and importance of statistics. It presents the ways in which statistics is used while highlighting key contributions that Canadian statisticians are making to science, technology, business, government, and other areas. The book emphasizes the role and impact of computing in statistical modeling and analysis, including the issues involved with the huge amounts of data being generated by automated processes. The first two c

  2. The nature of statistics

    CERN Document Server

    Wallis, W Allen

    2014-01-01

    Focusing on everyday applications as well as those of scientific research, this classic of modern statistical methods requires little to no mathematical background. Readers develop basic skills for evaluating and using statistical data. Lively, relevant examples include applications to business, government, social and physical sciences, genetics, medicine, and public health. "W. Allen Wallis and Harry V. Roberts have made statistics fascinating." - The New York Times. "The authors have set out, with considerable success, to write a text which would be of interest and value to the student who,

  3. Statistical analysis of vehicle crashes in Mississippi based on crash data from 2010 to 2014.

    Science.gov (United States)

    2017-08-15

    Traffic crash data from 2010 to 2014 were collected by Mississippi Department of Transportation (MDOT) and extracted for the study. Three tasks were conducted in this study: (1) geographic distribution of crashes; (2) descriptive statistics of crash ...

  4. Statistical properties of superimposed stationary spike trains.

    Science.gov (United States)

    Deger, Moritz; Helias, Moritz; Boucsein, Clemens; Rotter, Stefan

    2012-06-01

    The Poisson process is an often employed model for the activity of neuronal populations. It is known, though, that superpositions of realistic, non-Poisson spike trains are not in general Poisson processes, not even for large numbers of superimposed processes. Here we construct superimposed spike trains from intracellular in vivo recordings from rat neocortex neurons and compare their statistics to specific point process models. The constructed superimposed spike trains reveal strong deviations from the Poisson model. We find that superpositions of model spike trains that take the effective refractoriness of the neurons into account yield a much better description. A minimal model of this kind is the Poisson process with dead-time (PPD). For this process, and for superpositions thereof, we obtain analytical expressions for some second-order statistical quantities, such as the count variability, inter-spike interval (ISI) variability and ISI correlations, and demonstrate the match with the in vivo data. We conclude that effective refractoriness is the key property that shapes the statistical properties of the superposition spike trains. We present new, efficient algorithms to generate superpositions of PPDs and of gamma processes that can be used to provide more realistic background input in simulations of networks of spiking neurons. Using these generators, we show in simulations that neurons which receive superimposed spike trains as input are highly sensitive to the statistical effects induced by neuronal refractoriness.
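
    A minimal sketch of the kind of generator described here is given below: it simulates spike trains from a Poisson process with dead-time (PPD) and superimposes them. The rate, dead time and duration are illustrative, and this is not the authors' implementation.

```python
# Simulate PPD spike trains (refractory period, then exponential waiting time)
# and superimpose them; parameters are illustrative assumptions.
import numpy as np

def ppd_spike_train(rate, dead_time, duration, rng):
    """Spike times on [0, duration) for a PPD with overall rate (Hz) and dead time (s)."""
    # For a PPD with overall rate r and dead time d, the exponential part has
    # rate lambda = r / (1 - r * d), since the mean ISI is d + 1/lambda.
    lam = rate / (1.0 - rate * dead_time)
    times, t = [], 0.0
    while True:
        t += dead_time + rng.exponential(1.0 / lam)
        if t >= duration:
            return np.array(times)
        times.append(t)

rng = np.random.default_rng(0)
trains = [ppd_spike_train(rate=10.0, dead_time=0.003, duration=5.0, rng=rng)
          for _ in range(50)]
superposition = np.sort(np.concatenate(trains))
isis = np.diff(superposition)
print(f"CV of superposition ISIs: {isis.std() / isis.mean():.3f}")
```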

  5. Common pitfalls in statistical analysis: “P” values, statistical significance and confidence intervals

    Science.gov (United States)

    Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc

    2015-01-01

    In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the 'P' value, explain the importance of 'confidence intervals' and clarify the importance of including both values in a paper. PMID:25878958
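
    As a toy illustration of reporting both quantities side by side, the snippet below computes a P value and a 95% confidence interval for a difference in means on simulated data; the numbers are made up and are not from the article.

```python
# P value and 95% confidence interval for a mean difference (simulated data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
group_a = rng.normal(10.0, 2.0, 40)
group_b = rng.normal(11.0, 2.0, 40)

t_stat, p_value = stats.ttest_ind(group_a, group_b)   # pooled-variance t-test

n_a, n_b = len(group_a), len(group_b)
diff = group_b.mean() - group_a.mean()
pooled_var = ((n_a - 1) * group_a.var(ddof=1) +
              (n_b - 1) * group_b.var(ddof=1)) / (n_a + n_b - 2)
se = np.sqrt(pooled_var * (1 / n_a + 1 / n_b))
t_crit = stats.t.ppf(0.975, n_a + n_b - 2)
ci_low, ci_high = diff - t_crit * se, diff + t_crit * se
print(f"P = {p_value:.3f}; 95% CI for the mean difference: ({ci_low:.2f}, {ci_high:.2f})")
```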

  6. Statistical symmetries in physics

    International Nuclear Information System (INIS)

    Green, H.S.; Adelaide Univ., SA

    1994-01-01

    Every law of physics is invariant under some group of transformations and is therefore the expression of some type of symmetry. Symmetries are classified as geometrical, dynamical or statistical. At the most fundamental level, statistical symmetries are expressed in the field theories of the elementary particles. This paper traces some of the developments from the discovery of Bose statistics, one of the two fundamental symmetries of physics. A series of generalizations of Bose statistics is described. A supersymmetric generalization accommodates fermions as well as bosons, and further generalizations, including parastatistics, modular statistics and graded statistics, accommodate particles with properties such as 'colour'. A factorization of elements of ggl(n_b, n_f) can be used to define truncated boson operators. A general construction is given for q-deformed boson operators, and explicit constructions of the same type are given for various 'deformed' algebras. A summary is given of some of the applications and potential applications. 39 refs., 2 figs

  7. Statistical shape analysis with applications in R

    CERN Document Server

    Dryden, Ian L

    2016-01-01

    A thoroughly revised and updated edition of this introduction to modern statistical methods for shape analysis Shape analysis is an important tool in the many disciplines where objects are compared using geometrical features. Examples include comparing brain shape in schizophrenia; investigating protein molecules in bioinformatics; and describing growth of organisms in biology. This book is a significant update of the highly-regarded `Statistical Shape Analysis’ by the same authors. The new edition lays the foundations of landmark shape analysis, including geometrical concepts and statistical techniques, and extends to include analysis of curves, surfaces, images and other types of object data. Key definitions and concepts are discussed throughout, and the relative merits of different approaches are presented. The authors have included substantial new material on recent statistical developments and offer numerous examples throughout the text. Concepts are introduced in an accessible manner, while reta...

  8. Statistical thermodynamics

    CERN Document Server

    Schrödinger, Erwin

    1952-01-01

    Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics - classical, quantum, Bose-Einstein, Fermi-Dirac, and more. The work also includes discussions of the Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, the problem of radiation, and much more.

  9. Reports on Cancer - Cancer Statistics

    Science.gov (United States)

    Interactive tools for access to statistics for a cancer site by gender, race, ethnicity, calendar year, age, state, county, stage, and histology. Statistics include incidence, mortality, prevalence, cost, risk factors, behaviors, tobacco use, and policies and are presented as graphs, tables, or maps.

  10. Energy Statistics

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the annually issued publication Energiatilastot - Energy Statistics, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources

  11. Fisher statistics for analysis of diffusion tensor directional information.

    Science.gov (United States)

    Hutchinson, Elizabeth B; Rutecki, Paul A; Alexander, Andrew L; Sutula, Thomas P

    2012-04-30

    A statistical approach is presented for the quantitative analysis of diffusion tensor imaging (DTI) directional information using Fisher statistics, which were originally developed for the analysis of vectors in the field of paleomagnetism. In this framework, descriptive and inferential statistics have been formulated based on the Fisher probability density function, a spherical analogue of the normal distribution. The Fisher approach was evaluated for investigation of rat brain DTI maps to characterize tissue orientation in the corpus callosum, fornix, and hilus of the dorsal hippocampal dentate gyrus, and to compare directional properties in these regions following status epilepticus (SE) or traumatic brain injury (TBI) with values in healthy brains. Direction vectors were determined for each region of interest (ROI) for each brain sample, and Fisher statistics were applied to calculate the mean direction vector and variance parameters in the corpus callosum, fornix, and dentate gyrus of normal rats and rats that experienced TBI or SE. Hypothesis testing was performed by calculation of Watson's F-statistic and the associated p-value, giving the likelihood that grouped observations were from the same directional distribution. In the fornix and midline corpus callosum, no directional differences were detected between groups; in the hilus, however, significant directional differences were detected, illustrating the usefulness of this framework for the statistical comparison of tissue structural orientation. Copyright © 2012 Elsevier B.V. All rights reserved.
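
    The descriptive part of this framework reduces to simple vector sums. The sketch below (on synthetic unit vectors, not the study's DTI data) computes the Fisher mean direction, the mean resultant length, and a large-sample estimate of the concentration parameter kappa.

```python
# Descriptive Fisher statistics for directional (unit-vector) data; the vectors
# here are synthetic and scattered around the z-axis for illustration.
import numpy as np

rng = np.random.default_rng(2)
v = rng.normal([0.0, 0.0, 1.0], 0.15, size=(100, 3))
v /= np.linalg.norm(v, axis=1, keepdims=True)     # normalize to unit vectors

resultant = v.sum(axis=0)
R = np.linalg.norm(resultant)                     # resultant length
mean_direction = resultant / R                    # Fisher mean direction (unit vector)
n = len(v)
kappa = (n - 1) / (n - R)                         # large-sample concentration estimate

print(f"mean direction = {np.round(mean_direction, 3)}")
print(f"mean resultant length = {R / n:.3f}, kappa = {kappa:.1f}")
```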

  12. Descriptive analysis of individual and community factors among African American youths in urban public housing.

    Science.gov (United States)

    Nebbitt, Von E; Williams, James Herbert; Lombe, Margaret; McCoy, Henrika; Stephens, Jennifer

    2014-07-01

    African American adolescents are disproportionately represented in urban public housing developments. These neighborhoods are generally characterized by high rates of poverty, crime, violence, and disorganization. Although evidence is emerging on youths in these communities, little is known about their depressive symptoms, perceived efficacy, or frequency of substance use and sex-risk behavior. Further, even less is known about their exposure to community and household violence, their parents' behavior, or their sense of connection to their communities. Using a sample of 782 African American adolescents living in public housing neighborhoods located in four large U.S. cities, this article attempts to rectify the observed gap in knowledge by presenting a descriptive overview of their self-reported depressive symptoms; self-efficacy; frequencies of delinquent and sexual-risk behavior; and alcohol, tobacco, and other drug use. The self-reported ratings of their parents' behavior as well as their exposure to community and household violence are presented. Analytic procedures include descriptive statistics and mean comparisons between genders and across research cities. Results suggest several differences between genders and across research sites. However, results are not very different from national data. Implications for social work practice are discussed.

  13. Statistical description of flume experiments on mixed-size bed-load transport and bed armoring processes

    Science.gov (United States)

    Chen, D.; Zhang, Y.

    2008-12-01

    The objective of this paper is to describe the statistical properties of experiments on non-uniform bed-load transport as well as the mechanism of bed armoring processes. Despite substantial effort made over the last two decades, the ability to compute the bed-load flux in a turbulent system remains poor. The major obstacles include the poor understanding of the formation of armor layers on bed surfaces. Such a layer is much more resistant to flow than the underlying material and therefore significantly inhibits sediment transport from the reach. To study the problem, we conducted a flume study for mixed sand/gravel sediments. We observed that aggregated sediment blocks were the most common feature of armor layers: the largest sizes resist hydraulic forces, while the smaller sizes add interlocking support and prevent loss of fine material through gaps between the larger particles. Fractional transport rates in the presence of armor layers were measured over time by trapping sediment at the end of the flume. To address the intermittent and time-varying behavior of bed-load transport during bed armoring processes, we investigated the probability distribution of the fractional bed-load transport rates and the underlying dynamic model derived from the continuous time random walk framework.

  14. Statistical mechanics of directed models of polymers in the square lattice

    International Nuclear Information System (INIS)

    Rensburg, E J Janse van

    2003-01-01

    Directed square lattice models of polymers and vesicles have received considerable attention in the recent mathematical and physical sciences literature. These are idealized geometric directed lattice models introduced to study phase behaviour in polymers, and include Dyck paths, partially directed paths, directed trees and directed vesicles models. Directed models are closely related to models studied in the combinatorics literature (and are often exactly solvable). They are also simplified versions of a number of statistical mechanics models, including the self-avoiding walk, lattice animals and lattice vesicles. The exchange of approaches and ideas between statistical mechanics and combinatorics has considerably advanced the description and understanding of directed lattice models, and this will be explored in this review. The combinatorial nature of directed lattice path models makes a study using generating function approaches most natural. In contrast, the statistical mechanics approach would introduce partition functions and free energies, and then investigate these using the general framework of critical phenomena. Generating function and statistical mechanics approaches are closely related. For example, questions regarding the limiting free energy may be approached by considering the radius of convergence of a generating function, and the scaling properties of thermodynamic quantities are related to the asymptotic properties of the generating function. In this review the methods for obtaining generating functions and determining free energies in directed lattice path models of linear polymers are presented. These methods include decomposition methods leading to functional recursions, as well as the Temperley method (which is implemented by building a combinatorial object one slice at a time). A constant term formulation of the generating function will also be reviewed. The thermodynamic features and critical behaviour in models of directed paths may be
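
    A standard worked example of the generating function approach sketched above, not quoted from the review, is the case of Dyck paths: a nonempty Dyck path factors uniquely into an up-step, a Dyck path, a down-step and a second Dyck path, which gives a functional equation and a closed form for the generating function.

```latex
\[
  D(x) = 1 + x\,D(x)^{2}
  \quad\Longrightarrow\quad
  D(x) = \frac{1-\sqrt{1-4x}}{2x}
       = \sum_{n\ge 0} \frac{1}{n+1}\binom{2n}{n}\, x^{n}.
\]
```

    Here the coefficient of x^n (a Catalan number) counts Dyck paths of length 2n, and the limiting free energy of the corresponding polymer model is read off from the radius of convergence, x_c = 1/4.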

  15. Quantum mechanics of the fractional-statistics gas: Random-phase approximation

    International Nuclear Information System (INIS)

    Dai, Q.; Levy, J.L.; Fetter, A.L.; Hanna, C.B.; Laughlin, R.B.

    1992-01-01

    A description of the fractional-statistics gas based on the complete summation of Hartree, Fock, ladder and bubble diagrams is presented. The superfluid properties identified previously in the random-phase-approximation (RPA) calculation of Fetter, Hanna, and Laughlin [Phys. Rev. B 39, 9679 (1989)] are substantially confirmed. The discrepancy between the RPA sound speed and the Hartree-Fock bulk modulus is found to be eliminated. The unusual Hall-effect behavior is found to vanish for the Bose gas test case but not for the fractional-statistics gas, implying that it is physically correct. Excellent agreement is obtained with the collective-mode dispersion obtained numerically by Xie, He, and Das Sarma [Phys. Rev. Lett. 65, 649 (1990)].

  16. Selecting the most appropriate inferential statistical test for your quantitative research study.

    Science.gov (United States)

    Bettany-Saltikov, Josette; Whittaker, Victoria Jane

    2014-06-01

    To discuss the issues and processes relating to the selection of the most appropriate statistical test. A review of the basic research concepts together with a number of clinical scenarios is used to illustrate this. Quantitative nursing research generally features the use of empirical data, which necessitates the selection of both descriptive statistics and inferential statistical tests. Different types of research questions can be answered by different types of research designs, which in turn need to be matched to a specific statistical test(s). Discursive paper. This paper discusses the issues relating to the selection of the most appropriate statistical test and makes some recommendations as to how these might be dealt with. When conducting empirical quantitative studies, a number of key issues need to be considered. Considerations for selecting the most appropriate statistical tests are discussed and flow charts provided to facilitate this process. When nursing clinicians and researchers conduct quantitative research studies, it is crucial that the most appropriate statistical test is selected to enable valid conclusions to be made. © 2013 John Wiley & Sons Ltd.
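
    As a toy illustration only (the article's flow charts are more detailed, and the branching rules below are simplified assumptions), this kind of test-selection logic can be encoded as a small function.

```python
# Simplified test-selection helper; the branching rules are illustrative, not
# a reproduction of the article's flow charts.
def suggest_test(outcome, groups, paired, normal):
    """outcome: 'continuous' or 'categorical'; groups: int; paired/normal: bool."""
    if outcome == "categorical":
        return "McNemar's test" if paired else "Chi-square test"
    if groups == 2:
        if paired:
            return "Paired t-test" if normal else "Wilcoxon signed-rank test"
        return "Independent t-test" if normal else "Mann-Whitney U test"
    # three or more groups
    if normal:
        return "Repeated-measures ANOVA" if paired else "One-way ANOVA"
    return "Friedman test" if paired else "Kruskal-Wallis test"

print(suggest_test("continuous", groups=2, paired=False, normal=True))
```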

  17. Description logic rules

    CERN Document Server

    Krötzsch, M

    2010-01-01

    Ontological modelling today is applied in many areas of science and technology, including the Semantic Web. The W3C standard OWL defines one of the most important ontology languages based on the semantics of description logics. An alternative is to use rule languages in knowledge modelling, as proposed in the W3C's RIF standard. So far, it has often been unclear how to combine both technologies without sacrificing essential computational properties. This book explains this problem and presents new solutions that have recently been proposed. Extensive introductory chapters provide the necessary

  18. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    Science.gov (United States)

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.
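
    The basic observed statistic is simple to compute; the sketch below shows a POI estimate with a binomial (Wilson) confidence interval for hypothetical replicate counts, not data from the report.

```python
# Probability of identification (POI) as a proportion of replicates identified,
# with a Wilson binomial confidence interval (counts are hypothetical).
from statsmodels.stats.proportion import proportion_confint

identified, replicates = 28, 30
poi = identified / replicates
low, high = proportion_confint(identified, replicates, alpha=0.05, method='wilson')
print(f"POI = {poi:.3f}, 95% CI ({low:.3f}, {high:.3f})")
```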

  19. The Incoming Statistical Knowledge of Undergraduate Majors in a Department of Mathematics and Statistics

    Science.gov (United States)

    Cook, Samuel A.; Fukawa-Connelly, Timothy

    2016-01-01

    Studies have shown that at the end of an introductory statistics course, students struggle with building block concepts, such as mean and standard deviation, and rely on procedural understandings of the concepts. This study aims to investigate the understandings of entering freshmen in a department of mathematics and statistics (including mathematics…

  20. Statistical physics

    CERN Document Server

    Guénault, Tony

    2007-01-01

    In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...

  1. Statistics for Finance

    DEFF Research Database (Denmark)

    Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard

    Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics that rarely connect concepts to data and books on econometrics and time series analysis that do not cover specific problems related to option valuation. The book discusses applications of financial derivatives pertaining to risk assessment and elimination. The authors cover various statistical and mathematical techniques, including linear and nonlinear time series analysis, stochastic calculus models, stochastic differential equations, Itō’s formula, the Black–Scholes model, the generalized method-of-moments, and the Kalman filter. They explain how these tools are used to price financial derivatives...
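
    As one concrete example of the material listed above, the Black-Scholes price of a European call can be computed in a few lines; the inputs below are arbitrary illustrative values, not an example from the book.

```python
# Black-Scholes price of a European call option (no dividends); inputs are
# illustrative values only.
from math import log, sqrt, exp
from statistics import NormalDist

def black_scholes_call(S, K, T, r, sigma):
    """Spot S, strike K, maturity T (years), risk-free rate r, volatility sigma."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf
    return S * N(d1) - K * exp(-r * T) * N(d2)

print(f"Call price: {black_scholes_call(S=100, K=105, T=1.0, r=0.02, sigma=0.25):.2f}")
```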

  2. Investigation of the statistical properties of light

    International Nuclear Information System (INIS)

    Jensen, A.S.

    1976-09-01

    The report describes the coherence properties and the statistical properties of light from a classical and quantum mechanical point of view. The theoretical part is used to describe some more specific examples within the field of light scattering, such as light scattering from a collection of independent scatterers, measurement of wind velocity, and single burst detection in a laser-Doppler velocimeter system, where the description gives a figure of merit for such a system. An experimental investigation was made of Brillouin scattering in some organic liquids. The experimental equipment is described and the results reported. (Auth.)

  3. Statistical Challenges in Military Research

    Science.gov (United States)

    2016-07-30

    SGVU SUBJECT: Professional Presentation Approval 8 APR2016 1. Your paper, entitled Statistical Challenges in Military Research presented at Joint...Statistical Meetings, Chicago, IL 30 July - 4 Aug 2016 and Proceedings of the Joint Statistical Meetings with MDWI 41-108, and has been assigned ...charges (to include costs for tables and black and white photos). We cannot pay for reprints. If you are 59 MDW staff member, we can forward your request

  4. Frontiers in statistical quality control

    CERN Document Server

    Wilrich, Peter-Theodor

    2004-01-01

    This volume treats the four main categories of Statistical Quality Control: General SQC Methodology, On-line Control including Sampling Inspection and Statistical Process Control, Off-line Control with Data Analysis and Experimental Design, and fields related to Reliability. Experts of international reputation present their newest contributions.

  5. 78 FR 13072 - Seventh Annual Drug Information Association/Food and Drug Administration Statistics Forum-2013...

    Science.gov (United States)

    2013-02-26

    ... establish an ongoing dialogue regarding FDA's ``Critical Path'' initiative--emphasizing the regulatory and... application of statistical methodologies and thinking to the development of new therapeutic biologics and... improving the communication between industry statisticians and FDA reviewers. A description of the planned...

  6. Collective doorways and statistical doorways: The decay properties of giant multipole resonances

    International Nuclear Information System (INIS)

    Dias, H.; Hussein, M.S.; Adhikari, S.K.

    1985-01-01

    A theoretical framework for the description of the decay of giant multipole resonances is developed. It is shown that the statistical decay of the GMR is not necessarily described by the Hauser-Feshbach theory owing to the existence of a mixing parameter. The contribution of pre-equilibrium emission to the GMR decay is also discussed. (Author) [pt

  7. Construction of the descriptive system for the assessment of quality of life AQoL-6D utility instrument

    Directory of Open Access Journals (Sweden)

    Richardson Jeffrey RJ

    2012-04-01

    Background: Multi attribute utility (MAU) instruments are used to include health related quality of life (HRQoL) in economic evaluations of health programs. Comparative studies suggest different MAU instruments measure related but different constructs. The objective of this paper is to describe the methods employed to achieve content validity in the descriptive system of the Assessment of Quality of Life (AQoL-6D) MAU instrument. Methods: The AQoL program introduced the use of psychometric methods in the construction of health related MAU instruments. To develop the AQoL-6D we selected 112 items from previous research, focus groups and expert judgment and administered them to 316 members of the public and 302 hospital patients. The search for content validity across a broad spectrum of health states required both formative and reflective modelling. We employed Exploratory Factor Analysis and Structural Equation Modelling (SEM) to meet these dual requirements. Results and Discussion: The resulting instrument employs 20 items in a multi-tier descriptive system. Latent dimension variables achieve sensitive descriptions of 6 dimensions which, in turn, combine to form a single latent QoL variable. Diagnostic statistics from the SEM analysis are exceptionally good and confirm the hypothesised structure of the model. Conclusions: The AQoL-6D descriptive system has good psychometric properties. They imply that the instrument has achieved construct validity and provides a sensitive description of HRQoL. This means that it may be used with confidence for measuring health related quality of life and that it is a suitable basis for modelling utilities for inclusion in the economic evaluation of health programs.

  8. A taxonomic revision of the Cymindis (Pinacodera) limbata species group (Coleoptera, Carabidae, Lebiini), including description of a new species from Florida, U.S.A.

    Directory of Open Access Journals (Sweden)

    Wesley Hunting

    2013-01-01

    The Cymindis (Pinacodera) limbata species group (Coleoptera, Carabidae, Lebiini) is a precinctive New World taxon whose range extends from portions of temperate southeastern Canada and the U.S.A. through the montane regions of Mexico, south to the Isthmus of Tehuantepec. The group is distinguishable from all other members of the subgenus Pinacodera by males possessing a distinctive sclerite (endophallic plate) at the apex of the endophallus. In the past, a lack of material and misunderstandings of the range of variation within species have contributed to confusion about how many species there really are. This revision of the limbata species group includes a classification, a key to groups within the subgenus Pinacodera and species within the limbata group, descriptions of species, re-rankings and new synonymies. In total 10 taxa are treated, with 6 new synonyms proposed, 1 new combination introduced and 1 new species described: Cymindis (Pinacodera) rufostigma (type locality: Archbold Biological Station, Highlands County, Florida, U.S.A.). Each taxon is characterized in terms of structural features of adults, habitat, geographical distribution, and chorological affinities. Available ecological information and treatments of variation are included.

  9. An efficient direct solver for rarefied gas flows with arbitrary statistics

    International Nuclear Information System (INIS)

    Diaz, Manuel A.; Yang, Jaw-Yen

    2016-01-01

    A new numerical methodology with a unified treatment is presented to solve the Boltzmann–BGK equation of gas dynamics for classical and quantum gases described by Bose–Einstein and Fermi–Dirac statistics. Utilizing a class of globally-stiffly-accurate implicit–explicit Runge–Kutta schemes for the temporal evolution, together with the discrete ordinate method for the quadratures in momentum space and the weighted essentially non-oscillatory method for the spatial discretization, the proposed scheme is asymptotic-preserving and requires neither a non-linear solver nor knowledge of the fugacity and temperature to capture the flow structures in the hydrodynamic (Euler) limit. The proposed treatment overcomes the limitations found in the work by Yang and Muljadi (2011) [33] arising from the non-linear nature of the quantum relations, and can be applied to the dynamics of a gas with internal degrees of freedom, with correct values of the ratio of specific heats, for flow regimes at all Knudsen numbers and energy wavelengths. The present methodology is numerically validated against the unified treatment on the one-dimensional shock tube problem and the two-dimensional Riemann problems for gases of arbitrary statistics. Descriptions of ideal quantum gases including rotational degrees of freedom have been successfully achieved under the proposed methodology.
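
    For orientation, a generic semiclassical Boltzmann-BGK relaxation model with an equilibrium distribution covering both quantum statistics can be written as below; the notation is illustrative and is not necessarily that used in the paper.

```latex
\[
  \frac{\partial f}{\partial t} + \mathbf{v}\cdot\nabla_{\mathbf{x}} f
  \;=\; \frac{f^{\mathrm{eq}} - f}{\tau},
  \qquad
  f^{\mathrm{eq}}(\mathbf{x},\mathbf{v},t)
  \;=\; \frac{1}{\,z^{-1}\exp\!\bigl[(\mathbf{v}-\mathbf{u})^{2}/(2RT)\bigr] + \theta\,},
\]
```

    with relaxation time tau, fugacity z and bulk velocity u, where theta = +1 gives Fermi-Dirac statistics, theta = -1 gives Bose-Einstein statistics, and theta = 0 recovers the classical Maxwell-Boltzmann limit.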

  10. Descriptive analysis of bacon smoked with Brazilian woods from reforestation: methodological aspects, statistical analysis, and study of sensory characteristics.

    Science.gov (United States)

    Saldaña, Erick; Castillo, Luiz Saldarriaga; Sánchez, Jorge Cabrera; Siche, Raúl; de Almeida, Marcio Aurélio; Behrens, Jorge H; Selani, Miriam Mabel; Contreras-Castillo, Carmen J

    2018-06-01

    The aim of this study was to perform a descriptive analysis (DA) of bacons smoked with woods from reforestation and liquid smokes in order to investigate their sensory profile. Six samples of bacon were selected: three smoked bacons with different wood species (Eucalyptus citriodora, Acacia mearnsii, and Bambusa vulgaris), two artificially smoked bacon samples (liquid smoke) and one negative control (unsmoked bacon). Additionally, a commercial bacon sample was also evaluated. DA was developed successfully, presenting a good performance in terms of discrimination, consensus and repeatability. The study revealed that the smoking process modified the sensory profile by intensifying the "saltiness" and differentiating the unsmoked from the smoked samples. The results from the current research represent the first methodological development of descriptive analysis of bacon and may be used by food companies and other stakeholders to understand the changes in sensory characteristics of bacon due to traditional smoking process. Copyright © 2018 Elsevier Ltd. All rights reserved.

  11. Statistical methods for determination of background levels for naturally occurring radionuclides in soil at a RCRA facility

    International Nuclear Information System (INIS)

    Guha, S.; Taylor, J.H.

    1996-01-01

    It is critical that summary statistics on background data, or background levels, be computed based on standardized and defensible statistical methods, because background levels are frequently used in subsequent analyses and comparisons performed by separate analysts over time. The final background for naturally occurring radionuclide concentrations in soil at a RCRA facility, and the associated statistical methods used to estimate these concentrations, are presented. The primary objective is to describe, via a case study, the statistical methods used to estimate 95% upper tolerance limits (UTL) on radionuclide background soil data sets. A 95% UTL on background samples can be used as a screening level concentration in the absence of definitive soil cleanup criteria for naturally occurring radionuclides. The statistical methods are based exclusively on EPA guidance. This paper includes an introduction, a discussion of the analytical results for the radionuclides and a detailed description of the statistical analyses leading to the determination of 95% UTLs. Soil concentrations reported are based on validated data. Data sets are categorized as surficial soil (samples collected at depths from zero to one-half foot) and deep soil (samples collected at depths of 3 to 5 feet). These data sets were tested for statistical outliers, and the underlying distributions were determined using the chi-squared goodness-of-fit test. UTLs for the data sets were then computed based on the percentage of non-detects and the appropriate best-fit distribution (lognormal, normal, or non-parametric). For data sets containing greater than approximately 50% nondetects, nonparametric UTLs were computed
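
    A sketch of one of the computations described here, a 95%/95% upper tolerance limit for lognormally distributed background data, is given below. The concentrations are synthetic, and the tolerance factor uses the standard noncentral-t formulation for one-sided normal tolerance limits applied on the log scale; the factors tabulated in EPA guidance may be presented differently.

```python
# 95% coverage / 95% confidence upper tolerance limit for lognormal data,
# computed on the log scale with a noncentral-t tolerance factor (synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
conc = rng.lognormal(mean=0.5, sigma=0.4, size=30)   # synthetic soil concentrations

y = np.log(conc)                                     # work on the log scale
n = len(y)
# One-sided tolerance factor k for 95% coverage with 95% confidence
k = stats.nct.ppf(0.95, df=n - 1, nc=stats.norm.ppf(0.95) * np.sqrt(n)) / np.sqrt(n)
utl = np.exp(y.mean() + k * y.std(ddof=1))           # back-transform to concentration units
print(f"95%/95% UTL: {utl:.2f}")
```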

  12. Statistical Analysis of Hypercalcaemia Data related to Transferability

    DEFF Research Database (Denmark)

    Frølich, Anne; Nielsen, Bo Friis

    2005-01-01

    In this report we describe statistical analysis related to a study of hypercalcaemia carried out in the Copenhagen area in the ten year period from 1984 to 1994. Results from the study have previously been published in a number of papers [3, 4, 5, 6, 7, 8, 9] and in various abstracts and posters at conferences during the late eighties and early nineties. In this report we give a more detailed description of many of the analyses and provide some new results, primarily by simultaneous studies of several databases.

  13. Statistical and theoretical research

    International Nuclear Information System (INIS)

    Anon.

    1983-01-01

    Significant accomplishments include the creation of field designs to detect population impacts, new census procedures for small mammals, and methods for designing studies to determine where, and how much of, a contaminant is present over certain landscapes. A book describing these statistical methods is currently being written and will apply to a variety of environmental contaminants, including radionuclides. PNL scientists also have devised an analytical method for predicting the success of field experiments on wild populations. Two highlights of current research are the discoveries that populations of free-roaming horse herds can double in four years and that grizzly bear populations may be substantially smaller than once thought. As stray horses become a public nuisance at DOE and other large Federal sites, it is important to determine their number. Similar statistical theory can be readily applied to other situations where wild animals are a problem of concern to other government agencies. Another book, on statistical aspects of radionuclide studies, is written specifically for researchers in radioecology

  14. On two methods of statistical image analysis

    NARCIS (Netherlands)

    Missimer, J; Knorr, U; Maguire, RP; Herzog, H; Seitz, RJ; Tellman, L; Leenders, K.L.

    1999-01-01

    The computerized brain atlas (CBA) and statistical parametric mapping (SPM) are two procedures for voxel-based statistical evaluation of PET activation studies. Each includes spatial standardization of image volumes, computation of a statistic, and evaluation of its significance. In addition,

  15. Introduction to Statistics - eNotes

    DEFF Research Database (Denmark)

    Brockhoff, Per B.; Møller, Jan Kloppenborg; Andersen, Elisabeth Wreford

    2015-01-01

    Online textbook used in the introductory statistics courses at DTU. It provides a basic introduction to applied statistics for engineers. The necessary elements from probability theory are introduced (stochastic variable, density and distribution function, mean and variance, etc.) and thereafter the most basic statistical analysis methods are presented: confidence bands, hypothesis testing, simulation, simple and multiple regression, ANOVA and analysis of contingency tables. Examples with the software R are included for all presented theory and methods.

  16. Statistics for business

    CERN Document Server

    Waller, Derek L

    2008-01-01

    Statistical analysis is essential to business decision-making and management, but the underlying theory of data collection, organization and analysis is one of the most challenging topics for business students and practitioners. This user-friendly text and CD-ROM package will help you to develop strong skills in presenting and interpreting statistical information in a business or management environment. Based entirely on using Microsoft Excel rather than more complicated applications, it includes a clear guide to using Excel with the key functions employed in the book, a glossary of terms and

  17. Quantifying scenarios to check statistical procedures

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1976-01-01

    Ways of diverting nuclear material are presented in a form that reflects the effects of the diversions on a select set of statistical accounting procedures. Twelve statistics are examined for changes in mean values under sixty diversion scenarios. Several questions about the statistics are answered using a table of quantification results. Findings include a smallest, proper subset of the set of statistics which has one or more changed mean values under each of the diversion scenarios

  18. Olkiluoto site description 2006

    International Nuclear Information System (INIS)

    Andersson, J.; Ahokas, H.; Hudson, J.A.

    2007-03-01

    subjective criteria are used for identifying faults; Regarding the rock mechanics description, presented in Chapter 5, additional data have been assessed and the overall level of rock mechanics understanding has increased compared with previous model versions; A new hydrogeological model has been developed, which is presented in Chapter 6, which places more emphasis on the hydraulic data. Pressure anomalies are now stronger drivers for including hydraulic zones in the model and the number of explicit uncertain cases is greater; The hydrogeochemical description, presented in Chapter 7, is consistent with the previous model and there is an increased level of internal consistency in the hydrogeochemical understanding. Information on gases and understanding of their origin has increased significantly. Based on these descriptions, the report also makes a second set of predictions in Chapter 9 concerning the expected geology and rock mechanics properties to be found during the excavation of the ONKALO and also predicts the hydrogeological and hydrogeochemical impacts of these excavations. Two types of predictions are made: type A predictions, that use only the latest version of the overall Site Model; and type B predictions, that also use all the data from the tunnel, which are derived from activities, such as tunnel mapping and Pilot holes. The Site Descriptive Modelling involves uncertainties and it is necessary to assess the confidence in such modelling. This has been assessed through special protocols in a technical auditing exercise, which is presented in Chapter 10. These protocols investigate whether all data have been considered and understood; where the uncertainties lie and what the potential is for alternative interpretations; whether there is sufficient consistency between disciplines and consistency with the past evolution of the site; as well as comparisons with previous model versions. Chapter 11 concludes that, overall, the uncertainty and confidence assessment

  19. Nuclear magnetic resonance provides a quantitative description of protein conformational flexibility on physiologically important time scales.

    Science.gov (United States)

    Salmon, Loïc; Bouvignies, Guillaume; Markwick, Phineus; Blackledge, Martin

    2011-04-12

    A complete description of biomolecular activity requires an understanding of the nature and the role of protein conformational dynamics. In recent years, novel nuclear magnetic resonance-based techniques that provide hitherto inaccessible detail concerning biomolecular motions occurring on physiologically important time scales have emerged. Residual dipolar couplings (RDCs) provide precise information about time- and ensemble-averaged structural and dynamic processes with correlation times up to the millisecond and thereby encode key information for understanding biological activity. In this review, we present the application of two very different approaches to the quantitative description of protein motion using RDCs. The first is purely analytical, describing backbone dynamics in terms of diffusive motions of each peptide plane, using extensive statistical analysis to validate the proposed dynamic modes. The second is based on restraint-free accelerated molecular dynamics simulation, providing statistically sampled free energy-weighted ensembles that describe conformational fluctuations occurring on time scales from pico- to milliseconds, at atomic resolution. Remarkably, the results from these two approaches converge closely in terms of distribution and absolute amplitude of motions, suggesting that this kind of combination of analytical and numerical models is now capable of providing a unified description of protein conformational dynamics in solution.

  20. Standarized input for Hanford environmental impact statements. Part II: site description

    International Nuclear Information System (INIS)

    Jamison, J.D.

    1982-07-01

    Information is presented under the following section headings: summary description; location and physiography; geology; seismology; hydrology; meteorology; ecology; demography and land use; and radiological condition. Five appendixes are included on the 100N, 200 east, 200 west, 300, and 400 areas. This report is intended to provide a description of the Hanford Site against which the environmental impacts of new projects at Hanford can be assessed. It is expected that the summary description amplified with material from the appropriate appendix, will serve as the basic site description section of environmental impact statements prepared to address the requirements of the National Environmental Policy Act

  1. Point vortex description of drift wave vortices: Dynamics and transport

    International Nuclear Information System (INIS)

    Kono, M.; Horton, W.

    1991-05-01

    Point-vortex description for drift wave vortices is formulated based on the Hasegawa-Mima equation to study elementary processes for the interactions of vortices as well as statistical properties like vortex diffusion. Dynamical properties of drift wave vortices known by numerical experiments are recovered. Furthermore a vortex diffusion model discussed by Horton based on numerical simulations is shown to be analytically obtained. A variety of phenomena arising from the short-range nature of the interaction force of point vortices are suggested. 12 refs., 10 figs

  2. Introduction to statistics and data analysis with exercises, solutions and applications in R

    CERN Document Server

    Heumann, Christian; Shalabh

    2016-01-01

    This introductory statistics textbook conveys the essential concepts and tools needed to develop and nurture statistical thinking. It presents descriptive, inductive and explorative statistical methods and guides the reader through the process of quantitative data analysis. In the experimental sciences and interdisciplinary research, data analysis has become an integral part of any scientific study. Issues such as judging the credibility of data, analyzing the data, evaluating the reliability of the obtained results and finally drawing the correct and appropriate conclusions from the results are vital. The text is primarily intended for undergraduate students in disciplines like business administration, the social sciences, medicine, politics, macroeconomics, etc. It features a wealth of examples, exercises and solutions with computer code in the statistical programming language R as well as supplementary material that will enable the reader to quickly adapt all methods to their own applications.

  3. INCREASING STUDENTS’ WRITING SKILL TO DEVELOP IDEAS IN DESCRIPTIVE TEXT THROUGH THE USE OF INTERNET-BASED MATERIALS

    Directory of Open Access Journals (Sweden)

    Aulia Hanifah Qomar

    2017-02-01

    Full Text Available The objectives of the research are: (1) to identify whether, and to what extent, the use of internet-based materials increases students' skill in developing ideas to write descriptive text; and (2) to describe the strengths and the weaknesses of internet-based materials in this research. The Classroom Action Research was carried out at Muhammadiyah University of Metro in the third semester of the academic year 2012/2013. In collecting the data, the researcher used interviews, observations, questionnaires, diaries, documents, and tests. The data were analyzed through the Constant Comparative Method and descriptive statistics. The research findings showed that internet-based materials can increase students' writing skill in developing ideas to write descriptive text. The increase in students' writing skill includes: (1) the paragraphs written to describe something all addressed the topic; (2) the sentences written to describe something all represented the main idea of their paragraphs; (3) students showed knowledgeable, substantive development of a thesis topic relevant to the assigned topic; (4) students expressed ideas fluently and clearly, with good support, organization, logical sequencing, cohesion and correct use of the generic structure of descriptive text (identification and description); (5) students used a sophisticated range and effective choice of words and an appropriate register; (6) students produced effective complex constructions with few errors of agreement, tense, number, word order/function, articles, pronouns, and prepositions; (7) students demonstrated mastery of conventions, with few errors of spelling, punctuation, capitalization, and paragraphing. The final tests showed an increasing mean score: from 69 (pre-test) to 73 (cycle 1), 79 (cycle 2), and 81 (cycle 3), above the minimum standard of the school (72). Related to the strengths of internet

  4. Device for flattening statistically distributed pulses

    International Nuclear Information System (INIS)

    Il'kanaev, G.I.; Iskenderov, V.G.; Rudnev, O.V.; Teller, V.S.

    1976-01-01

    The description is given of a device that converts the series of statistically distributed pulses into a pseudo-uniform one. The inlet pulses switch over the first counter, and the second one is switched over by the clock pulses each time the uniformity of the counters' states is violated. This violation is recorded by the logic circuit which passes to the output the clock pulses in the amount equal to that of the pulses that reached the device inlet. Losses at the correlation between the light velocity and the sampling rate up to 0.3 do not exceed 0.7 per cent for the memory of pulse counters 3, and 0.035 per cent for memory 7
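
    The principle can be illustrated with a rough simulation (a sketch only; the memory depth of the counters and the loss accounting of the actual device are more involved): input pulses advance one counter, and a clock pulse is passed to the output whenever the output counter lags behind the input counter.

        import numpy as np

        rng = np.random.default_rng(3)
        n_ticks = 10_000
        # Statistically distributed input: on average 0.3 pulses per clock tick
        input_pulse = rng.random(n_ticks) < 0.3

        in_count = out_count = 0
        output = np.zeros(n_ticks, dtype=bool)
        for t in range(n_ticks):
            in_count += int(input_pulse[t])
            if out_count < in_count:        # counter states disagree: emit one clock pulse
                output[t] = True
                out_count += 1

        print(input_pulse.sum(), output.sum())  # equal totals; output spacing is pseudo-uniform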

  5. Statistical mechanics of dense granular media

    International Nuclear Information System (INIS)

    Coniglio, A; Fierro, A; Nicodemi, M; Ciamarra, M Pica; Tarzia, M

    2005-01-01

    We discuss some recent results on the statistical mechanics approach to dense granular media. In particular, by analytical mean field investigation we derive the phase diagram of monodisperse and bidisperse granular assemblies. We show that 'jamming' corresponds to a phase transition from a 'fluid' to a 'glassy' phase, observed when crystallization is avoided. The nature of such a 'glassy' phase turns out to be the same as found in mean field models for glass formers. This gives quantitative evidence for the idea of a unified description of the 'jamming' transition in granular media and thermal systems, such as glasses. We also discuss mixing/segregation transitions in binary mixtures and their connections to phase separation and 'geometric' effects

  6. A model for the statistical description of analytical errors occurring in clinical chemical laboratories with time.

    Science.gov (United States)

    Hyvärinen, A

    1985-01-01

    The main purpose of the present study was to describe the statistical behaviour of daily analytical errors in the dimensions of place and time, providing a statistical basis for realistic estimates of the analytical error, and hence allowing the importance of the error and the relative contributions of its different sources to be re-evaluated. The observation material consists of creatinine and glucose results for control sera measured in daily routine quality control in five laboratories for a period of one year. The observation data were processed and computed by means of an automated data processing system. Graphic representations of time series of daily observations, as well as their means and dispersion limits when grouped over various time intervals, were investigated. For partition of the total variation several two-way analyses of variance were done with laboratory and various time classifications as factors. Pooled sets of observations were tested for normality of distribution and for consistency of variances, and the distribution characteristics of error variation in different categories of place and time were compared. Errors were found from the time series to vary typically between days. Due to irregular fluctuations in general and particular seasonal effects in creatinine, stable estimates of means or of dispersions for errors in individual laboratories could not be easily obtained over short periods of time but only from data sets pooled over long intervals (preferably at least one year). Pooled estimates of proportions of intralaboratory variation were relatively low (less than 33%) when the variation was pooled within days. However, when the variation was pooled over longer intervals this proportion increased considerably, even to a maximum of 89-98% (95-98% in each method category) when an outlying laboratory in glucose was omitted, with a concomitant decrease in the interaction component (representing laboratory-dependent variation with time
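
    The partition of variation described here follows a standard two-way analysis of variance; a minimal Python sketch (with hypothetical column names for laboratory, time grouping and measured error, not the study's actual data) is:

        import pandas as pd
        import statsmodels.api as sm
        from statsmodels.formula.api import ols

        # Assumed layout: one row per daily control result with columns
        # 'error' (analytical error), 'lab' (laboratory) and 'period' (time grouping).
        df = pd.read_csv("control_results.csv")   # hypothetical file name

        model = ols("error ~ C(lab) + C(period) + C(lab):C(period)", data=df).fit()
        print(sm.stats.anova_lm(model, typ=2))    # two-way ANOVA with interaction term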

  7. A Test of Strategies for Enhanced Learning of AP Descriptive Chemistry

    Science.gov (United States)

    Kotcherlakota, Suhasini; Brooks, David W.

    2008-01-01

    The Advanced Placement (AP) Descriptive Chemistry Website allows users to practice chemistry problems. This study involved the redesign of the Website using worked examples to enhance learner performance. The population sample for the study includes users (students and teachers) interested in learning descriptive chemistry materials. The users…

  8. Renyi statistics in equilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Parvan, A.S.; Biro, T.S.

    2010-01-01

    The Renyi statistics in the canonical and microcanonical ensembles is examined both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. By the exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore it satisfies the requirements of the equilibrium thermodynamics, i.e. the thermodynamical potential of the statistical ensemble is a homogeneous function of first degree of its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamical relations, as those stemming from the Boltzmann-Gibbs statistics in this limit.
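
    For reference, the Rényi entropy on which this statistics is based has the standard form (stated here from the general literature, not quoted from the paper)

        \[
          S_q^{R} = \frac{1}{1-q}\,\ln \sum_i p_i^{\,q},
          \qquad
          \lim_{q \to 1} S_q^{R} = -\sum_i p_i \ln p_i ,
        \]

    which reduces to the Boltzmann-Gibbs (Shannon) entropy as q approaches 1.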

  9. Dynamical and statistical aspects of intermediate energy heavy ion collisions

    International Nuclear Information System (INIS)

    Knoll, J.

    1987-01-01

    The lectures presented deal with three different topics relevant for the discussion of nuclear collisions at medium to high energies. The first lecture concerns a subject of general interest, the description of statistical systems and their dynamics by the concept of missing information. It presents an excellent scope to formulate statistical theories in such a way that they carefully keep track of the known (relevant) information while maximizing the ignorance about the irrelevant, unknown information. The last two lectures deal with quite topical questions of intermediate energy heavy-ion collisions. These are the multi-fragmentation dynamics of highly excited nuclear systems, and so-called subthreshold particle production. All three subjects are self-contained, and can be read without knowledge of the others. (orig.)
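
    The missing-information idea mentioned in the first lecture amounts to maximizing the Shannon entropy subject to the known constraints; in standard form (sketched here as background, not quoted from the lectures), maximizing

        \[
          S = -\sum_i p_i \ln p_i
          \quad \text{subject to} \quad
          \sum_i p_i = 1, \qquad \sum_i p_i f_k(i) = \langle f_k \rangle ,
        \]

    yields $p_i \propto \exp\!\big(-\sum_k \lambda_k f_k(i)\big)$, where the Lagrange multipliers $\lambda_k$ encode the relevant (known) information while leaving the irrelevant degrees of freedom maximally unconstrained.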

  10. Differences in game-related statistics of basketball performance by game location for men's winning and losing teams.

    Science.gov (United States)

    Gómez, Miguel A; Lorenzo, Alberto; Barakat, Rubén; Ortega, Enrique; Palao, José M

    2008-02-01

    The aim of the present study was to identify game-related statistics that differentiate winning and losing teams according to game location. The sample included 306 games of the 2004-2005 regular season of the Spanish professional men's league (ACB League). The independent variables were game location (home or away) and game result (win or loss). The game-related statistics registered were free throws (successful and unsuccessful), 2- and 3-point field goals (successful and unsuccessful), offensive and defensive rebounds, blocks, assists, fouls, steals, and turnovers. Descriptive and inferential analyses were done (one-way analysis of variance and discriminant analysis). The multivariate analysis showed that winning teams differ from losing teams in defensive rebounds (SC = .42) and in assists (SC = .38). Similarly, winning teams differ from losing teams when they play at home in defensive rebounds (SC = .40) and in assists (SC = .41). On the other hand, winning teams differ from losing teams when they play away in defensive rebounds (SC = .44), assists (SC = .30), successful 2-point field goals (SC = .31), and unsuccessful 3-point field goals (SC = -.35). Defensive rebounds and assists were the only game-related statistics common to all three analyses.
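
    A discriminant analysis of this kind can be sketched as follows (hypothetical data layout; the structure coefficients quoted above come from the original study, not from this code):

        import pandas as pd
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        # Assumed layout: one row per team per game, with the game-related
        # statistics as columns and a binary 'win' column (1 = win, 0 = loss).
        games = pd.read_csv("acb_games.csv")      # hypothetical file name
        features = ["def_rebounds", "assists", "fg2_made", "fg3_missed",
                    "steals", "turnovers"]

        lda = LinearDiscriminantAnalysis().fit(games[features], games["win"])
        print(dict(zip(features, lda.coef_[0])))  # discriminant function weights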

  11. Energy statistics: Fourth quarter, 1989

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    This volume contains 100 tables compiling data into the following broad categories: energy, drilling, natural gas, gas liquids, oil, coal, peat, electricity, uranium, and business indicators. The types of data that are given include production and consumption statistics, reserves, imports and exports, prices, fossil fuel and nuclear power generation statistics, and price indices

  12. Hardware description languages

    Science.gov (United States)

    Tucker, Jerry H.

    1994-01-01

    Hardware description languages are special purpose programming languages. They are primarily used to specify the behavior of digital systems and are rapidly replacing traditional digital system design techniques. This is because they allow the designer to concentrate on how the system should operate rather than on implementation details. Hardware description languages allow a digital system to be described with a wide range of abstraction, and they support top down design techniques. A key feature of any hardware description language environment is its ability to simulate the modeled system. The two most important hardware description languages are Verilog and VHDL. Verilog has been the dominant language for the design of application specific integrated circuits (ASIC's). However, VHDL is rapidly gaining in popularity.

  13. ELECTRICAL SUPPORT SYSTEM DESCRIPTION DOCUMENT

    International Nuclear Information System (INIS)

    Roy, S.

    2004-01-01

    The purpose of this revision of the System Design Description (SDD) is to establish requirements that drive the design of the electrical support system and their bases to allow the design effort to proceed to License Application. This SDD is a living document that will be revised at strategic points as the design matures over time. This SDD identifies the requirements and describes the system design as they exist at this time, with emphasis on those attributes of the design provided to meet the requirements. This SDD has been developed to be an engineering tool for design control. Accordingly, the primary audience/users are design engineers. This type of SDD both 'leads' and 'trails' the design process. It leads the design process with regard to the flow down of upper tier requirements onto the system. Knowledge of these requirements is essential in performing the design process. The SDD trails the design with regard to the description of the system. The description provided in the SDD is a reflection of the results of the design process to date. Functional and operational requirements applicable to electrical support systems are obtained from the 'Project Functional and Operational Requirements' (F&OR) (Siddoway 2003). Other requirements to support the design process have been taken from higher-level requirements documents such as the 'Project Design Criteria Document' (PDC) (Doraswamy 2004), and fire hazards analyses. The above-mentioned low-level documents address 'Project Requirements Document' (PRD) (Canon and Leitner 2003) requirements. This SDD contains several appendices that include supporting information. Appendix B lists key system charts, diagrams, drawings, and lists, and Appendix C includes a list of system procedures

  14. Preliminary site description. Simpevarp area - version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Winberg, Anders [ed.

    2004-08-01

    resolution. The lithology model includes four interpreted rock domains. The deformation zone model includes 14 zones of interpreted high confidence (of existence). A discrete fracture network model has also been developed where attempts are made to assess effects on fracturing imposed by interpreted deformation zones. Furthermore, the validity of extrapolating surface fracture statistics to larger depths was explored. The rock mechanics strength model is based on information from the Aespoe Hard Rock Laboratory and an empirical, mechanical classification of data from KSH01A and at outcrops. A first model of thermal properties of the rock has been developed largely based on data from Aespoe Hard Rock Laboratory, and projections based on density and mineral content. Overall the rock at Simpevarp is characterised by a low thermal conductivity. A consequence of the planned delay in parts of the geological model is that the hydrogeological description is based solely on the version 0 regional structural model. The regional flow pattern is found to be governed by the geometry of the interpreted deformation zones in relation to the acting hydraulic gradient. Hydrogeological simulations of the groundwater evolution since the last glaciation were compared with the developed hydrogeochemical conceptual model. The conceptual model of the development of post-glacial hydrogeochemistry was updated. Also, the salinity distribution, mixing processes and the major reactions altering the groundwater composition were described down to a depth of 300 m. A first model of the transport properties of the rock was presented, although still rather immature due to lack of site-specific data in support of the model. For the near-surface, the Simpevarp subarea is characterised by a large portion of outcrop rock. There is information regarding the distribution of Quaternary deposits, and some information about the stratigraphy of the till, the latter found to be of small thickness, generally 1-3 m

  15. Calibrating the Difficulty of an Assessment Tool: The Blooming of a Statistics Examination

    Science.gov (United States)

    Dunham, Bruce; Yapa, Gaitri; Yu, Eugenia

    2015-01-01

    Bloom's taxonomy is proposed as a tool by which to assess the level of complexity of assessment tasks in statistics. Guidelines are provided for how to locate tasks at each level of the taxonomy, along with descriptions and examples of suggested test questions. Through the "Blooming" of an examination--that is, locating its constituent…

  16. Site descriptive modelling - strategy for integrated evaluation

    International Nuclear Information System (INIS)

    Andersson, Johan

    2003-02-01

    The current document establishes the strategy to be used for achieving sufficient integration between disciplines in producing Site Descriptive Models during the Site Investigation stage. The Site Descriptive Model should be a multidisciplinary interpretation of geology, rock mechanics, thermal properties, hydrogeology, hydrogeochemistry, transport properties and ecosystems using site investigation data from deep bore holes and from the surface as input. The modelling comprises the following iterative steps: evaluation of primary data, descriptive and quantitative modelling (in 3D), and overall confidence evaluation. Data are first evaluated within each discipline and then the evaluations are checked between the disciplines. Three-dimensional modelling (i.e. estimating the distribution of parameter values in space and its uncertainty) is made in a sequence, where the geometrical framework is taken from the geological model and in turn used by the rock mechanics, thermal and hydrogeological modelling etc. The three-dimensional description should present the parameters with their spatial variability over a relevant and specified scale, with the uncertainty included in this description. Different alternative descriptions may be required. After the individual discipline modelling and uncertainty assessment, a phase of overall confidence evaluation follows. Relevant parts of the different modelling teams assess the suggested uncertainties and evaluate the feedback. These discussions should assess overall confidence by checking that all relevant data are used, checking that information in past model versions is considered, checking that the different kinds of uncertainty are addressed, checking if suggested alternatives make sense and if there is potential for additional alternatives, and by discussing, if appropriate, how additional measurements (i.e. more data) would affect confidence. The findings as well as the modelling results are to be documented in a Site Description

  17. Microscopic description of rotational spectra including band-mixing. 1. Formulation in a microscopic basis

    International Nuclear Information System (INIS)

    Brut, F.; Jang, S.

    1982-05-01

    Within the framework of the projection theory of collective motion, a microscopic description of the rotational energy with band-mixing is formulated using a method based on an inverse power perturbation expansion in a quantity related to the expectation value of the operator $J_y^2$. The reliability of the present formulation is discussed in relation to the difference between the individual wave functions obtained from the variational equations which are established before and after projection. In addition to the various familiar quantities which appear in the phenomenological energy formula, such as the moment of inertia parameter, the decoupling factor and the band-mixing matrix element for $\Delta K=1$, other unfamiliar quantities having the factors with peculiar phases, $(-1)^{J+1}J(J+1)$, $(-1)^{J+3/2}(J-1/2)(J+1/2)(J+3/2)$, $(-1)^{J+1/2}(J+1/2)J(J+1)$, $(-1)^{J}J(J+1)(J-1)(J+2)$ and $[J(J+1)]^2$ are obtained. The band-mixing term for $\Delta K=2$ is also new. All these quantities are expressed in terms of two-body interactions and expectation values of the operator $J_y^m$, where $m$ is an integer, within the framework of particle-hole formalism. The difference between the moment of inertia of an even-even and a neighboring even-odd nucleus, as well as the effect of band-mixing on the moment of inertia are studied. All results are put into the forms so as to facilitate comparisons with the corresponding phenomenological terms and also for further application
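
    As background for the familiar quantities mentioned above, the phenomenological rotational energy of a band with projection quantum number K is commonly written (standard form from the literature, not taken from the paper) as

        \[
          E_K(J) = E_K + A\left[J(J+1) - K^2\right]
                   + \delta_{K,1/2}\, a\, A\, (-1)^{J+1/2}\left(J + \tfrac{1}{2}\right),
        \]

    where $A = \hbar^2/2\mathcal{J}$ is the moment-of-inertia parameter and $a$ is the decoupling factor, which contributes only to K = 1/2 bands.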

  18. 33 CFR 150.15 - What must the operations manual include?

    Science.gov (United States)

    2010-07-01

    ... containment; (iii) Connecting and disconnecting transfer equipment, including a floating hose string for a...) Connecting and disconnecting of transfer equipment, including to a floating hose string for a SPM; (iv) Line..., bolted flanges, and quick-disconnect coupling. (10) A description of the method used to water and de...

  19. Annual Statistical Supplement, 2002

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2002 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  20. Annual Statistical Supplement, 2010

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2010 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  1. Annual Statistical Supplement, 2007

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2007 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  2. Annual Statistical Supplement, 2001

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2001 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  3. Annual Statistical Supplement, 2016

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2016 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  4. Annual Statistical Supplement, 2011

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2011 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  5. Annual Statistical Supplement, 2005

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2005 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  6. Annual Statistical Supplement, 2015

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2015 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  7. Annual Statistical Supplement, 2003

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2003 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  8. Annual Statistical Supplement, 2017

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2017 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  9. Annual Statistical Supplement, 2008

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2008 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  10. Annual Statistical Supplement, 2014

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2014 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  11. Annual Statistical Supplement, 2004

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2004 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  12. Annual Statistical Supplement, 2000

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2000 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  13. Annual Statistical Supplement, 2009

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2009 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  14. Annual Statistical Supplement, 2006

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2006 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  15. Energy statistics yearbook 2002

    International Nuclear Information System (INIS)

    2005-01-01

    The Energy Statistics Yearbook 2002 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-sixth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  16. Energy statistics yearbook 2001

    International Nuclear Information System (INIS)

    2004-01-01

    The Energy Statistics Yearbook 2001 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-fifth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  17. Energy statistics yearbook 2000

    International Nuclear Information System (INIS)

    2002-01-01

    The Energy Statistics Yearbook 2000 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-third in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  18. Influences On Pupils’ Mathematical Outcome: A Description Of Challenges

    Directory of Open Access Journals (Sweden)

    Lara Said

    2006-12-01

    Full Text Available Measures of instructional behaviour are specific process indicators reflecting teacher effectiveness. The Mathematics in Primary Schools (MIPS) study poses the question: 'How are pupil progress and teachers' instructional behaviour related within schools?' This is investigated by tracking pupil progress in primary school mathematics from Year 1 to Year 2 and by collating a number of contextual/process variables situated at the pupil/classroom and school levels. The study adopts a multi-stage, stratified sample involving 1,786 pupils, 99 teachers and 40 schools, within the statistical framework of a 3-level hierarchical linear model. This paper also reports the findings of the pilot study. Although the paper is limited by the inability to use multilevel statistical techniques, due to the relatively small sample size of the pilot study and the as-yet unstandardised pupil scores on Maths 6 (NFER), the utility of this paper lies in the description of challenges faced when engaging in school effectiveness research.

  19. Index of subfactors and statistics of quantum fields. Pt. 2

    International Nuclear Information System (INIS)

    Longo, R.

    1990-01-01

    The endomorphism semigroup End(M) of an infinite factor M is endowed with a natural conjugation (modulo inner automorphisms) $\bar\rho = \rho^{-1}\cdot\gamma$, where $\gamma$ is the canonical endomorphism of $\rho(M)$ into $M$. In Quantum Field Theory, conjugate endomorphisms are shown to correspond to conjugate superselection sectors in the description of Doplicher, Haag and Roberts. On the other hand, one easily sees that conjugate endomorphisms correspond to conjugate correspondences in the setting of A. Connes. In particular we identify the canonical tower associated with the inclusion $\rho(A(O)) \subset A(O)$ relative to a sector $\rho$. As a corollary, making use of our previously established index-statistics correspondence, we completely describe, in low-dimensional theories, the statistics of a selfconjugate superselection sector $\rho$ with 3 or fewer channels, in particular with statistical dimension $d(\rho) < 2$, by obtaining the braid group representations of V. Jones and of Birman, Wenzl and Murakami. The statistics is thus described in these cases by the polynomial invariants for knots and links of Jones and Kauffman. Selfconjugate sectors are subdivided into real and pseudoreal ones and the effect of this distinction on the statistics is analyzed. The FYHLMO polynomial describes arbitrary 2-channel sectors. (orig.)
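
    The index-statistics correspondence invoked here relates the statistical dimension of a sector to the Jones index of the associated inclusion; in the form usually quoted (given for background) it reads

        \[
          d(\rho) = \left[\, M : \rho(M) \,\right]^{1/2},
        \]

    so sectors with statistical dimension $d(\rho) < 2$ correspond to inclusions of index smaller than 4.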

  20. Descriptive Topology in Selected Topics of Functional Analysis

    CERN Document Server

    Kakol, J; Pellicer, Manuel Lopez

    2011-01-01

    "Descriptive Topology in Selected Topics of Functional Analysis" is a collection of recent developments in the field of descriptive topology, specifically focused on the classes of infinite-dimensional topological vector spaces that appear in functional analysis. Such spaces include Frechet spaces, (LF)-spaces and their duals, and the space of continuous real-valued functions C(X) on a completely regular Hausdorff space X, to name a few. These vector spaces appear in functional analysis in distribution theory, differential equations, complex analysis, and various other analytical set

  1. Kinetic and dynamic probability-density-function descriptions of disperse turbulent two-phase flows

    Science.gov (United States)

    Minier, Jean-Pierre; Profeta, Christophe

    2015-11-01

    This article analyzes the status of two classical one-particle probability density function (PDF) descriptions of the dynamics of discrete particles dispersed in turbulent flows. The first PDF formulation considers only the process made up by particle position and velocity $Z_p=(x_p,U_p)$ and is represented by its PDF $p(t; y_p, V_p)$ which is the solution of a kinetic PDF equation obtained through a flux closure based on the Furutsu-Novikov theorem. The second PDF formulation includes fluid variables into the particle state vector, for example, the fluid velocity seen by particles $Z_p=(x_p,U_p,U_s)$, and, consequently, handles an extended PDF $p(t; y_p, V_p, V_s)$ which is the solution of a dynamic PDF equation. For high-Reynolds-number fluid flows, a typical formulation of the latter category relies on a Langevin model for the trajectories of the fluid seen or, conversely, on a Fokker-Planck equation for the extended PDF. In the present work, a new derivation of the kinetic PDF equation is worked out and new physical expressions of the dispersion tensors entering the kinetic PDF equation are obtained by starting from the extended PDF and integrating over the fluid seen. This demonstrates that, under the same assumption of a Gaussian colored noise and irrespective of the specific stochastic model chosen for the fluid seen, the kinetic PDF description is the marginal of a dynamic PDF one. However, a detailed analysis reveals that kinetic PDF models of particle dynamics in turbulent flows described by statistical correlations constitute incomplete stand-alone PDF descriptions and, moreover, that present kinetic-PDF equations are mathematically ill posed. This is shown to be the consequence of the non-Markovian characteristic of the stochastic process retained to describe the system and the use of an external colored noise. Furthermore, developments bring out that well-posed PDF descriptions are essentially due to a proper choice of the variables selected to describe physical systems
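
    As an illustration of the dynamic (extended-PDF) formulation, a widely used Langevin model for the fluid velocity seen by a particle takes the generic form (a simplified sketch of this model class, not the specific closure analyzed in the article)

        \[
          dU_{s,i} = -\frac{U_{s,i} - \langle U_i \rangle}{T_L}\, dt
                     + \sqrt{C_0\, \varepsilon}\; dW_i ,
        \]

    where $T_L$ is a Lagrangian time scale, $\varepsilon$ the turbulent dissipation rate, $C_0$ a model constant and $W_i$ a Wiener process; the extended PDF $p(t; y_p, V_p, V_s)$ then obeys the corresponding Fokker-Planck equation.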

  2. Generalized statistical mechanics approaches to earthquakes and tectonics

    Science.gov (United States)

    Papadakis, Giorgos; Michas, Georgios

    2016-01-01

    Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes. PMID:28119548
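
    The non-extensive framework referred to here is built on the Tsallis entropy, whose standard definition (given for background) is

        \[
          S_q = k\, \frac{1 - \sum_i p_i^{\,q}}{q - 1},
          \qquad
          \lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i ,
        \]

    so that Boltzmann-Gibbs statistical mechanics is recovered for q = 1, while values of q different from 1 accommodate the long-range interactions and scale-invariant behaviour observed in fault and earthquake populations.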

  3. Statistical mechanics of superconductivity

    CERN Document Server

    Kita, Takafumi

    2015-01-01

    This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensable for understanding it. They include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau’s Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...

  4. Basics of statistical physics

    CERN Document Server

    Müller-Kirsten, Harald J W

    2013-01-01

    Statistics links microscopic and macroscopic phenomena, and requires for this reason a large number of microscopic elements like atoms. The results are values of maximum probability or of averaging. This introduction to statistical physics concentrates on the basic principles, and attempts to explain these in simple terms supplemented by numerous examples. These basic principles include the difference between classical and quantum statistics, a priori probabilities as related to degeneracies, the vital aspect of indistinguishability as compared with distinguishability in classical physics, the differences between conserved and non-conserved elements, the different ways of counting arrangements in the three statistics (Maxwell-Boltzmann, Fermi-Dirac, Bose-Einstein), the difference between maximization of the number of arrangements of elements, and averaging in the Darwin-Fowler method. Significant applications to solids, radiation and electrons in metals are treated in separate chapters, as well as Bose-Eins...
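
    The three ways of counting arrangements mentioned above lead to the familiar mean occupation numbers of a single-particle level of energy $\varepsilon$ (standard results, stated here for reference):

        \[
          \bar{n}_{\mathrm{MB}} = e^{-(\varepsilon-\mu)/k_B T}, \qquad
          \bar{n}_{\mathrm{FD}} = \frac{1}{e^{(\varepsilon-\mu)/k_B T} + 1}, \qquad
          \bar{n}_{\mathrm{BE}} = \frac{1}{e^{(\varepsilon-\mu)/k_B T} - 1},
        \]

    with $\mu$ the chemical potential and $T$ the temperature; the Fermi-Dirac and Bose-Einstein forms reflect indistinguishability with and without the exclusion principle, and both approach the Maxwell-Boltzmann form in the dilute limit.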

  5. Statistical theory of heat

    CERN Document Server

    Scheck, Florian

    2016-01-01

    Scheck’s textbook starts with a concise introduction to classical thermodynamics, including geometrical aspects. Then a short introduction to probabilities and statistics lays the basis for the statistical interpretation of thermodynamics. Phase transitions, discrete models and the stability of matter are explained in great detail. Thermodynamics has a special role in theoretical physics. Due to the general approach of thermodynamics the field has a bridging function between several areas like the theory of condensed matter, elementary particle physics, astrophysics and cosmology. The classical thermodynamics describes predominantly averaged properties of matter, reaching from few particle systems and state of matter to stellar objects. Statistical Thermodynamics covers the same fields, but explores them in greater depth and unifies classical statistical mechanics with quantum theory of multiple particle systems. The content is presented as two tracks: the fast track for master students, providing the essen...

  6. Philosophy of statistics

    CERN Document Server

    Forster, Malcolm R

    2011-01-01

    Statisticians and philosophers of science have many common interests but restricted communication with each other. This volume aims to remedy these shortcomings. It provides state-of-the-art research in the area of philosophy of statistics by encouraging numerous experts to communicate with one another without feeling "restricted” by their disciplines or thinking "piecemeal” in their treatment of issues. A second goal of this book is to present work in the field without bias toward any particular statistical paradigm. Broadly speaking, the essays in this Handbook are concerned with problems of induction, statistics and probability. For centuries, foundational problems like induction have been among philosophers' favorite topics; recently, however, non-philosophers have increasingly taken a keen interest in these issues. This volume accordingly contains papers by both philosophers and non-philosophers, including scholars from nine academic disciplines.

  7. Straightforward statistics understanding the tools of research

    CERN Document Server

    Geher, Glenn

    2014-01-01

    Straightforward Statistics: Understanding the Tools of Research is a clear and direct introduction to statistics for the social, behavioral, and life sciences. Based on the author's extensive experience teaching undergraduate statistics, this book provides a narrative presentation of the core principles that provide the foundation for modern-day statistics. With step-by-step guidance on the nuts and bolts of computing these statistics, the book includes detailed tutorials how to use state-of-the-art software, SPSS, to compute the basic statistics employed in modern academic and applied researc

  8. Standarized input for Hanford environmental impact statements. Part II: site description

    Energy Technology Data Exchange (ETDEWEB)

    Jamison, J.D.

    1982-07-01

    Information is presented under the following section headings: summary description; location and physiography; geology; seismology; hydrology; meteorology; ecology; demography and land use; and radiological condition. Five appendixes are included on the 100N, 200 east, 200 west, 300, and 400 areas. This report is intended to provide a description of the Hanford Site against which the environmental impacts of new projects at Hanford can be assessed. It is expected that the summary description amplified with material from the appropriate appendix, will serve as the basic site description section of environmental impact statements prepared to address the requirements of the National Environmental Policy Act (NEPA).

  9. Algebraic methods in statistical mechanics and quantum field theory

    CERN Document Server

    Emch, Dr Gérard G

    2009-01-01

    This systematic algebraic approach concerns problems involving a large number of degrees of freedom. It extends the traditional formalism of quantum mechanics, and it eliminates conceptual and mathematical difficulties common to the development of statistical mechanics and quantum field theory. Further, the approach is linked to research in applied and pure mathematics, offering a reflection of the interplay between formulation of physical motivations and self-contained descriptions of the mathematical methods. The four-part treatment begins with a survey of algebraic approaches to certain phys

  10. 1997 statistical yearbook

    International Nuclear Information System (INIS)

    1998-01-01

    The international office of energy information and studies (Enerdata) has published the second edition of its 1997 statistical yearbook, which includes consolidated 1996 data with respect to the previous version from June 1997. The CD-Rom comprises the annual worldwide petroleum, natural gas, coal and electricity statistics from 1991 to 1996, with information about production, external trade, consumption, market shares, sectoral distribution of consumption and energy balance sheets. The world is divided into 12 zones (52 countries available). It also contains energy indicators: production and consumption tendencies, supply and production structures, security of supply, energy efficiency, and CO2 emissions. (J.S.)

  11. Elementary statistical physics

    CERN Document Server

    Kittel, C

    1965-01-01

    This book is intended to help physics students attain a modest working knowledge of several areas of statistical mechanics, including stochastic processes and transport theory. The areas discussed are among those forming a useful part of the intellectual background of a physicist.

  12. Scientific tourism communication in Brazil: Descriptive analysis of national journals from 1990 to 2012

    Directory of Open Access Journals (Sweden)

    Glauber Eduardo de Oliveira Santos

    2013-04-01

    Full Text Available This paper provides a descriptive analysis of 2,126 articles published in 20 Brazilian tourism journals from 1990 to 2012. It offers a comprehensive and objective picture of these journals, contributing to the debate about editorial policies, as well as to a broader understanding of the Brazilian academic research developed in this period. The study analyses the evolution of the number of published papers and descriptive statistics about the length of articles, titles and abstracts. Authors with the largest number of publications and the most recurrent keywords are identified. The integration level among journals is analyzed, pointing out which publications are closest to the center of the Brazilian tourism scientific publishing network.

  13. Transport properties site descriptive model. Guidelines for evaluation and modelling

    International Nuclear Information System (INIS)

    Berglund, Sten; Selroos, Jan-Olof

    2004-04-01

    This report describes a strategy for the development of Transport Properties Site Descriptive Models within the SKB Site Investigation programme. Similar reports have been produced for the other disciplines in the site descriptive modelling (Geology, Hydrogeology, Hydrogeochemistry, Rock mechanics, Thermal properties, and Surface ecosystems). These reports are intended to guide the site descriptive modelling, but also to provide the authorities with an overview of modelling work that will be performed. The site descriptive modelling of transport properties is presented in this report and in the associated 'Strategy for the use of laboratory methods in the site investigations programme for the transport properties of the rock', which describes laboratory measurements and data evaluations. Specifically, the objectives of the present report are to: Present a description that gives an overview of the strategy for developing Site Descriptive Models, and which sets the transport modelling into this general context. Provide a structure for developing Transport Properties Site Descriptive Models that facilitates efficient modelling and comparisons between different sites. Provide guidelines on specific modelling issues where methodological consistency is judged to be of special importance, or where there is no general consensus on the modelling approach. The objectives of the site descriptive modelling process and the resulting Transport Properties Site Descriptive Models are to: Provide transport parameters for Safety Assessment. Describe the geoscientific basis for the transport model, including the qualitative and quantitative data that are of importance for the assessment of uncertainties and confidence in the transport description, and for the understanding of the processes at the sites. Provide transport parameters for use within other discipline-specific programmes. Contribute to the integrated evaluation of the investigated sites. The site descriptive modelling of

  14. Statistics for X-chromosome associations.

    Science.gov (United States)

    Özbek, Umut; Lin, Hui-Min; Lin, Yan; Weeks, Daniel E; Chen, Wei; Shaffer, John R; Purcell, Shaun M; Feingold, Eleanor

    2018-06-13

    In a genome-wide association study (GWAS), association between genotype and phenotype at autosomal loci is generally tested by regression models. However, X-chromosome data are often excluded from published analyses of autosomes because of the difference between males and females in number of X chromosomes. Failure to analyze X-chromosome data at all is obviously less than ideal, and can lead to missed discoveries. Even when X-chromosome data are included, they are often analyzed with suboptimal statistics. Several mathematically sensible statistics for X-chromosome association have been proposed. The optimality of these statistics, however, is based on very specific simple genetic models. In addition, while previous simulation studies of these statistics have been informative, they have focused on single-marker tests and have not considered the types of error that occur even under the null hypothesis when the entire X chromosome is scanned. In this study, we comprehensively tested several X-chromosome association statistics using simulation studies that include the entire chromosome. We also considered a wide range of trait models for sex differences and phenotypic effects of X inactivation. We found that models that do not incorporate a sex effect can have large type I error in some cases. We also found that many of the best statistics perform well even when there are modest deviations, such as trait variance differences between the sexes or small sex differences in allele frequencies, from assumptions. © 2018 WILEY PERIODICALS, INC.
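
    One common X-chromosome coding discussed in this literature codes males 0/2, so that a hemizygous male is treated like a homozygous female, and includes sex as a covariate; the sketch below is only an illustration with simulated data, not a reimplementation of the statistics evaluated in the article.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n, maf = 500, 0.3
        sex = rng.integers(0, 2, n)                        # 1 = male, 0 = female
        geno = np.where(sex == 1,
                        rng.binomial(1, maf, n),           # males: 0/1 copies
                        rng.binomial(2, maf, n))           # females: 0/1/2 copies
        dose = np.where(sex == 1, 2 * geno, geno)          # males coded 0/2

        # Simulate a binary trait with a modest genotype and sex effect.
        p = 1.0 / (1.0 + np.exp(-(-0.5 + 0.4 * dose + 0.3 * sex)))
        trait = rng.binomial(1, p)

        X = sm.add_constant(np.column_stack([dose, sex]))
        fit = sm.Logit(trait, X).fit(disp=0)
        print(fit.params, fit.pvalues)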

  15. Description of surface systems. Preliminary site description Simpevarp sub area - Version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Lindborg, Tobias [ed.

    2005-03-01

    Swedish Nuclear Fuel and Waste Management Co is currently conducting site characterisation in the Simpevarp area. The area is divided into two subareas, the Simpevarp and the Laxemar subarea. The two subareas are surrounded by a common regional model area, the Simpevarp area. This report describes both the regional area and the subareas. This report is an interim version (model version 1.2) of the description of the surface systems at the Simpevarp area, and should be seen as a background report to the site description of the Simpevarp area, version 1.2, SKB-R--05-08. The basis for this description is quality-assured field data available in the SKB SICADA and GIS databases, together with generic data from the literature. The Surface system, here defined as everything above the bedrock, comprises a number of separate disciplines (e.g. hydrology, geology, topography, oceanography and ecology). Each discipline has developed descriptions and models for a number of properties that together represent the site description. The current methodology for developing the surface system description and the integration to ecosystem models is documented in a methodology strategy report SKB-R--03-06. The procedures and guidelines given in that report were followed in this report. Compared with version 1.1 of the surface system description SKB-R--04-25, this report presents considerable additional features, especially in the ecosystem description (Chapter 4) and in the description of the surface hydrology (Section 3.4). A first attempt has also been made to connect the flow of matter (carbon) between the different ecosystems into an overall ecosystem model at a landscape level. A summarised version of this report is also presented in SKB-R--05-08 together with geological-, hydrogeological-, transport properties-, thermal properties-, rock mechanics- and hydrogeochemical descriptions.

  16. Description of surface systems. Preliminary site description Simpevarp sub area - Version 1.2

    International Nuclear Information System (INIS)

    Lindborg, Tobias

    2005-03-01

    Swedish Nuclear Fuel and Waste Management Co is currently conducting site characterisation in the Simpevarp area. The area is divided into two subareas, the Simpevarp and the Laxemar subarea. The two subareas are surrounded by a common regional model area, the Simpevarp area. This report describes both the regional area and the subareas. This report is an interim version (model version 1.2) of the description of the surface systems at the Simpevarp area, and should be seen as a background report to the site description of the Simpevarp area, version 1.2, SKB-R--05-08. The basis for this description is quality-assured field data available in the SKB SICADA and GIS databases, together with generic data from the literature. The Surface system, here defined as everything above the bedrock, comprises a number of separate disciplines (e.g. hydrology, geology, topography, oceanography and ecology). Each discipline has developed descriptions and models for a number of properties that together represent the site description. The current methodology for developing the surface system description and the integration to ecosystem models is documented in a methodology strategy report SKB-R--03-06. The procedures and guidelines given in that report were followed in this report. Compared with version 1.1 of the surface system description SKB-R--04-25, this report presents considerable additional features, especially in the ecosystem description (Chapter 4) and in the description of the surface hydrology (Section 3.4). A first attempt has also been made to connect the flow of matter (carbon) between the different ecosystems into an overall ecosystem model at a landscape level. A summarised version of this report is also presented in SKB-R--05-08 together with geological-, hydrogeological-, transport properties-, thermal properties-, rock mechanics- and hydrogeochemical descriptions

  17. Testing and qualification of confidence in statistical procedures

    Energy Technology Data Exchange (ETDEWEB)

    Serghiuta, D.; Tholammakkil, J.; Hammouda, N. [Canadian Nuclear Safety Commission (Canada); O'Hagan, A. [Sheffield Univ. (United Kingdom)

    2014-07-01

    This paper discusses a framework for designing artificial test problems, evaluation criteria, and two of the benchmark tests developed under a research project initiated by the Canadian Nuclear Safety Commission to investigate the approaches for qualification of tolerance limit methods and algorithms proposed for application in optimization of CANDU regional/neutron overpower protection trip setpoints for aged conditions. A significant component of this investigation has been the development of a series of benchmark problems of gradually increasing complexity, from simple 'theoretical' problems up to complex problems closer to the real application. The first benchmark problem discussed in this paper is a simplified scalar problem which does not involve the extremal (maximum or minimum) operations typically encountered in real applications. The second benchmark is a high-dimensional, but still simple, problem for statistical inference of maximum channel power during normal operation. Bayesian algorithms have been developed for each benchmark problem to provide an independent way of constructing tolerance limits from the same data, to allow an assessment of how well different methods make use of those data and, depending on the type of application, to evaluate the level of 'conservatism'. The Bayesian method is not, however, used as a reference method, or 'gold' standard, but simply as an independent review method. The approach and the tests developed can be used as a starting point for developing a generic suite (generic in the sense of being applicable whatever the proposed statistical method) of empirical studies, with clear criteria for passing those tests. Some lessons learned, in particular concerning the need to assure the completeness of the description of the application and the role of completeness of input information, are also discussed. It is concluded that a formal process which includes extended and detailed benchmark
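
    As a generic illustration of the tolerance limit methods under discussion (not one of the CNSC benchmark problems themselves), the sketch below computes a one-sided nonparametric 95%/95% tolerance limit from the largest order statistic, the classic Wilks construction; the simulated 'channel power' sample is purely a stand-in.

```python
# Minimal sketch (generic, not the CNSC benchmarks): one-sided nonparametric
# 95%/95% tolerance limit based on the largest order statistic (Wilks' method).
import numpy as np

def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest n such that the sample maximum bounds the `coverage` quantile
    with the requested confidence: 1 - coverage**n >= confidence."""
    n = 1
    while 1 - coverage ** n < confidence:
        n += 1
    return n

n = wilks_sample_size()          # 59 for the classic 95/95 case
rng = np.random.default_rng(1)
sample = rng.normal(loc=100.0, scale=5.0, size=n)   # stand-in "channel power" runs
print(n, sample.max())           # sample.max() is the 95/95 upper tolerance limit
```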

  18. Statistical implications in Monte Carlo depletions - 051

    International Nuclear Information System (INIS)

    Zhiwen, Xu; Rhodes, J.; Smith, K.

    2010-01-01

    As a result of steady advances in computer power, continuous-energy Monte Carlo depletion analysis is attracting considerable attention for reactor burnup calculations. The typical Monte Carlo analysis is set up as a combination of a Monte Carlo neutron transport solver and a fuel burnup solver. Note that the burnup solver is a deterministic module. The statistical errors in Monte Carlo solutions are introduced into nuclide number densities and propagated along fuel burnup. This paper works towards an understanding of the statistical implications of Monte Carlo depletions, including both statistical bias and statistical variations in depleted fuel number densities. The deterministic Studsvik lattice physics code, CASMO-5, is modified to model the Monte Carlo depletion. The statistical bias in depleted number densities is found to be negligible compared to their statistical variations, which, in turn, demonstrates the correctness of the Monte Carlo depletion method. Meanwhile, the statistical variation in number densities generally increases with burnup. Several possible ways of reducing the statistical errors are discussed: 1) to increase the number of individual Monte Carlo histories; 2) to increase the number of time steps; 3) to run additional independent Monte Carlo depletion cases. Finally, a new Monte Carlo depletion methodology, called the batch depletion method, is proposed, which consists of performing a set of independent Monte Carlo depletions and is thus capable of estimating the overall statistical errors including both the local statistical error and the propagated statistical error. (authors)
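
    A minimal sketch of the batch depletion idea follows: several independent toy depletions are run and the spread of a depleted number density across batches is used as an estimate of the overall statistical error. The one-nuclide exponential depletion model and all parameter values are stand-ins, not CASMO-5 or actual Monte Carlo physics.

```python
# Minimal sketch of the "batch depletion" idea: run several independent toy
# Monte Carlo depletions and estimate the spread of a number density across
# batches. The depletion model here is a stand-in, not CASMO-5 physics.
import numpy as np

def toy_depletion(rng, steps=20, n0=1.0, sigma_phi=0.02):
    """Deplete a single nuclide with a noisy one-group reaction rate,
    mimicking statistical error in the Monte Carlo flux/cross-section."""
    n = n0
    for _ in range(steps):
        reaction_rate = 0.05 * (1.0 + sigma_phi * rng.standard_normal())
        n *= np.exp(-reaction_rate)        # simple exponential depletion step
    return n

rng = np.random.default_rng(2)
batches = np.array([toy_depletion(rng) for _ in range(50)])
print(batches.mean(), batches.std(ddof=1))  # overall statistical error estimate
```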

  19. Workshop on Analytical Methods in Statistics

    CERN Document Server

    Jurečková, Jana; Maciak, Matúš; Pešta, Michal

    2017-01-01

    This volume collects authoritative contributions on analytical methods and mathematical statistics. The methods presented include resampling techniques; the minimization of divergence; estimation theory and regression, possibly under shape or other constraints or with long memory; and iterative approximations when the optimal solution is difficult to achieve. It also investigates probability distributions with respect to their stability, heavy-tailedness, Fisher information and other aspects, both asymptotically and non-asymptotically. The book not only presents the latest mathematical and statistical methods and their extensions, but also offers solutions to real-world problems including option pricing. The selected, peer-reviewed contributions were originally presented at the workshop on Analytical Methods in Statistics, AMISTAT 2015, held in Prague, Czech Republic, November 10-13, 2015.

  20. Information theory and statistics

    CERN Document Server

    Kullback, Solomon

    1968-01-01

    Highly useful text studies logarithmic measures of information and their application to testing statistical hypotheses. Includes numerous worked examples and problems. References. Glossary. Appendix. 1968 2nd, revised edition.
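
    The logarithmic information measure at the heart of this text is the Kullback-Leibler divergence; a minimal sketch for discrete distributions is shown below (the example distributions are arbitrary).

```python
# Minimal sketch: Kullback-Leibler divergence between two discrete
# distributions, the logarithmic information measure central to this text.
import numpy as np

def kl_divergence(p, q):
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0                      # terms with p_i = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

print(kl_divergence([0.5, 0.3, 0.2], [0.4, 0.4, 0.2]))   # in nats
```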

  1. Prevalence of depressive symptoms among schoolchildren in Cyprus: a cross-sectional descriptive correlational study.

    Science.gov (United States)

    Sokratis, Sokratous; Christos, Zilides; Despo, Panagi; Maria, Karanikola

    2017-01-01

    Depressive symptoms in the young constitute a public health issue. The current study aims to estimate: (a) the frequency of depressive symptoms in a sample of final grade elementary-school children in Cyprus, (b) the association among frequency of depressive symptoms, gender and nationality, and (c) the metric properties of the Greek-Cypriot version of the children's depression inventory (CDI). A descriptive cross-sectional study with internal comparison was performed. The occurrence of depressive symptoms was assessed with the CDI, which includes 5 subscales: depressive mood, interpersonal difficulties, ineffectiveness, anhedonia and negative self-esteem. Clinical depressive symptoms were reported as CDI score ≥19. The CDI was anonymously and voluntarily completed by 439 schoolchildren [mean age 12.3 (±0.51) years old] from fifteen public elementary schools (217 boys and 222 girls), yielding a response rate of 58.2%. The metric properties of the CDI were assessed in terms of internal consistency reliability and construct validity via exploratory factor analysis (rotated and unrotated principal component analysis). Descriptive and inferential statistics were explored. Overall, 10.25% of Cypriot schoolchildren reported clinical depressive symptoms (CDI score ≥19). Statistically significant differences were reported between boys and girls in all five subscales of the CDI. Girls reported higher scores in the "Depressive mood", "Negative self-esteem" and "Anhedonia" subscales, while boys scored higher in the "Interpersonal difficulties" and "Ineffectiveness" subscales. There were no statistically significant differences among ethnicity groups regarding the entire CDI or its subscales. Concerning the metric properties of the Greek-Cypriot version of the CDI, internal consistency reliability was adequate (Cronbach's alpha = 0.84). Factor analysis with varimax rotation resulted in five factors explaining 42% of the variance. The Greek-Cypriot version of the CDI is a reliable
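
    Internal consistency reliability of the kind reported above (Cronbach's alpha = 0.84) can be computed from an item-by-respondent score matrix; the sketch below uses simulated item scores, not the CDI data.

```python
# Minimal sketch: Cronbach's alpha for internal consistency, the statistic
# reported for the Greek-Cypriot CDI (0.84); the data below are illustrative.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(3)
latent = rng.normal(size=(200, 1))
scores = latent + 0.8 * rng.normal(size=(200, 10))   # 10 correlated items
print(cronbach_alpha(scores))
```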

  2. Fractional statistics and quantum theory

    CERN Document Server

    Khare, Avinash

    1997-01-01

    This book explains the subtleties of quantum statistical mechanics in lower dimensions and their possible ramifications in quantum theory. The discussion is at a pedagogical level and is addressed to both graduate students and advanced research workers with a reasonable background in quantum and statistical mechanics. The main emphasis will be on explaining new concepts. Topics in the first part of the book include the flux tube model of anyons, the braid group and the quantum and statistical mechanics of a noninteracting anyon gas. The second part of the book provides a detailed discussion about f

  3. Fusion Engineering Device. Volume II. Design description

    International Nuclear Information System (INIS)

    1981-10-01

    This volume summarizes the design of the FED. It includes a description of the major systems and subsystems, the supporting plasma design analysis, a projected device cost and associated construction schedule, and a description of the facilities to house and support the device. This effort represents the culmination of the FY81 studies conducted at the Fusion Engineering Design Center (FEDC). Unique in these design activities has been the collaborative involvement of the Design Center personnel and numerous resource physicists from the fusion community who have made significant contributions in the physics design analysis as well as the physics support of the engineering design of the major FED systems and components

  4. Statistical physics approach to earthquake occurrence and forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Arcangelis, Lucilla de [Department of Industrial and Information Engineering, Second University of Naples, Aversa (CE) (Italy); Godano, Cataldo [Department of Mathematics and Physics, Second University of Naples, Caserta (Italy); Grasso, Jean Robert [ISTerre, IRD-CNRS-OSUG, University of Grenoble, Saint Martin d’Héres (France); Lippiello, Eugenio, E-mail: eugenio.lippiello@unina2.it [Department of Mathematics and Physics, Second University of Naples, Caserta (Italy)

    2016-04-25

    There is striking evidence that the dynamics of the Earth's crust is controlled by a wide variety of mutually dependent mechanisms acting at different spatial and temporal scales. The interplay of these mechanisms produces instabilities in the stress field, leading to abrupt energy releases, i.e., earthquakes. As a consequence, the evolution towards instability before a single event is very difficult to monitor. On the other hand, collective behavior in stress transfer and relaxation within the Earth's crust leads to emergent properties described by stable phenomenological laws for a population of many earthquakes in size, time and space domains. This observation has stimulated a statistical mechanics approach to earthquake occurrence, applying ideas and methods such as scaling laws, universality, fractal dimension and the renormalization group to characterize the physics of earthquakes. In this review we first present a description of the phenomenological laws of earthquake occurrence which represent the frame of reference for a variety of statistical mechanical models, ranging from the spring-block to more complex fault models. Next, we discuss the problem of seismic forecasting in the general framework of stochastic processes, where seismic occurrence can be described as a branching process implementing space-time-energy correlations between earthquakes. In this context we show how correlations originate from dynamical scaling relations between time and energy, able to account for universality and provide a unifying description for the phenomenological power laws. Then we discuss how branching models can be implemented to forecast the temporal evolution of the earthquake occurrence probability and allow discrimination among different physical mechanisms responsible for earthquake triggering. In particular, the forecasting problem will be presented in a rigorous mathematical framework, discussing the relevance of the processes acting at different temporal scales for
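
    A toy sketch of the branching description of seismicity is given below: background events trigger aftershocks whose magnitudes follow a Gutenberg-Richter law and whose delays follow an Omori-like power-law kernel. All parameter values are illustrative and the model is far simpler than those reviewed.

```python
# Minimal toy sketch of earthquake occurrence as a branching process:
# background events trigger aftershocks with Gutenberg-Richter magnitudes and
# Omori-like power-law delays. Parameters are illustrative, not calibrated.
import numpy as np

rng = np.random.default_rng(4)
b, branching_ratio, c, p = 1.0, 0.8, 0.01, 1.2

def gr_magnitude():
    return 2.0 + rng.exponential(1.0 / (b * np.log(10)))  # Gutenberg-Richter

def omori_delay():
    u = rng.random()                  # inverse-CDF sample of a (c + t)^-p kernel
    return c * ((1 - u) ** (-1.0 / (p - 1.0)) - 1.0)

events = [(rng.uniform(0, 100.0), gr_magnitude()) for _ in range(20)]  # background
catalog, queue = [], list(events)
while queue:
    t, m = queue.pop()
    catalog.append((t, m))
    for _ in range(rng.poisson(branching_ratio)):      # direct aftershocks
        queue.append((t + omori_delay(), gr_magnitude()))
print(len(catalog), max(m for _, m in catalog))
```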

  5. A perceptual space of local image statistics.

    Science.gov (United States)

    Victor, Jonathan D; Thengone, Daniel J; Rizvi, Syed M; Conte, Mary M

    2015-12-01

    Local image statistics are important for visual analysis of textures, surfaces, and form. There are many kinds of local statistics, including those that capture luminance distributions, spatial contrast, oriented segments, and corners. While sensitivity to each of these kinds of statistics has been well studied, much less is known about visual processing when multiple kinds of statistics are relevant, in large part because the dimensionality of the problem is high and different kinds of statistics interact. To approach this problem, we focused on binary images on a square lattice, a reduced set of stimuli that nevertheless taps many kinds of local statistics. In this 10-parameter space, we determined psychophysical thresholds to each kind of statistic (16 observers) and all of their pairwise combinations (4 observers). Sensitivities and isodiscrimination contours were consistent across observers. Isodiscrimination contours were elliptical, implying a quadratic interaction rule, which in turn determined ellipsoidal isodiscrimination surfaces in the full 10-dimensional space, and made predictions for sensitivities to complex combinations of statistics. These predictions, including the prediction of a combination of statistics that was metameric to random, were verified experimentally. Finally, check size had only a mild effect on sensitivities over the range from 2.8 to 14 min, but sensitivities to second- and higher-order statistics were substantially lower at 1.4 min. In sum, local image statistics form a perceptual space that is highly stereotyped across observers, in which different kinds of statistics interact according to simple rules. Copyright © 2015 Elsevier Ltd. All rights reserved.
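
    A minimal sketch of local statistics for binary textures on a square lattice is shown below: it tabulates the frequencies of the 16 possible 2x2 block configurations and computes a simple fourth-order parity statistic. The actual 10-parameter coordinate system used in the study is not reproduced here.

```python
# Minimal sketch: tabulate 2x2 block configurations of a binary texture on a
# square lattice and compute a simple fourth-order parity statistic (the
# even/odd balance of checks in each block). The exact 10-parameter coordinates
# used in the study are not reproduced here.
import numpy as np

rng = np.random.default_rng(5)
img = rng.integers(0, 2, size=(64, 64))

# Stack the four corners of every 2x2 block: shape (63, 63, 4).
blocks = np.stack([img[:-1, :-1], img[:-1, 1:], img[1:, :-1], img[1:, 1:]], axis=-1)
counts = np.bincount(blocks.reshape(-1, 4) @ (1 << np.arange(4)), minlength=16)
parity_stat = np.mean((-1.0) ** blocks.sum(axis=-1))   # ~0 for a random texture

print(counts)        # frequencies of the 16 possible 2x2 configurations
print(parity_stat)
```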

  6. Development of a statistical model for cervical cancer cell death with irreversible electroporation in vitro.

    Science.gov (United States)

    Yang, Yongji; Moser, Michael A J; Zhang, Edwin; Zhang, Wenjun; Zhang, Bing

    2018-01-01

    The aim of this study was to develop a statistical model for cell death by irreversible electroporation (IRE) and to show that the statistical model is more accurate than the electric field threshold model in the literature, using cervical cancer cells in vitro. The HeLa cell line was cultured and treated with different IRE protocols in order to obtain data for modeling the statistical relationship between cell death and pulse-setting parameters. In total, 340 in vitro experiments were performed with a commercial IRE pulse system, including a pulse generator and an electric cuvette. The trypan blue staining technique was used to evaluate cell death after 4 hours of incubation following IRE treatment. The Peleg-Fermi model was used in the study to build the statistical relationship using the cell viability data obtained from the in vitro experiments. A finite element model of IRE for the electric field distribution was also built. Comparison of ablation zones between the statistical model and the electric threshold model (drawn from the finite element model) was used to show the accuracy of the proposed statistical model in the description of the ablation zone and its applicability for different pulse-setting parameters. The statistical models describing the relationships between HeLa cell death and pulse length and the number of pulses, respectively, were built. The values of the curve fitting parameters were obtained using the Peleg-Fermi model for the treatment of cervical cancer with IRE. The difference in the ablation zone between the statistical model and the electric threshold model was also illustrated to show the accuracy of the proposed statistical model in the representation of the ablation zone in IRE. This study concluded that: (1) the proposed statistical model accurately described the ablation zone of IRE with cervical cancer cells, and was more accurate compared with the electric field model; (2) the proposed statistical model was able to estimate the value of electric
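
    The sketch below shows one common form of the Peleg-Fermi survival curve used for irreversible electroporation, in which the critical field and the kinetic constant decay exponentially with the number of pulses; the parameter values and the specific parameterization are illustrative assumptions, not the fitted HeLa values from this study.

```python
# Minimal sketch (illustrative parameters, not the fitted HeLa values): a
# Peleg-Fermi-type survival curve often used for IRE cell death, in which the
# critical field and the slope constant decay with the number of pulses.
import numpy as np

def peleg_fermi_survival(E, n, Ec0=1500.0, A0=300.0, k1=0.03, k2=0.03):
    """Survival fraction vs electric field E (V/cm) after n pulses."""
    Ec = Ec0 * np.exp(-k1 * n)      # critical field decreases with pulse number
    A = A0 * np.exp(-k2 * n)        # slope parameter decreases with pulse number
    return 1.0 / (1.0 + np.exp((E - Ec) / A))

E = np.linspace(0.0, 3000.0, 7)
print(peleg_fermi_survival(E, n=90))   # survival falls toward 0 above ~Ec
```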

  7. Parametric Level Statistics in Random Matrix Theory: Exact Solution

    International Nuclear Information System (INIS)

    Kanzieper, E.

    1999-01-01

    Over the past several years, the theory of non-Gaussian random matrix ensembles has experienced sound progress motivated by new ideas in quantum chromodynamics (QCD) and mesoscopic physics. Invariant non-Gaussian random matrix models appear to describe universal features of the low-energy part of the spectrum of the Dirac operator in QCD, and electron level statistics in normal conducting-superconducting hybrid structures. They also serve as a basis for constructing the toy models of universal spectral statistics expected at the edge of the metal-insulator transition. While conventional spectral statistics has received a detailed study in the context of RMT, much less is known about parametric level statistics in non-Gaussian random matrix models. In this communication we report an exact solution to the problem of parametric level statistics in unitary invariant, U(N), non-Gaussian ensembles of N x N Hermitian random matrices with either soft or strong level confinement. The solution is formulated within the framework of the orthogonal polynomial technique and is shown to depend on both the unfolded two-point scalar kernel and the level confinement through a double integral transformation which, in turn, provides a constructive tool for the description of parametric level correlations in non-Gaussian RMT. In the case of soft level confinement, the formalism developed is potentially applicable to a study of parametric level statistics in an important class of random matrix models with finite level compressibility expected to describe a disorder-induced metal-insulator transition. In random matrix ensembles with strong level confinement, the solution presented takes a particularly simple form in the thermodynamic limit: In this case, a new intriguing connection relation between the parametric level statistics and the scalar two-point kernel of an unperturbed ensemble is demonstrated to emerge. Extension of the results obtained to higher-order parametric level statistics is

  8. Statistical physics of hard optimization problems

    International Nuclear Information System (INIS)

    Zdeborova, L.

    2009-01-01

    Optimization is fundamental in many areas of science, from computer science and information theory to engineering and statistical physics, as well as to biology or social sciences. It typically involves a large number of variables and a cost function depending on these variables. Optimization problems in the non-deterministic polynomial (NP)-complete class are particularly difficult; it is believed that the number of operations required to minimize the cost function is in the most difficult cases exponential in the system size. However, even in an NP-complete problem the practically arising instances might, in fact, be easy to solve. The principal question we address in this article is: How to recognize if an NP-complete constraint satisfaction problem is typically hard and what are the main reasons for this? We adopt approaches from the statistical physics of disordered systems, in particular the cavity method developed originally to describe glassy systems. We describe new properties of the space of solutions in two of the most studied constraint satisfaction problems - random satisfiability and random graph coloring. We suggest a relation between the existence of the so-called frozen variables and the algorithmic hardness of a problem. Based on these insights, we introduce a new class of problems which we named "locked" constraint satisfaction, where the statistical description is easily solvable, but from the algorithmic point of view they are even more challenging than the canonical satisfiability.

  9. Statistical physics of hard optimization problems

    International Nuclear Information System (INIS)

    Zdeborova, L.

    2009-01-01

    Optimization is fundamental in many areas of science, from computer science and information theory to engineering and statistical physics, as well as to biology or social sciences. It typically involves a large number of variables and a cost function depending on these variables. Optimization problems in the non-deterministic polynomial-complete class are particularly difficult; it is believed that the number of operations required to minimize the cost function is in the most difficult cases exponential in the system size. However, even in a non-deterministic polynomial-complete problem the practically arising instances might, in fact, be easy to solve. The principal question we address in this article is: How to recognize if a non-deterministic polynomial-complete constraint satisfaction problem is typically hard and what are the main reasons for this? We adopt approaches from the statistical physics of disordered systems, in particular the cavity method developed originally to describe glassy systems. We describe new properties of the space of solutions in two of the most studied constraint satisfaction problems - random satisfiability and random graph coloring. We suggest a relation between the existence of the so-called frozen variables and the algorithmic hardness of a problem. Based on these insights, we introduce a new class of problems which we named 'locked' constraint satisfaction, where the statistical description is easily solvable, but from the algorithmic point of view they are even more challenging than the canonical satisfiability (Authors)

  10. Statistical physics of hard optimization problems

    Science.gov (United States)

    Zdeborová, Lenka

    2009-06-01

    Optimization is fundamental in many areas of science, from computer science and information theory to engineering and statistical physics, as well as to biology or social sciences. It typically involves a large number of variables and a cost function depending on these variables. Optimization problems in the non-deterministic polynomial (NP)-complete class are particularly difficult; it is believed that the number of operations required to minimize the cost function is in the most difficult cases exponential in the system size. However, even in an NP-complete problem the practically arising instances might, in fact, be easy to solve. The principal question we address in this article is: How to recognize if an NP-complete constraint satisfaction problem is typically hard and what are the main reasons for this? We adopt approaches from the statistical physics of disordered systems, in particular the cavity method developed originally to describe glassy systems. We describe new properties of the space of solutions in two of the most studied constraint satisfaction problems - random satisfiability and random graph coloring. We suggest a relation between the existence of the so-called frozen variables and the algorithmic hardness of a problem. Based on these insights, we introduce a new class of problems which we named "locked" constraint satisfaction, where the statistical description is easily solvable, but from the algorithmic point of view they are even more challenging than the canonical satisfiability.
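
    A minimal sketch of the kind of random constraint satisfaction problem discussed in these abstracts is given below: it generates random 3-SAT instances at several clause densities and checks satisfiability by brute force (practical only at tiny sizes; the sat/unsat threshold near clause density 4.27 is where hard instances concentrate).

```python
# Minimal sketch: generate a random 3-SAT instance at a given clause density
# and check satisfiability by brute force, illustrating the kind of random
# constraint satisfaction problem discussed (tiny sizes only).
import itertools
import random

def random_3sat(n_vars, alpha, rng):
    n_clauses = int(alpha * n_vars)
    clauses = []
    for _ in range(n_clauses):
        vars_ = rng.sample(range(n_vars), 3)
        clauses.append([(v, rng.random() < 0.5) for v in vars_])  # (var, negated)
    return clauses

def satisfiable(n_vars, clauses):
    for bits in itertools.product([False, True], repeat=n_vars):
        if all(any(bits[v] != neg for v, neg in clause) for clause in clauses):
            return True
    return False

rng = random.Random(6)
for alpha in (2.0, 4.3, 6.0):        # the sat/unsat threshold is near alpha ~ 4.27
    inst = random_3sat(12, alpha, rng)
    print(alpha, satisfiable(12, inst))
```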

  11. APPLIED STATISTICS – THE STATE AND THE PROSPECTS

    OpenAIRE

    Orlov A. I.

    2016-01-01

    Applied statistics is the science of how to analyze statistical data. As an independent scientific and practical area it is developing very quickly. It includes numerous widely and deeply developed scientific directions. Those who use applied statistics and other statistical methods are usually focused on specific areas of study, i.e., they are not specialists in applied statistics. Therefore, it is useful to make a critical analysis of the current state of applied statistics and discuss trends in the dev...

  12. Statistical inference for financial engineering

    CERN Document Server

    Taniguchi, Masanobu; Ogata, Hiroaki; Taniai, Hiroyuki

    2014-01-01

    This monograph provides the fundamentals of statistical inference for financial engineering and covers some selected methods suitable for analyzing financial time series data. In order to describe the actual financial data, various stochastic processes, e.g. non-Gaussian linear processes, non-linear processes, long-memory processes, locally stationary processes etc. are introduced and their optimal estimation is considered as well. This book also includes several statistical approaches, e.g., discriminant analysis, the empirical likelihood method, control variate method, quantile regression, realized volatility etc., which have been recently developed and are considered to be powerful tools for analyzing the financial data, establishing a new bridge between time series and financial engineering. This book is well suited as a professional reference book on finance, statistics and statistical financial engineering. Readers are expected to have an undergraduate-level knowledge of statistics.
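
    As an example of one of the tools mentioned, the sketch below computes realized volatility as the square root of the sum of squared intraday log-returns; the price path is simulated rather than real market data.

```python
# Minimal sketch: realized volatility from intraday log-returns, one of the
# statistical tools mentioned; the price path here is simulated, not real data.
import numpy as np

rng = np.random.default_rng(7)
log_prices = np.cumsum(0.001 * rng.standard_normal(390))  # one "trading day"
returns = np.diff(log_prices)
realized_vol = np.sqrt(np.sum(returns ** 2))   # square root of realized variance
print(realized_vol)
```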

  13. Statistics and analysis of scientific data

    CERN Document Server

    Bonamente, Massimiliano

    2017-01-01

    The revised second edition of this textbook provides the reader with a solid foundation in probability theory and statistics as applied to the physical sciences, engineering and related fields. It covers a broad range of numerical and analytical methods that are essential for the correct analysis of scientific data, including probability theory, distribution functions of statistics, fits to two-dimensional data and parameter estimation, Monte Carlo methods and Markov chains. Features new to this edition include: • a discussion of statistical techniques employed in business science, such as multiple regression analysis of multivariate datasets. • a new chapter on the various measures of the mean including logarithmic averages. • new chapters on systematic errors and intrinsic scatter, and on the fitting of data with bivariate errors. • a new case study and additional worked examples. • mathematical derivations and theoretical background material have been appropriately marked, to improve the readabili...
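
    One of the measures of the mean mentioned above, the logarithmic (geometric) average, is easy to compare with the arithmetic mean; a minimal sketch on arbitrary positive data follows.

```python
# Minimal sketch: arithmetic vs logarithmic (geometric) average of positive
# measurements, one of the measures of the mean covered in the text.
import numpy as np

x = np.array([1.2, 0.8, 3.5, 2.1, 0.9])
arithmetic_mean = x.mean()
log_mean = np.exp(np.mean(np.log(x)))    # geometric mean via the log average
print(arithmetic_mean, log_mean)
```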

  14. ELECTRICAL SUPPORT SYSTEM DESCRIPTION DOCUMENT

    Energy Technology Data Exchange (ETDEWEB)

    S. Roy

    2004-06-24

    The purpose of this revision of the System Design Description (SDD) is to establish requirements that drive the design of the electrical support system and their bases to allow the design effort to proceed to License Application. This SDD is a living document that will be revised at strategic points as the design matures over time. This SDD identifies the requirements and describes the system design as they exist at this time, with emphasis on those attributes of the design provided to meet the requirements. This SDD has been developed to be an engineering tool for design control. Accordingly, the primary audience/users are design engineers. This type of SDD both "leads" and "trails" the design process. It leads the design process with regard to the flow down of upper tier requirements onto the system. Knowledge of these requirements is essential in performing the design process. The SDD trails the design with regard to the description of the system. The description provided in the SDD is a reflection of the results of the design process to date. Functional and operational requirements applicable to electrical support systems are obtained from the "Project Functional and Operational Requirements" (F&OR) (Siddoway 2003). Other requirements to support the design process have been taken from higher-level requirements documents such as the "Project Design Criteria Document" (PDC) (Doraswamy 2004), and fire hazards analyses. The above-mentioned low-level documents address "Project Requirements Document" (PRD) (Canon and Leitner 2003) requirements. This SDD contains several appendices that include supporting information. Appendix B lists key system charts, diagrams, drawings, and lists, and Appendix C includes a list of system procedures.

  15. Binomial vs poisson statistics in radiation studies

    International Nuclear Information System (INIS)

    Foster, J.; Kouris, K.; Spyrou, N.M.; Matthews, I.P.; Welsh National School of Medicine, Cardiff

    1983-01-01

    The processes of radioactive decay, decay and growth of radioactive species in a radioactive chain, prompt emission(s) from nuclear reactions, conventional activation and cyclic activation are discussed with respect to their underlying statistical density function. By considering the transformation(s) that each nucleus may undergo it is shown that all these processes are fundamentally binomial. Formally, when the number of experiments N is large and the probability of success p is close to zero, the binomial is closely approximated by the Poisson density function. In radiation and nuclear physics, N is always large: each experiment can be conceived of as the observation of the fate of each of the N nuclei initially present. Whether p, the probability that a given nucleus undergoes a prescribed transformation, is close to zero depends on the process and nuclide(s) concerned. Hence, although a binomial description is always valid, the Poisson approximation is not always adequate. Therefore further clarification is provided as to when the binomial distribution must be used in the statistical treatment of detected events. (orig.)
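
    The sketch below illustrates the point numerically: for large N and small p the binomial probability mass function is close to the Poisson with mean Np, while for moderate p the approximation degrades.

```python
# Minimal sketch: the Poisson approximation to the binomial for large N and
# small p (as in radioactive decay counting), compared with a case where p is
# not small and the approximation degrades.
import numpy as np
from scipy.stats import binom, poisson

k = np.arange(0, 15)
for N, p in [(10_000, 5e-4), (20, 0.25)]:      # N*p = 5 in both cases
    b = binom.pmf(k, N, p)
    q = poisson.pmf(k, N * p)
    print(N, p, np.abs(b - q).max())           # max absolute pmf difference
```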

  16. 2014 ICSA/KISS Joint Applied Statistics Symposium

    CERN Document Server

    Liu, Mengling; Luo, Xiaolong

    2016-01-01

    The papers in this volume represent the most timely and advanced contributions to the 2014 Joint Applied Statistics Symposium of the International Chinese Statistical Association (ICSA) and the Korean International Statistical Society (KISS), held in Portland, Oregon. The contributions cover new developments in statistical modeling and clinical research, including model development, model checking, and innovative clinical trial design and analysis. Each paper was peer-reviewed by at least two referees and also by an editor. The conference was attended by over 400 participants from academia, industry, and government agencies around the world, including from North America, Asia, and Europe. It offered 3 keynote speeches, 7 short courses, 76 parallel scientific sessions, student paper sessions, and social events. The most timely and advanced contributions from the joint 2014 ICSA/KISS Applied Statistics Symposium. All papers feature original, peer-reviewed content. Coverage consists of new developments in statisti...

  17. Fuzzy statistical decision-making theory and applications

    CERN Document Server

    Kabak, Özgür

    2016-01-01

    This book offers a comprehensive reference guide to fuzzy statistics and fuzzy decision-making techniques. It provides readers with all the necessary tools for making statistical inference in the case of incomplete information or insufficient data, where classical statistics cannot be applied. The respective chapters, written by prominent researchers, explain a wealth of both basic and advanced concepts including: fuzzy probability distributions, fuzzy frequency distributions, fuzzy Bayesian inference, fuzzy mean, mode and median, fuzzy dispersion, fuzzy p-value, and many others. To foster a better understanding, all the chapters include relevant numerical examples or case studies. Taken together, they form an excellent reference guide for researchers, lecturers and postgraduate students pursuing research on fuzzy statistics. Moreover, by extending all the main aspects of classical statistical decision-making to its fuzzy counterpart, the book presents a dynamic snapshot of the field that is expected to stimu...

  18. Statistics available for site studies in registers and surveys at Statistics Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Haldorson, Marie [Statistics Sweden, Oerebro (Sweden)

    2000-03-01

    Statistics Sweden (SCB) has produced this report on behalf of the Swedish Nuclear Fuel and Waste Management Company (SKB), as part of the data to be used by SKB in conducting studies of potential sites. The report goes over the statistics obtainable from SCB in the form of registers and surveys. The purpose is to identify the variables that are available, and to specify their degree of geographical detail and the time series that are available. Chapter two describes the statistical registers available at SCB, registers that share the common feature that they provide total coverage, i.e. they contain all 'objects' of a given type, such as population, economic activities (e.g. from statements of employees' earnings provided to the tax authorities), vehicles, enterprises or real estate. SCB has exclusive responsibility for seven of the nine registers included in the chapter, while two registers are ordered by public authorities with statistical responsibilities. Chapter three describes statistical surveys that are conducted by SCB, with the exception of the National Forest Inventory, which is carried out by the Swedish University of Agricultural Sciences. In terms of geographical breakdown, the degree of detail in the surveys varies, but all provide some possibility of reporting data at lower than the national level. The level involved may be county, municipality, yield district, coastal district or category of enterprises, e.g. aquaculture. Six of the nine surveys included in the chapter have been ordered by public authorities with statistical responsibilities, while SCB has exclusive responsibility for the others. Chapter four presents an overview of the statistics on land use maintained by SCB. This chapter does not follow the same pattern as chapters two and three but instead gives a more general account. The conclusion can be drawn that there are good prospects that SKB can make use of SCB's data as background information or in other ways when

  19. Statistics available for site studies in registers and surveys at Statistics Sweden

    International Nuclear Information System (INIS)

    Haldorson, Marie

    2000-03-01

    Statistics Sweden (SCB) has produced this report on behalf of the Swedish Nuclear Fuel and Waste Management Company (SKB), as part of the data to be used by SKB in conducting studies of potential sites. The report goes over the statistics obtainable from SCB in the form of registers and surveys. The purpose is to identify the variables that are available, and to specify their degree of geographical detail and the time series that are available. Chapter two describes the statistical registers available at SCB, registers that share the common feature that they provide total coverage, i.e. they contain all 'objects' of a given type, such as population, economic activities (e.g. from statements of employees' earnings provided to the tax authorities), vehicles, enterprises or real estate. SCB has exclusive responsibility for seven of the nine registers included in the chapter, while two registers are ordered by public authorities with statistical responsibilities. Chapter three describes statistical surveys that are conducted by SCB, with the exception of the National Forest Inventory, which is carried out by the Swedish University of Agricultural Sciences. In terms of geographical breakdown, the degree of detail in the surveys varies, but all provide some possibility of reporting data at lower than the national level. The level involved may be county, municipality, yield district, coastal district or category of enterprises, e.g. aquaculture. Six of the nine surveys included in the chapter have been ordered by public authorities with statistical responsibilities, while SCB has exclusive responsibility for the others. Chapter four presents an overview of the statistics on land use maintained by SCB. This chapter does not follow the same pattern as chapters two and three but instead gives a more general account. The conclusion can be drawn that there are good prospects that SKB can make use of SCB's data as background information or in other ways when undertaking future

  20. Statistics available for site studies in registers and surveys at Statistics Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Haldorson, Marie [Statistics Sweden, Oerebro (Sweden)

    2000-03-01

    Statistics Sweden (SCB) has produced this report on behalf of the Swedish Nuclear Fuel and Waste Management Company (SKB), as part of the data to be used by SKB in conducting studies of potential sites. The report goes over the statistics obtainable from SCB in the form of registers and surveys. The purpose is to identify the variables that are available, and to specify their degree of geographical detail and the time series that are available. Chapter two describes the statistical registers available at SCB, registers that share the common feature that they provide total coverage, i.e. they contain all 'objects' of a given type, such as population, economic activities (e.g. from statements of employees' earnings provided to the tax authorities), vehicles, enterprises or real estate. SCB has exclusive responsibility for seven of the nine registers included in the chapter, while two registers are ordered by public authorities with statistical responsibilities. Chapter three describes statistical surveys that are conducted by SCB, with the exception of the National Forest Inventory, which is carried out by the Swedish University of Agricultural Sciences. In terms of geographical breakdown, the degree of detail in the surveys varies, but all provide some possibility of reporting data at lower than the national level. The level involved may be county, municipality, yield district, coastal district or category of enterprises, e.g. aquaculture. Six of the nine surveys included in the chapter have been ordered by public authorities with statistical responsibilities, while SCB has exclusive responsibility for the others. Chapter four presents an overview of the statistics on land use maintained by SCB. This chapter does not follow the same pattern as chapters two and three but instead gives a more general account. The conclusion can be drawn that there are good prospects that SKB can make use of SCB's data as background information or in other ways when undertaking future