WorldWideScience

Sample records for alternative statistical method

  1. Alternative statistical methods for cytogenetic radiation biological dosimetry

    CERN Document Server

    Fornalski, Krzysztof Wojciech

    2014-01-01

    The paper presents alternative statistical methods for biological dosimetry, such as Bayesian and Monte Carlo methods. The classical Gaussian and robust Bayesian fit algorithms for the linear, linear-quadratic, saturated and critical calibration curves are described, and a Bayesian model-selection algorithm for those curves is also presented. In addition, five methods of dose estimation for a mixed neutron and gamma irradiation field are described: two classical methods, two Bayesian methods and one Monte Carlo method. The Bayesian methods are also enhanced and generalized to situations with many types of mixed radiation. All algorithms are presented in an easy-to-use form that can be implemented in any programming language. The presented algorithms are universal, although they were originally dedicated to the cytogenetic biological dosimetry of victims of a nuclear reactor accident.
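
    As a concrete illustration of the kind of calibration-curve fit this record describes, the following minimal Python sketch fits a linear-quadratic curve Y(D) = c + aD + bD^2 to hypothetical dicentric yields by ordinary least squares and inverts it to estimate dose. It stands in for the classical Gaussian fit only; it is not Fornalski's robust Bayesian algorithm, and the data are invented.

      # Fit Y(D) = c + a*D + b*D^2 to hypothetical dicentric-aberration yields.
      import numpy as np

      doses = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 4.0])          # Gy (hypothetical)
      yields = np.array([0.001, 0.02, 0.07, 0.25, 0.55, 0.95])  # dicentrics/cell (hypothetical)

      b, a, c = np.polyfit(doses, yields, deg=2)  # polyfit returns the highest degree first
      print(f"Y(D) = {c:.4f} + {a:.4f}*D + {b:.4f}*D^2")

      # Dosimetry step: invert the curve to estimate dose from an observed yield.
      y_obs = 0.30
      d_est = (-a + np.sqrt(a**2 + 4 * b * (y_obs - c))) / (2 * b)
      print(f"estimated dose for yield {y_obs}: {d_est:.2f} Gy")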

  2. An Alternating Iterative Method and Its Application in Statistical Inference

    Institute of Scientific and Technical Information of China (English)

    Ning Zhong SHI; Guo Rong HU; Qing CUI

    2008-01-01

    This paper studies non-convex programming problems. It is known that, in statistical inference, many constrained estimation problems may be expressed as convex programming problems. However, in many practical problems, the objective functions are not convex. In this paper, we give a definition of a semi-convex objective function and discuss the corresponding non-convex programming problems. A two-step iterative algorithm called the alternating iterative method is proposed for finding solutions to such problems. The method is illustrated by three examples in constrained estimation problems given in Sasabuchi et al. (Biometrika, 72, 465–472 (1983)), Shi N. Z. (J. Multivariate Anal., 50, 282–293 (1994)) and El Barmi H. and Dykstra R. (Ann. Statist., 26, 1878–1893 (1998)).
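
    The generic pattern behind an alternating iterative method is to minimize over one block of variables with the other held fixed, then alternate. A minimal sketch of that pattern on an illustrative smooth objective (not one of the paper's semi-convex constrained estimation problems) follows.

      # Two-step alternating minimization on f(x, y) = x^2 + y^2 - x*y - x.
      # Each step solves the one-dimensional subproblem in closed form.
      def alternating_minimize(step_x, step_y, x, y, tol=1e-12, max_iter=1000):
          for _ in range(max_iter):
              x_new = step_x(y)            # argmin over x with y fixed
              y_new = step_y(x_new)        # argmin over y with x fixed
              done = abs(x_new - x) < tol and abs(y_new - y) < tol
              x, y = x_new, y_new
              if done:
                  break
          return x, y

      step_x = lambda y: (y + 1) / 2       # from df/dx = 2x - y - 1 = 0
      step_y = lambda x: x / 2             # from df/dy = 2y - x = 0
      print(alternating_minimize(step_x, step_y, 0.0, 0.0))  # converges to (2/3, 1/3)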

  3. Statistical methods

    CERN Document Server

    Szulc, Stefan

    1965-01-01

    Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field. Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then …

  4. Statistical methods

    CERN Document Server

    Freund, Rudolf J; Wilson, William J

    2010-01-01

    Statistical Methods, 3e provides students with a working introduction to statistical methods, offering a wide range of applications that emphasize the quantitative skills useful across many academic disciplines. This text takes a classic approach, emphasizing concepts and techniques for working out problems and interpreting results. The book includes research projects, real-world case studies, and numerous examples and data exercises organized by level of difficulty. This text requires that a student be familiar with algebra. New to this edition: NEW expansion of exercises …

  5. A statistical method for the detection of alternative splicing using RNA-seq.

    Directory of Open Access Journals (Sweden)

    Liguo Wang

    Full Text Available Deep sequencing of the transcriptome (RNA-seq) provides an unprecedented opportunity to interrogate plausible mRNA splicing patterns by mapping RNA-seq reads to exon junctions (hereafter, junction reads). In most previous studies, exon junctions were detected by using the quantitative information of junction reads. The quantitative criterion (e.g. a minimum of two junction reads), although straightforward and widely used, usually results in high false positive and false negative rates, owing to the complexity of the transcriptome. Here, we introduce a new metric, namely Minimal Match on Either Side of exon junction (MMES), to measure the quality of each junction read, and subsequently implement an empirical statistical model to detect exon junctions. When applied to a large dataset (>200M reads) consisting of mouse brain, liver and muscle mRNA sequences, and using independent transcript databases as a positive control, our method proved to be considerably more accurate than previous ones, especially for detecting junctions originating from low-abundance transcripts. Our results were also confirmed by real-time RT-PCR assay. The MMES metric can be used either in this empirical statistical model or in other, more sophisticated classifiers such as logistic regression.
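
    The MMES metric itself is simple: a junction read is scored by the smaller of the two aligned segments flanking the junction, so reads that barely overhang one side score low. A toy rendering of that definition (alignment reduced to a known split position; the paper's actual computation works on genome alignments) is sketched below.

      # MMES: minimal number of read bases matched on either side of the junction.
      def mmes(read_length: int, left_bases: int) -> int:
          """left_bases = bases aligned on the 5' side of the junction."""
          return min(left_bases, read_length - left_bases)

      print(mmes(36, 4))   # 4  -> weak evidence, easy to place by chance
      print(mmes(36, 18))  # 18 -> strongest possible evidence for a 36-bp read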

  6. Evaluation of an Alternative Statistical Method for Analysis of RCRA Groundwater Monitoring Data at the Hanford Site

    Energy Technology Data Exchange (ETDEWEB)

    Chou, Charissa J.

    2004-06-24

    Statistical methods are required in groundwater monitoring programs to determine if a RCRA-regulated unit affects groundwater quality beneath a site. This report presents the results of the statistical analysis of groundwater monitoring data acquired at B Pond and the 300 Area process trenches during a 2-year trial test period.

  7. Rapid Statistical Methods: Part 1.

    Science.gov (United States)

    Lyon, A. J.

    1980-01-01

    Discusses some rapid statistical methods which are intended for use by physics teachers. Part one of this article gives some of the simplest and most commonly useful rapid methods. Part two gives references to the relevant theory together with some alternative and additional methods. (HM)

  8. The analysis of covariance and alternatives: statistical methods for experiments, quasi-experiments, and single-case studies

    CERN Document Server

    Huitema, Bradley

    2011-01-01

    A complete guide to cutting-edge techniques and best practices for applying covariance analysis methods The Second Edition of Analysis of Covariance and Alternatives sheds new light on its topic, offering in-depth discussions of underlying assumptions, comprehensive interpretations of results, and comparisons of distinct approaches. The book has been extensively revised and updated to feature an in-depth review of prerequisites and the latest developments in the field. The author begins with a discussion of essential topics relating to experimental design and analysis

  9. Statistical methods in astronomy

    OpenAIRE

    Long, James P.; de Souza, Rafael S.

    2017-01-01

    We present a review of data types and statistical methods often encountered in astronomy. The aim is to provide an introduction to statistical applications in astronomy for statisticians and computer scientists. We highlight the complex, often hierarchical, nature of many astronomy inference problems and advocate for cross-disciplinary collaborations to address these challenges.

  10. Statistical Methods for Astronomy

    CERN Document Server

    Feigelson, Eric D

    2012-01-01

    This review outlines concepts of mathematical statistics, elements of probability theory, hypothesis tests and point estimation for use in the analysis of modern astronomical data. Least squares, maximum likelihood, and Bayesian approaches to statistical inference are treated. Resampling methods, particularly the bootstrap, provide valuable procedures when distribution functions of statistics are not known. Several approaches to model selection and goodness of fit are considered. Applied statistics relevant to astronomical research are briefly discussed: nonparametric methods for use when little is known about the behavior of the astronomical populations or processes; data smoothing with kernel density estimation and nonparametric regression; unsupervised clustering and supervised classification procedures for multivariate problems; survival analysis for astronomical datasets with nondetections; time- and frequency-domain time series analysis for light curves; and spatial statistics to interpret the spatial …
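
    As a quick illustration of the resampling idea the review highlights, the sketch below bootstraps a confidence interval for a median, a statistic whose distribution function is awkward to derive analytically. The data are simulated, not astronomical.

      # Bootstrap 95% confidence interval for the median of a skewed sample.
      import numpy as np

      rng = np.random.default_rng(11)
      sample = rng.lognormal(mean=0.0, sigma=1.0, size=200)
      medians = np.array([np.median(rng.choice(sample, size=sample.size, replace=True))
                          for _ in range(5000)])
      lo, hi = np.percentile(medians, [2.5, 97.5])
      print(f"median = {np.median(sample):.3f}, 95% bootstrap CI = ({lo:.3f}, {hi:.3f})")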

  11. Bayesian Information Criterion as an Alternative way of Statistical Inference

    Directory of Open Access Journals (Sweden)

    Nadejda Yu. Gubanova

    2012-05-01

    Full Text Available The article treats the Bayesian information criterion as an alternative to traditional methods of statistical inference based on NHST. A comparison of ANOVA and BIC results for a psychological experiment is discussed.
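
    For readers unfamiliar with the criterion: BIC = k*ln(n) - 2*ln(L_hat), where k counts free parameters and L_hat is the maximized likelihood, and the model with the smaller value is preferred. The sketch below compares a one-mean and a two-mean Gaussian model on simulated data; it illustrates the criterion only and does not reproduce the article's ANOVA-vs-BIC comparison.

      # BIC comparison of two Gaussian models on simulated two-group data.
      import numpy as np

      rng = np.random.default_rng(1)
      group_a = rng.normal(0.0, 1.0, 50)
      group_b = rng.normal(0.4, 1.0, 50)
      data = np.concatenate([group_a, group_b])
      n = data.size

      def gaussian_loglik(x, mu, sigma):
          return np.sum(-0.5 * np.log(2 * np.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2))

      # Model 0: one common mean (k = 2 parameters: mu, sigma)
      bic0 = 2 * np.log(n) - 2 * gaussian_loglik(data, data.mean(), data.std())

      # Model 1: separate group means (k = 3 parameters: mu_a, mu_b, sigma)
      resid = np.concatenate([group_a - group_a.mean(), group_b - group_b.mean()])
      ll1 = (gaussian_loglik(group_a, group_a.mean(), resid.std())
             + gaussian_loglik(group_b, group_b.mean(), resid.std()))
      bic1 = 3 * np.log(n) - 2 * ll1

      print(f"BIC(common mean) = {bic0:.1f}, BIC(two means) = {bic1:.1f}")
      # The model with the smaller BIC is preferred.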

  12. Bayesian Information Criterion as an Alternative way of Statistical Inference

    OpenAIRE

    Nadejda Yu. Gubanova; Simon Zh. Simavoryan

    2012-01-01

    The article treats the Bayesian information criterion as an alternative to traditional methods of statistical inference based on NHST. A comparison of ANOVA and BIC results for a psychological experiment is discussed.

  13. STATISTICAL METHODS IN HISTORY

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2016-01-01

    Full Text Available We give a critical analysis of statistical models and methods for processing textual information in historical records to establish the dates of certain events, i.e., to build a science-based chronology. There are three main sources of knowledge of ancient history: ancient texts, the remains of material culture, and traditions. The specific dates of objects excavated by archaeologists in most cases cannot be determined. The group of Academician A.T. Fomenko has developed and applied new statistical methods for the analysis of historical texts (chronicles), based on the intensive use of computer technology. Its two major results were: the majority of the historical records we now know are duplicated (in particular, chronicles describing the so-called "Ancient Rome" and "Middle Ages" talk about the same events); and the known historical chronicles describe real events separated from the present by no more than 1000 years. It was found that chronicles describing the history of "ancient times" and the "Middle Ages", the chronicles of Chinese history, and the histories of various European countries talk not about different events but about the same ones. Attempts have been made at a new dating of historical events and a restoration of the true history of human society based on these data. From the standpoint of statistical methods, historical records and images of their fragments are special cases of objects of non-numerical nature; the computer-statistical methods developed by the group of A.T. Fomenko are therefore part of non-numerical statistics. We consider some methods of statistical analysis of chronicles applied by the group of A.T. Fomenko: the correlation method of maxima; the dynasties method; the frequency-attenuation method; and the method of questionnaire codes. The new chronology allows us to understand much of the battle of ideas in modern science and mass consciousness; it makes clear the root cause of the cautious …

  14. Nonparametric statistical methods

    CERN Document Server

    Hollander, Myles; Chicken, Eric

    2013-01-01

    Praise for the Second Edition: "This book should be an essential part of the personal library of every practicing statistician." (Technometrics) Thoroughly revised and updated, the new edition of Nonparametric Statistical Methods includes additional modern topics and procedures, more practical data sets, and new problems from real-life situations. The book continues to emphasize the importance of nonparametric methods as a significant branch of modern statistics and equips readers with the conceptual and technical skills necessary to select and apply the appropriate procedures for any given …

  15. Statistical methods for forecasting

    CERN Document Server

    Abraham, Bovas

    2009-01-01

    The Wiley-Interscience Paperback Series consists of selected books that have been made more accessible to consumers in an effort to increase global appeal and general circulation. With these new unabridged softcover volumes, Wiley hopes to extend the lives of these works by making them available to future generations of statisticians, mathematicians, and scientists. "This book, it must be said, lives up to the words on its advertising cover: 'Bridging the gap between introductory, descriptive approaches and highly advanced theoretical treatises, it provides a practical, intermediate level discussion of a variety of forecasting tools, and explains how they relate to one another, both in theory and practice.' It does just that!" (Journal of the Royal Statistical Society) "A well-written work that deals with statistical methods and models that can be used to produce short-term forecasts, this book has wide-ranging applications. It could be used in the context of a study of regression, forecasting, and time series …"

  16. Introducing linear functions: an alternative statistical approach

    Science.gov (United States)

    Nolan, Caroline; Herbert, Sandra

    2015-12-01

    The introduction of linear functions is the turning point where many students decide if mathematics is useful or not. This means the role of parameters and variables in linear functions could be considered to be 'threshold concepts'. There is recognition that linear functions can be taught in context through the exploration of linear modelling examples, but this has its limitations. Currently, statistical data are easily attainable, and graphics or computer algebra system (CAS) calculators are common in many classrooms. The use of this technology provides ease of access to different representations of linear functions as well as the ability to fit a least-squares line to real-life data. This means these calculators could support a possible alternative approach to the introduction of linear functions. This study compares the results of an end-of-topic test for two classes of Australian middle secondary students at a regional school to determine if such an alternative approach is feasible. In this study, test questions were grouped by concept and subjected to a concept-by-concept analysis of the means of the test results of the two classes. This analysis revealed that the students following the alternative approach demonstrated greater competence with non-standard questions.
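
    The computational core of the alternative approach, fitting a least-squares line to real data, is a one-liner in most environments; a small Python sketch with invented height and shoe-size data follows (the study itself used CAS calculators, not Python).

      # Least-squares line y = m*x + c through (hypothetical) real-life data.
      import numpy as np

      height_cm = np.array([150, 155, 160, 165, 170, 175])   # hypothetical measurements
      shoe_size = np.array([36, 37, 38, 40, 41, 43])
      m, c = np.polyfit(height_cm, shoe_size, 1)             # slope and intercept
      print(f"shoe_size ~ {m:.3f} * height_cm + {c:.2f}")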

  17. Register-based statistics statistical methods for administrative data

    CERN Document Server

    Wallgren, Anders

    2014-01-01

    This book provides a comprehensive and up-to-date treatment of theory and practical implementation in register-based statistics. It begins by defining the area, before explaining how to structure such systems, as well as detailing alternative approaches. It explains how to create statistical registers, how to implement quality assurance, and the use of IT systems for register-based statistics. Further to this, clear details are given about the practicalities of implementing such statistical methods, such as protection of privacy and the coordination and coherence of such an undertaking. …

  18. Statistical methods in nonlinear dynamics

    Indian Academy of Sciences (India)

    K P N Murthy; R Harish; S V M Satyanarayana

    2005-03-01

    Sensitivity to initial conditions in nonlinear dynamical systems leads to exponential divergence of trajectories that are initially arbitrarily close, and hence to unpredictability. Statistical methods have been found to be helpful in extracting useful information about such systems. In this paper, we review briefly some statistical methods employed in the study of deterministic and stochastic dynamical systems. These include power spectral analysis and aliasing, extreme value statistics and order statistics, recurrence time statistics, the characterization of intermittency in the Sinai disorder problem, random walk analysis of diffusion in the chaotic pendulum, and long-range correlations in stochastic sequences of symbols.
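
    Of the methods listed, power spectral analysis is the simplest to sketch: estimate the spectrum of a sampled signal with the FFT and locate the dominant frequency. The signal below is simulated, and the review's treatment of aliasing is not reproduced.

      # Periodogram of a noisy 3 Hz sinusoid; recover the dominant frequency.
      import numpy as np

      fs = 100.0                                  # sampling rate, Hz
      t = np.arange(0, 10, 1 / fs)
      x = np.sin(2 * np.pi * 3.0 * t) + np.random.default_rng(5).normal(0, 0.5, t.size)

      power = np.abs(np.fft.rfft(x)) ** 2         # one-sided power spectrum
      freqs = np.fft.rfftfreq(t.size, d=1 / fs)
      print(f"dominant frequency: {freqs[np.argmax(power[1:]) + 1]:.2f} Hz")  # ~3 Hz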

  19. Methods of statistical model estimation

    CERN Document Server

    Hilbe, Joseph

    2013-01-01

    Methods of Statistical Model Estimation examines the most important and popular methods used to estimate parameters for statistical models and provide informative model summary statistics. Designed for R users, the book is also ideal for anyone wanting to better understand the algorithms used for statistical model fitting. The text presents algorithms for the estimation of a variety of regression procedures using maximum likelihood estimation, iteratively reweighted least squares regression, the EM algorithm, and MCMC sampling. Fully developed, working R code is constructed for each method. …

  20. Statistical methods for ranking data

    CERN Document Server

    Alvo, Mayer

    2014-01-01

    This book introduces advanced undergraduate and graduate students and practitioners to statistical methods for ranking data. An important aspect of nonparametric statistics is oriented towards the use of ranking data. Rank correlation is defined through the notion of distance functions, and the notion of compatibility is introduced to deal with incomplete data. Ranking data are also modeled using a variety of modern tools such as CART, MCMC, the EM algorithm and factor analysis. This book deals with statistical methods used for analyzing such data and provides a novel and unifying approach for hypothesis testing. The techniques described in the book are illustrated with examples, and the statistical software is provided on the authors’ website.

  21. Statistical Methods in Integrative Genomics

    OpenAIRE

    Richardson, Sylvia; Tseng, George C.; Sun, Wei

    2016-01-01

    Statistical methods in integrative genomics aim to answer important biology questions by jointly analyzing multiple types of genomic data (vertical integration) or aggregating the same type of data across multiple studies (horizontal integration). In this article, we introduce different types of genomic data and data resources, and then review statistical methods of integrative genomics, with emphasis on the motivation and rationale of these methods. We conclude with some summary points and f...

  22. Bayesian Methods for Statistical Analysis

    OpenAIRE

    Puza, Borek

    2015-01-01

    Bayesian methods for statistical analysis is a book on statistical methods for analysing a wide variety of data. The book consists of 12 chapters, starting with basic concepts and covering numerous topics, including Bayesian estimation, decision theory, prediction, hypothesis testing, hierarchical models, Markov chain Monte Carlo methods, finite population inference, biased sampling and nonignorable nonresponse. The book contains many exercises, all with worked solutions, including complete c...

  23. Nonparametric statistical methods using R

    CERN Document Server

    Kloke, John

    2014-01-01

    A Practical Guide to Implementing Nonparametric and Rank-Based Procedures. Nonparametric Statistical Methods Using R covers traditional nonparametric methods and rank-based analyses, including estimation and inference for models ranging from simple location models to general linear and nonlinear models for uncorrelated and correlated responses. The authors emphasize applications and statistical computation. They illustrate the methods with many real and simulated data examples using R, including the packages Rfit and npsm. The book first gives an overview of the R language and basic statistical …

  24. Statistical Methods in Psychology Journals.

    Science.gov (United States)

    Wilkinson, Leland

    1999-01-01

    Proposes guidelines for revising the American Psychological Association (APA) publication manual or other APA materials to clarify the application of statistics in research reports. The guidelines are intended to induce authors and editors to recognize the thoughtless application of statistical methods. Contains 54 references. (SLD)

  25. Technical Note: Higher-order statistical moments and a procedure that detects potentially anomalous years as two alternative methods describing alterations in continuous environmental data

    Science.gov (United States)

    Arismendi, I.; Johnson, S. L.; Dunham, J. B.

    2015-03-01

    Statistics of central tendency and dispersion may not capture relevant or desired characteristics of the distribution of continuous phenomena and, thus, they may not adequately describe temporal patterns of change. Here, we present two methodological approaches that can help to identify temporal changes in environmental regimes. First, we use higher-order statistical moments (skewness and kurtosis) to examine potential changes of empirical distributions at decadal extents. Second, we adapt a statistical procedure combining a non-metric multidimensional scaling technique and higher density region plots to detect potentially anomalous years. We illustrate the use of these approaches by examining long-term stream temperature data from minimally and highly human-influenced streams. In particular, we contrast predictions about thermal regime responses to changing climates and human-related water uses. Using these methods, we effectively diagnose years with unusual thermal variability and patterns in variability through time, as well as spatial variability linked to regional and local factors that influence stream temperature. Our findings highlight the complexity of responses of thermal regimes of streams and reveal their differential vulnerability to climate warming and human-related water uses. The two approaches presented here can be applied with a variety of other continuous phenomena to address historical changes, extreme events, and their associated ecological responses.
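
    The first approach is easy to demonstrate: compute the sample skewness and kurtosis of a series decade by decade and compare. A minimal Python sketch on simulated daily stream temperatures follows (invented data; scipy's kurtosis is the excess kurtosis by default).

      # Compare higher-order moments of a daily temperature series across decades.
      import numpy as np
      from scipy.stats import skew, kurtosis

      rng = np.random.default_rng(0)
      # Simulated daily means: the later decade is slightly warmer and more variable.
      series = {y: rng.normal(10 + 0.05 * (y - 1980), 1 + 0.02 * (y - 1980), 365)
                for y in range(1980, 2000)}

      for start in (1980, 1990):
          decade = np.concatenate([series[y] for y in range(start, start + 10)])
          print(f"{start}s: skew={skew(decade):+.3f}  excess kurtosis={kurtosis(decade):+.3f}")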

  26. Statistical methods in language processing.

    Science.gov (United States)

    Abney, Steven

    2011-05-01

    The term statistical methods here refers to a methodology that has been dominant in computational linguistics since about 1990. It is characterized by the use of stochastic models, substantial data sets, machine learning, and rigorous experimental evaluation. The shift to statistical methods in computational linguistics parallels a movement in artificial intelligence more broadly. Statistical methods have so thoroughly permeated computational linguistics that almost all work in the field draws on them in some way. There has, however, been little penetration of the methods into general linguistics. The methods themselves are largely borrowed from machine learning and information theory. We limit attention to that which has direct applicability to language processing, though the methods are quite general and have many nonlinguistic applications. Not every use of statistics in language processing falls under statistical methods as we use the term. Standard hypothesis testing and experimental design, for example, are not covered in this article. WIREs Cogn Sci 2011 2 315–322. DOI: 10.1002/wcs.111

  27. Statistical methods for physical science

    CERN Document Server

    Stanford, John L

    1994-01-01

    This volume of Methods of Experimental Physics provides an extensive introduction to probability and statistics in many areas of the physical sciences, with an emphasis on the emerging area of spatial statistics. The scope of topics covered is wide-ranging: the text discusses a variety of the most commonly used classical methods and addresses newer methods that are applicable or potentially important. The chapter authors motivate readers with their insightful discussions, augmenting their material with Key Features: examines basic probability, including coverage of standard distributions, time series …

  28. Statistical Methods for Evolutionary Trees

    OpenAIRE

    Edwards, A. W. F.

    2009-01-01

    In 1963 and 1964, L. L. Cavalli-Sforza and A. W. F. Edwards introduced novel methods for computing evolutionary trees from genetical data, initially for human populations from blood-group gene frequencies. The most important development was their introduction of statistical methods of estimation applied to stochastic models of evolution.

  29. Statistical methods for evolutionary trees.

    Science.gov (United States)

    Edwards, A W F

    2009-09-01

    In 1963 and 1964, L. L. Cavalli-Sforza and A. W. F. Edwards introduced novel methods for computing evolutionary trees from genetical data, initially for human populations from blood-group gene frequencies. The most important development was their introduction of statistical methods of estimation applied to stochastic models of evolution.

  30. Higher statistical moments and an outlier detection technique as two alternative methods that capture long-term changes in continuous environmental data

    Directory of Open Access Journals (Sweden)

    I. Arismendi

    2014-05-01

    Full Text Available Central tendency statistics may not capture relevant or desired characteristics about the variability of continuous phenomena and thus, they may not completely track temporal patterns of change. Here, we present two methodological approaches to identify long-term changes in environmental regimes. First, we use higher statistical moments (skewness and kurtosis) to examine potential changes of empirical distributions at decadal scale. Second, we adapt an outlier detection procedure combining a non-metric multidimensional scaling technique and higher density region plots to detect anomalous years. We illustrate the use of these approaches by examining long-term stream temperature data from minimally and highly human-influenced streams. In particular, we contrast predictions about thermal regime responses to changing climates and human-related water uses. Using these methods, we effectively diagnose years with unusual thermal variability, patterns in variability through time, and spatial variability linked to regional and local factors that influence stream temperature. Our findings highlight the complexity of responses of thermal regimes of streams and reveal a differentiated vulnerability to both climate warming and human-related water uses. The two approaches presented here can be applied with a variety of other continuous phenomena to address historical changes, extreme events, and their associated ecological responses.

  31. Beyond Statistical Methods – Compendium of Statistical Methods for Researchers

    Directory of Open Access Journals (Sweden)

    Ondřej Vozár

    2014-12-01

    Full Text Available Book Review: HENDL, J. Přehled statistických metod: Analýza a metaanalýza dat (Overview of Statistical Methods: Data Analysis and Meta-analysis). 4th extended edition. Prague: Portál, 2012. ISBN 978-80-262-0200-4.

  32. Robust statistical methods with R

    CERN Document Server

    Jureckova, Jana

    2005-01-01

    Robust statistical methods were developed to supplement the classical procedures when the data violate classical assumptions. They are ideally suited to applied research across a broad spectrum of study, yet most books on the subject are narrowly focused, overly theoretical, or simply outdated. Robust Statistical Methods with R provides a systematic treatment of robust procedures with an emphasis on practical application. The authors work from underlying mathematical tools to implementation, paying special attention to the computational aspects. They cover the whole range of robust methods, including differentiable statistical functions, distance measures, influence functions, and asymptotic distributions, in a rigorous yet approachable manner. Highlighting hands-on problem solving, many examples and computational algorithms using the R software supplement the discussion. The book examines the characteristics of robustness, estimators of real parameters, large sample properties, and goodness-of-fit tests. It …

  33. Statistical methods for bioimpedance analysis

    Directory of Open Access Journals (Sweden)

    Christian Tronstad

    2014-04-01

    Full Text Available This paper gives a basic overview of relevant statistical methods for the analysis of bioimpedance measurements, with an aim to answer questions such as: How do I begin with planning an experiment? How many measurements do I need to take? How do I deal with large amounts of frequency sweep data? Which statistical test should I use, and how do I validate my results? Beginning with the hypothesis and the research design, the methodological framework for making inferences based on measurements and statistical analysis is explained. This is followed by a brief discussion on correlated measurements and data reduction, before an overview is given of statistical methods for comparison of groups, factor analysis, association, regression and prediction, explained in the context of bioimpedance research. The last chapter is dedicated to the validation of a new method by different measures of performance. A flowchart is presented for the selection of a statistical method, and a table is given for an overview of the most important terms of performance when evaluating new measurement technology.

  34. Statistical Methods for Fuzzy Data

    CERN Document Server

    Viertl, Reinhard

    2011-01-01

    Statistical data are not always precise numbers, or vectors, or categories. Real data are frequently what is called fuzzy. Examples where this fuzziness is obvious are quality of life data, environmental, biological, medical, sociological and economics data. Also the results of measurements can be best described by using fuzzy numbers and fuzzy vectors respectively. Statistical analysis methods have to be adapted for the analysis of fuzzy data. In this book, the foundations of the description of fuzzy data are explained, including methods on how to obtain the characterizing function of fuzzy …

  35. Statistical Cost Estimation in Higher Education: Some Alternatives.

    Science.gov (United States)

    Brinkman, Paul T.; Niwa, Shelley

    Recent developments in econometrics that are relevant to the task of estimating costs in higher education are reviewed. The relative effectiveness of alternative statistical procedures for estimating costs are also tested. Statistical cost estimation involves three basic parts: a model, a data set, and an estimation procedure. Actual data are used…

  36. Statistical methods in spatial genetics

    DEFF Research Database (Denmark)

    Guillot, Gilles; Leblois, Raphael; Coulon, Aurelie

    2009-01-01

    The joint analysis of spatial and genetic data is rapidly becoming the norm in population genetics. More and more studies explicitly describe and quantify the spatial organization of genetic variation and try to relate it to underlying ecological processes. As it has become increasingly difficult to keep abreast of the latest methodological developments, we review the statistical toolbox available to analyse population genetic data in a spatially explicit framework. We mostly focus on statistical concepts but also discuss practical aspects of the analytical methods, highlighting not only …

  37. Assessment of alternatives to correct inventory difference statistical treatment deficiencies

    Energy Technology Data Exchange (ETDEWEB)

    Byers, K.R.; Johnston, J.W.; Bennett, C.A.; Brouns, R.J.; Mullen, M.F.; Roberts, F.P.

    1983-11-01

    This document presents an analysis of alternatives to correct deficiencies in the statistical treatment of inventory differences in the NRC guidance documents and licensee practice. Pacific Northwest Laboratory's objective for this study was to assess alternatives developed by the NRC and a panel of safeguards statistical experts. Criteria were developed for the evaluation, and the assessment was made against those criteria. The results of this assessment are PNL's recommendations, which are intended to provide NRC decision makers with a logical and statistically sound basis for correcting the deficiencies.

  38. Order statistics & inference: estimation methods

    CERN Document Server

    Balakrishnan, N

    1991-01-01

    The literature on order statistics and inference is quite extensive and covers a large number of fields, but most of it is dispersed throughout numerous publications. This volume is the consolidation of the most important results and places an emphasis on estimation. Both theoretical and computational procedures are presented to meet the needs of researchers, professionals, and students. The methods of estimation discussed are well illustrated with numerous practical examples from both the physical and life sciences, including sociology, psychology, and electrical and chemical engineering. …

  39. Propensity Score Analysis: An Alternative Statistical Approach for HRD Researchers

    Science.gov (United States)

    Keiffer, Greggory L.; Lane, Forrest C.

    2016-01-01

    Purpose: This paper aims to introduce matching in propensity score analysis (PSA) as an alternative statistical approach for researchers looking to make causal inferences using intact groups. Design/methodology/approach: An illustrative example demonstrated the varying results of analysis of variance, analysis of covariance and PSA on a heuristic…

  40. Bayes linear statistics: theory & methods

    CERN Document Server

    Goldstein, Michael

    2007-01-01

    Bayesian methods combine information available from data with any prior information available from expert knowledge. The Bayes linear approach follows this path, offering a quantitative structure for expressing beliefs, and systematic methods for adjusting these beliefs, given observational data. The methodology differs from the full Bayesian methodology in that it establishes simpler approaches to belief specification and analysis based around expectation judgements. Bayes Linear Statistics presents an authoritative account of this approach, explaining the foundations, theory, methodology, and practicalities of this important field. The text provides a thorough coverage of Bayes linear analysis, from the development of the basic language to the collection of algebraic results needed for efficient implementation, with detailed practical examples. The book covers: the importance of partial prior specifications for complex problems where it is difficult to supply a meaningful full prior probability specification …

  41. THE GROWTH POINTS OF STATISTICAL METHODS

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2014-11-01

    Full Text Available On the basis of a new paradigm of applied mathematical statistics, data analysis and economic-mathematical methods, we discuss five topical areas in which modern applied statistics is developing, i.e. five "growth points": nonparametric statistics, robustness, computer-statistical methods, statistics of interval data, and statistics of non-numeric data.

  42. The faulty statistics of complementary alternative medicine (CAM).

    Science.gov (United States)

    Pandolfi, Maurizio; Carreras, Giulia

    2014-09-01

    The authors illustrate the difficulties involved in obtaining a valid statistical significance in clinical studies, especially when the prior probability of the hypothesis under scrutiny is low. Since the prior probability of a research hypothesis is directly related to its scientific plausibility, the commonly used frequentist statistics, which does not take this probability into account, is particularly unsuitable for studies exploring matters in various degrees disconnected from science, such as complementary alternative medicine (CAM) interventions. Any statistical significance obtained in this field should be considered with great caution and may be better attributed to more plausible hypotheses (like a placebo effect) than the one examined, which usually is the specific efficacy of the intervention. Since achieving meaningful statistical significance is an essential step in the validation of medical interventions, CAM practices, producing only outcomes inherently resistant to statistical validation, appear not to belong to modern evidence-based medicine.
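
    The argument can be made concrete with Bayes' rule: the probability that a hypothesis is true given a significant result depends on its prior probability, the test's power, and the significance level. A small worked sketch with illustrative numbers (alpha = 0.05, power = 0.80; not taken from the paper) follows.

      # Post-study probability that a significant finding is true, as a function
      # of the prior plausibility of the hypothesis. Numbers are illustrative.
      alpha, power = 0.05, 0.80

      def prob_true_given_significant(prior: float) -> float:
          # Bayes' rule: P(H | sig) = power*prior / (power*prior + alpha*(1 - prior))
          return power * prior / (power * prior + alpha * (1 - prior))

      for prior in (0.50, 0.10, 0.01):   # plausible ... highly implausible
          print(f"prior={prior:.2f} -> P(true | p < 0.05) = "
                f"{prob_true_given_significant(prior):.2f}")
      # prior=0.50 -> 0.94, prior=0.10 -> 0.64, prior=0.01 -> 0.14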

  43. Statistical methods in radiation physics

    CERN Document Server

    Turner, James E; Bogard, James S

    2012-01-01

    This statistics textbook, with particular emphasis on radiation protection and dosimetry, deals with statistical solutions to problems inherent in health physics measurements and decision making. The authors begin with a description of our current understanding of the statistical nature of physical processes at the atomic level, including radioactive decay and interactions of radiation with matter. Examples are taken from problems encountered in health physics, and the material is presented such that health physicists and most other nuclear professionals will more readily understand the application of statistical principles in the familiar context of the examples. Problems are presented at the end of each chapter, with solutions to selected problems provided online. In addition, numerous worked examples are included throughout the text.

  44. Statistical inference via fiducial methods

    NARCIS (Netherlands)

    Salomé, Diemer

    1998-01-01

    In this thesis the attention is restricted to inductive reasoning using a mathematical probability model. A statistical procedure prescribes, for every theoretically possible set of data, the inference about the unknown of interest. ... See: Summary

  45. Statistical methods in translational medicine.

    Science.gov (United States)

    Chow, Shein-Chung; Tse, Siu-Keung; Lin, Min

    2008-12-01

    This study focuses on strategies and statistical considerations for assessment of translation in language (e.g. translation of case report forms in multinational clinical trials), information (e.g. translation of basic discoveries to the clinic) and technology (e.g. translation of Chinese diagnostic techniques to well-established clinical study endpoints) in pharmaceutical/clinical research and development. However, most of our efforts will be directed to statistical considerations for translation in information. Translational medicine has been defined as bench-to-bedside research, where a basic laboratory discovery becomes applicable to the diagnosis, treatment or prevention of a specific disease, and is brought forth by either a physician-scientist who works at the interface between the research laboratory and patient care, or by a team of basic and clinical science investigators. Statistics plays an important role in translational medicine to ensure that the translational process is accurate and reliable with certain statistical assurance. Statistical inference for the applicability of an animal model to a human model is also discussed. Strategies for selection of clinical study endpoints (e.g. absolute changes, relative changes, or responder-defined, based on either absolute or relative change) are reviewed.

  46. Statistical Methods in Translational Medicine

    Directory of Open Access Journals (Sweden)

    Shein-Chung Chow

    2008-12-01

    Full Text Available This study focuses on strategies and statistical considerations for assessment of translation in language (e.g. translation of case report forms in multinational clinical trials), information (e.g. translation of basic discoveries to the clinic) and technology (e.g. translation of Chinese diagnostic techniques to well-established clinical study endpoints) in pharmaceutical/clinical research and development. However, most of our efforts will be directed to statistical considerations for translation in information. Translational medicine has been defined as bench-to-bedside research, where a basic laboratory discovery becomes applicable to the diagnosis, treatment or prevention of a specific disease, and is brought forth by either a physician-scientist who works at the interface between the research laboratory and patient care, or by a team of basic and clinical science investigators. Statistics plays an important role in translational medicine to ensure that the translational process is accurate and reliable with certain statistical assurance. Statistical inference for the applicability of an animal model to a human model is also discussed. Strategies for selection of clinical study endpoints (e.g. absolute changes, relative changes, or responder-defined, based on either absolute or relative change) are reviewed.

  47. Permutation statistical methods: an integrated approach

    CERN Document Server

    Berry, Kenneth J; Johnston, Janis E

    2016-01-01

    This research monograph provides a synthesis of a number of statistical tests and measures, which, at first consideration, appear disjoint and unrelated. Numerous comparisons of permutation and classical statistical methods are presented, and the two methods are compared via probability values and, where appropriate, measures of effect size. Permutation statistical methods, compared to classical statistical methods, do not rely on theoretical distributions, avoid the usual assumptions of normality and homogeneity of variance, and depend only on the data at hand. This text takes a unique approach to explaining statistics by integrating a large variety of statistical methods, and establishing the rigor of a topic that to many may seem to be a nascent field in statistics. This topic is new in that it took modern computing power to make permutation methods available to people working in the mainstream of research. This research monograph addresses a statistically-informed audience, and can also easily serve as a ...
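
    The core idea is compact enough to show directly: compare an observed statistic with its distribution under random relabellings of the data. A minimal two-sample permutation test on simulated data follows (a generic illustration, not an example from the monograph).

      # Two-sample permutation test: reference distribution from relabelling,
      # not from a theoretical distribution.
      import numpy as np

      rng = np.random.default_rng(42)
      a = rng.normal(0.0, 1.0, 30)
      b = rng.normal(0.5, 1.0, 30)

      observed = a.mean() - b.mean()
      pooled = np.concatenate([a, b])
      n_perm, count = 10_000, 0
      for _ in range(n_perm):
          perm = rng.permutation(pooled)
          diff = perm[:30].mean() - perm[30:].mean()
          if abs(diff) >= abs(observed):
              count += 1
      print(f"two-sided permutation p-value: {count / n_perm:.4f}")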

  48. Climate Prediction through Statistical Methods

    CERN Document Server

    Akgun, Bora; Tuter, Levent; Kurnaz, Mehmet Levent

    2008-01-01

    Climate change is a reality of today. Paleoclimatic proxies and climate predictions based on coupled atmosphere-ocean general circulation models provide us with temperature data. Using Detrended Fluctuation Analysis, we are investigating the statistical connection between the climate types of the present and these local temperatures. We are relating this issue to some well-known historic climate shifts. Our main result is that temperature fluctuations, with or without a temperature scale attached to them, can be used to classify climates in the absence of other indicators such as pan evaporation and precipitation.
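
    A compact version of the DFA computation named in the abstract is sketched below: integrate the mean-centered series, detrend it in windows of increasing size, and read the scaling exponent alpha off the log-log fluctuation plot. First-order detrending and the white-noise test case (alpha near 0.5) are illustrative choices, not the paper's setup.

      # Detrended fluctuation analysis (DFA1) with linear detrending per window.
      import numpy as np

      def dfa_exponent(x, scales):
          y = np.cumsum(x - np.mean(x))                # integrated profile
          fluct = []
          for s in scales:
              rms = []
              for i in range(len(y) // s):
                  seg = y[i * s:(i + 1) * s]
                  t = np.arange(s)
                  trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear trend
                  rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
              fluct.append(np.mean(rms))
          # Scaling exponent alpha: slope of log F(s) against log s.
          return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

      rng = np.random.default_rng(0)
      white = rng.normal(size=4096)
      print(f"alpha (white noise) ~ {dfa_exponent(white, [16, 32, 64, 128, 256]):.2f}")  # ~0.5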

  49. The research of railway freight statistics system and statistical methods

    Directory of Open Access Journals (Sweden)

    Wu Hua-Wen

    2013-01-01

    Full Text Available Ext is a JavaScript framework for developing web interfaces. This paper describes the Ext framework and its application in a railway freight statistics and analysis system, together with the statistical methods used. The paper also analyzes the design, functions and implementation of the system in detail. As information technology and the requirements of railway transportation organization and operation continue to improve, the railway freight statistics and analysis system has improved markedly in its index system, decision analysis and other aspects, better meeting the work requirements. It will play a more important role in railway transport organization, management, and passenger and freight marketing.

  50. Statistical Methods for Unusual Count Data

    DEFF Research Database (Denmark)

    Guthrie, Katherine A.; Gammill, Hilary S.; Kamper-Jørgensen, Mads

    2016-01-01

    … Microchimerism data present challenges for statistical analysis, including a skewed distribution, excess zero values, and occasional large values. Methods for comparing microchimerism levels across groups while controlling for covariates are not well established. We compared statistical models for quantitative …

  51. Statistical methods in physical mapping

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, David O. [Univ. of California, Berkeley, CA (United States)

    1995-05-01

    One of the great success stories of modern molecular genetics has been the ability of biologists to isolate and characterize the genes responsible for serious inherited diseases like fragile X syndrome, cystic fibrosis and myotonic muscular dystrophy. This dissertation concentrates on constructing high-resolution physical maps. It demonstrates how probabilistic modeling and statistical analysis can aid molecular geneticists in the tasks of planning, execution, and evaluation of physical maps of chromosomes and large chromosomal regions. The dissertation is divided into six chapters. Chapter 1 provides an introduction to the field of physical mapping, describing the role of physical mapping in gene isolation and past efforts at mapping chromosomal regions. The next two chapters review and extend known results on predicting progress in large mapping projects. Such predictions help project planners decide between various approaches and tactics for mapping large regions of the human genome. Chapter 2 shows how probability models have been used in the past to predict progress in mapping projects. Chapter 3 presents new results, based on stationary point process theory, for progress measures for mapping projects based on directed mapping strategies. Chapter 4 describes in detail the construction of an initial high-resolution physical map for human chromosome 19. This chapter introduces the probability and statistical models involved in map construction in the context of a large, ongoing physical mapping project. Chapter 5 concentrates on one such model, the trinomial model. This chapter contains new results on the large-sample behavior of this model, including distributional results, asymptotic moments, and detection error rates. In addition, it contains an optimality result concerning experimental procedures based on the trinomial model. The last chapter explores unsolved problems and describes future work.
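
    For a flavor of the progress predictions discussed in Chapters 2 and 3, the sketch below evaluates the classic Lander-Waterman expectation (an assumed stand-in here, not the dissertation's point-process results): with N random clones of length L from a genome of size G, the expected covered fraction is 1 - exp(-N*L/G). The clone numbers are illustrative.

      # Expected map coverage under the Lander-Waterman model (illustrative).
      import math

      G = 60e6       # chromosome 19 size in bp (approximate)
      L = 40e3       # clone length in bp (illustrative)
      for N in (1500, 3000, 6000):
          coverage = 1 - math.exp(-N * L / G)
          print(f"{N} clones: redundancy {N * L / G:.1f}x, expected coverage {coverage:.1%}")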

  52. Multivariate statistical methods: a primer

    CERN Document Server

    Manly, Bryan FJ

    2004-01-01

    THE MATERIAL OF MULTIVARIATE ANALYSIS: Examples of Multivariate Data; Preview of Multivariate Methods; The Multivariate Normal Distribution; Computer Programs; Graphical Methods; Chapter Summary; References. MATRIX ALGEBRA: The Need for Matrix Algebra; Matrices and Vectors; Operations on Matrices; Matrix Inversion; Quadratic Forms; Eigenvalues and Eigenvectors; Vectors of Means and Covariance Matrices; Further Reading; Chapter Summary; References. DISPLAYING MULTIVARIATE DATA: The Problem of Displaying Many Variables in Two Dimensions; Plotting Index Variables; The Draftsman's Plot; The Representation of Individual Data Points; Profiles …

  53. Statistical test of Duane-Hunt's law and its comparison with an alternative law

    CERN Document Server

    Perkovac, Milan

    2010-01-01

    Using the Pearson correlation coefficient, a statistical analysis of Duane-Hunt and Kulenkampff's measurement results was performed. This analysis reveals that the empirically based Duane-Hunt law is not entirely consistent with the measurement data. The author has theoretically found the action of electromagnetic oscillators, which corresponds to Planck's constant, and has also found an alternative law based on classical theory. Using the same statistical method, this alternative law is likewise tested, and it is proved that the alternative law is completely in accordance with the measurements. The alternative law gives a relativistic expression for the energy of an electromagnetic wave emitted or absorbed by atoms and proves that the empirically derived Planck-Einstein expression is only valid for relatively low frequencies. A wave equation, which is similar to the Schrödinger equation, and the wavelength of the standing electromagnetic wave are also established by the author's analysis. For a relatively low energy...
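
    For context, the law under test relates the accelerating voltage V to the short-wavelength limit of the X-ray continuum, lambda_min = hc/(eV); the quick check below evaluates it at 30 kV using CODATA constants. The record's alternative law is not reproduced here.

      # Duane-Hunt short-wavelength limit at a given tube voltage.
      h, c, e = 6.62607015e-34, 2.99792458e8, 1.602176634e-19  # SI units
      V = 30e3                                                 # volts
      lam_min = h * c / (e * V)
      print(f"lambda_min at 30 kV = {lam_min * 1e12:.1f} pm")  # ~41.3 pm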

  54. Equilibrium Statistics: Monte Carlo Methods

    Science.gov (United States)

    Kröger, Martin

    Monte Carlo methods use random numbers, or ‘random’ sequences, to sample from a known shape of a distribution, or to extract a distribution by other means, and, in the context of this book, to (i) generate representative equilibrated samples prior to being subjected to external fields, or (ii) evaluate high-dimensional integrals. Recipes for both topics, and some more general methods, are summarized in this chapter. It is important to realize that Monte Carlo should be as artificial as possible to be efficient and elegant. Advanced Monte Carlo ‘moves’, required to optimize the speed of algorithms for a particular problem at hand, are outside the scope of this brief introduction. One particular modern example is the wavelet-accelerated MC sampling of polymer chains [406].
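
    Topic (ii) is the easiest to show in a few lines: estimate a high-dimensional integral as the sample mean of the integrand at uniform random points, with a standard error from the sample variance. This is plain Monte Carlo integration on an invented integrand, not one of the chapter's recipes.

      # Monte Carlo estimate of the integral of exp(-|x|^2) over [0, 1]^10.
      import numpy as np

      rng = np.random.default_rng(7)
      dim, n = 10, 100_000
      x = rng.random((n, dim))                 # uniform samples in the unit hypercube
      f = np.exp(-np.sum(x ** 2, axis=1))
      estimate, stderr = f.mean(), f.std(ddof=1) / np.sqrt(n)
      print(f"integral ~ {estimate:.5f} +/- {stderr:.5f}")
      # Exact value: (integral_0^1 exp(-t^2) dt)^10 ~ 0.0540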

  55. Statistical methods for nuclear material management

    Energy Technology Data Exchange (ETDEWEB)

    Bowen, W.M.; Bennett, C.A. (eds.)

    1988-12-01

    This book is intended as a reference manual of statistical methodology for nuclear material management practitioners. It describes statistical methods currently or potentially important in nuclear material management, explains the choice of methods for specific applications, and provides examples of practical applications to nuclear material management problems. Together with the accompanying training manual, which contains fully worked out problems keyed to each chapter, this book can also be used as a textbook for courses in statistical methods for nuclear material management. It should provide increased understanding and guidance to help improve the application of statistical methods to nuclear material management problems.

  56. Statistical Methods for Material Characterization and Qualification

    Energy Technology Data Exchange (ETDEWEB)

    Kercher, A.K.

    2005-04-01

    This document describes a suite of statistical methods that can be used to infer lot parameters from the data obtained from inspection/testing of random samples taken from that lot. Some of these methods will be needed to perform the statistical acceptance tests required by the Advanced Gas Reactor Fuel Development and Qualification (AGR) Program. Special focus has been placed on proper interpretation of acceptance criteria and unambiguous methods of reporting the statistical results. In addition, modified statistical methods are described that can provide valuable measures of quality for different lots of material. This document has been written for use as a reference and a guide for performing these statistical calculations. Examples of each method are provided. Uncertainty analysis (e.g., measurement uncertainty due to instrumental bias) is not included in this document, but should be considered when reporting statistical results.
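
    As one concrete example of inferring a lot parameter from a random sample, the sketch below computes a one-sided Clopper-Pearson upper confidence bound on a lot's defect fraction from attribute (pass/fail) inspection data. It illustrates the kind of calculation such a reference covers; it is not the AGR program's own acceptance test.

      # Upper confidence bound on a lot defect fraction from attribute sampling.
      from scipy.stats import beta

      def upper_bound_defect_fraction(n_sampled: int, n_defective: int,
                                      conf: float = 0.95) -> float:
          """One-sided 100*conf% Clopper-Pearson upper bound on the defect fraction."""
          if n_defective == n_sampled:
              return 1.0
          return beta.ppf(conf, n_defective + 1, n_sampled - n_defective)

      # 0 defects in a random sample of 59 items -> ~5% upper bound at 95% confidence.
      print(f"{upper_bound_defect_fraction(59, 0):.4f}")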

  57. Statistical methods for material characterization and qualification

    Energy Technology Data Exchange (ETDEWEB)

    Hunn, John D. (ORNL); Kercher, Andrew K. (ORNL)

    2005-01-01

    This document describes a suite of statistical methods that can be used to infer lot parameters from the data obtained from inspection/testing of random samples taken from that lot. Some of these methods will be needed to perform the statistical acceptance tests required by the Advanced Gas Reactor Fuel Development and Qualification (AGR) Program. Special focus has been placed on proper interpretation of acceptance criteria and unambiguous methods of reporting the statistical results. In addition, modified statistical methods are described that can provide valuable measures of quality for different lots of material. This document has been written for use as a reference and a guide for performing these statistical calculations. Examples of each method are provided. Uncertainty analysis (e.g., measurement uncertainty due to instrumental bias) is not included in this document, but should be considered when reporting statistical results.

  58. Statistical Models and Methods for Lifetime Data

    CERN Document Server

    Lawless, Jerald F

    2011-01-01

    Praise for the First Edition: "An indispensable addition to any serious collection on lifetime data analysis and ... a valuable contribution to the statistical literature. Highly recommended ..." (Choice) "This is an important book, which will appeal to statisticians working on survival analysis problems." (Biometrics) "A thorough, unified treatment of statistical models and methods used in the analysis of lifetime data ... this is a highly competent and agreeable statistical textbook." (Statistics in Medicine) The statistical analysis of lifetime or response time data is a key tool in engineering, …

  59. Methods for Characterization of Alternative RNA Splicing.

    Science.gov (United States)

    Harvey, Samuel E; Cheng, Chonghui

    2016-01-01

    Quantification of alternative splicing to detect the abundance of differentially spliced isoforms of a gene in total RNA can be accomplished via RT-PCR using both quantitative real-time and semi-quantitative PCR methods. These methods require careful PCR primer design to ensure specific detection of particular splice isoforms. We also describe analysis of alternative splicing using a splicing "minigene" in mammalian cell tissue culture to facilitate investigation of the regulation of alternative splicing of a particular exon of interest.

  60. Multivariate statistical methods: a first course

    CERN Document Server

    Marcoulides, George A

    2014-01-01

    Multivariate statistics refer to an assortment of statistical methods that have been developed to handle situations in which multiple variables or measures are involved. Any analysis of more than two variables or measures can loosely be considered a multivariate statistical analysis. An introductory text for students learning multivariate statistical methods for the first time, this book keeps mathematical details to a minimum while conveying the basic principles. One of the principal strategies used throughout the book, in addition to the presentation of actual data analyses, is …

  4. SOME STATISTICAL SOFTWARE APPLICATIONS FOR TAGUCHI METHODS

    Directory of Open Access Journals (Sweden)

    Adrian Stere PARIS

    2016-05-01

    The paper details the variety of Taguchi methods as an important contribution to quality improvement. The extended use of these methods requires increasingly complex calculations for practical application and optimization, so it is worth taking advantage of new software developments supported by advanced statistical methods. The paper presents several applications of statistical software to the Taguchi methods for quality enhancement, with emphasis on quality loss functions, the design of experiments, and new developments in statistical process control.
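
    To make the quality-loss idea concrete, here is a minimal sketch of Taguchi's quadratic loss function L(y) = k(y - m)^2 for a nominal-the-best characteristic; the cost constant k, the target m, and the measurements are invented for illustration (the paper itself works with commercial statistical software rather than code like this).

    ```python
    # Taguchi's quadratic quality loss, L(y) = k * (y - m)^2.
    # The cost constant k and the target m are hypothetical values.

    def taguchi_loss(y, target, k):
        """Loss for one measured value y of a nominal-the-best characteristic."""
        return k * (y - target) ** 2

    measurements = [9.8, 10.1, 10.4, 9.9]                    # made-up data
    losses = [taguchi_loss(y, target=10.0, k=2.5) for y in measurements]
    print(sum(losses) / len(losses))                         # average loss per unit
    ```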

  5. Advanced statistical methods in data science

    CERN Document Server

    Chen, Jiahua; Lu, Xuewen; Yi, Grace; Yu, Hao

    2016-01-01

    This book gathers invited presentations from the 2nd Symposium of the ICSA-CANADA Chapter held at the University of Calgary from August 4-6, 2015. The aim of this Symposium was to promote advanced statistical methods in big-data sciences, to allow researchers to exchange ideas on statistics and data science, and to embrace the challenges and opportunities of statistics and data science in the modern world. It addresses diverse themes in advanced statistical analysis in big-data sciences, including methods for administrative data analysis, survival data analysis, missing data analysis, high-dimensional and genetic data analysis, longitudinal and functional data analysis, the design and analysis of studies with response-dependent and multi-phase designs, time series and robust statistics, and statistical inference based on likelihood, empirical likelihood and estimating functions. The editorial group selected 14 high-quality presentations from this successful symposium and invited the presenters to prepare a fu...

  6. Statistical methods for environmental pollution monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, R.O.

    1987-01-01

    The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation techniques, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.

  7. Statistical Methods for Environmental Pollution Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, Richard O. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    1987-01-01

    The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation techniques, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.
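
    As a taste of the book's material, the sketch below computes an approximate confidence interval for the mean of a lognormal distribution (the problem treated in Section 13.2) using the standard Cox method; the formula is the usual textbook one, but the data are simulated and not taken from the book.

    ```python
    # Cox's approximate CI for the lognormal mean E[X], where log X ~ N(mu, s2):
    # estimate theta = mu + s2/2 with variance s2/n + s2^2 / (2(n-1)).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    x = rng.lognormal(mean=1.0, sigma=0.8, size=50)   # simulated data

    y = np.log(x)
    n = y.size
    mu, s2 = y.mean(), y.var(ddof=1)
    theta = mu + s2 / 2                               # log of the lognormal mean
    se = np.sqrt(s2 / n + s2 ** 2 / (2 * (n - 1)))    # Cox's variance approximation
    z = stats.norm.ppf(0.975)
    lo, hi = np.exp(theta - z * se), np.exp(theta + z * se)
    print(f"95% CI for E[X]: ({lo:.2f}, {hi:.2f})")
    ```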

  8. ABOUT THE METHODOLOGY OF STATISTICAL METHODS

    OpenAIRE

    Orlov A. I.

    2014-01-01

    The purpose of the article is to justify the need to develop the methodology of statistical methods as an independent scientific direction. The models of the mathematician and of the applied specialist are presented. Conclusions are drawn for teaching and research, and five major unsolved problems of statistical methods are discussed: the effect of deviations from the traditional prerequisites; the use of asymptotic results for finite sample sizes; the selection of one of the many specific tests for the hypothesi...

  9. Modern statistical methods in respiratory medicine.

    Science.gov (United States)

    Wolfe, Rory; Abramson, Michael J

    2014-01-01

    Statistics sits right at the heart of scientific endeavour in respiratory medicine and many other disciplines. In this introductory article, some key epidemiological concepts such as representativeness, random sampling, association and causation, and confounding are reviewed. A brief introduction to basic statistics covering topics such as frequentist methods, confidence intervals, hypothesis testing, P values and Type II error is provided. Subsequent articles in this series will cover some modern statistical methods including regression models, analysis of repeated measures, causal diagrams, propensity scores, multiple imputation, accounting for measurement error, survival analysis, risk prediction, latent class analysis and meta-analysis.

  10. Alternative methods in toxicology: pre-validated and validated methods

    OpenAIRE

    2011-01-01

    The development of alternative methods to animal experimentation has progressed rapidly over the last 20 years. Today, in vitro and in silico methods have an important role in hazard identification and in assessing the toxicology profile of compounds. Advanced alternative methods and their combinations are also used for the safety assessment of final products. Several alternative methods, which were scientifically validated and accepted by competent regulatory bodies, can be used for regulatory ...

  11. Alternative methods in toxicology: pre-validated and validated methods.

    Science.gov (United States)

    Kandárová, Helena; Letašiová, Silvia

    2011-09-01

    The development of alternative methods to animal experimentation has progressed rapidly over the last 20 years. Today, in vitro and in silico methods have an important role in hazard identification and in assessing the toxicology profile of compounds. Advanced alternative methods and their combinations are also used for the safety assessment of final products. Several alternative methods, which were scientifically validated and accepted by competent regulatory bodies, can be used for regulatory toxicology purposes, thus reducing or fully replacing living animals in toxicology experimentation. The acceptance of alternative methods as valuable tools of modern toxicology has been recognized by regulators, including the OECD, FDA and EPA. This paper provides a brief overview of the topic "alternative methods in toxicology" and focuses on pre-validated and validated alternative methods and their position in modern toxicology.

  12. Methods for Characterization of Alternative RNA Splicing

    Science.gov (United States)

    Harvey, Samuel E.; Cheng, Chonghui

    2016-01-01

    Quantification of alternative splicing to detect the abundance of differentially spliced isoforms of a gene in total RNA can be accomplished via RT-PCR using both quantitative real-time and semi-quantitative PCR methods. These methods require careful PCR primer design to ensure specific detection of particular splice isoforms. We also describe analysis of alternative splicing using a splicing “minigene” in mammalian cell tissue culture to facilitate investigation of the regulation of alternative splicing of a particular exon of interest. PMID:26721495

  13. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis, containing hands-on problem sets that can be worked out in MS Excel or ArcGIS, as well as detailed illustrations and numerous case studies. The book enables readers to: identify types and characterize non-spatial and spatial data; demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results; construct testable hypotheses that require inferential statistical analysis; process spatial data, extract explanatory variables, conduct statisti...

  14. Workshop on Analytical Methods in Statistics

    CERN Document Server

    Jurečková, Jana; Maciak, Matúš; Pešta, Michal

    2017-01-01

    This volume collects authoritative contributions on analytical methods and mathematical statistics. The methods presented include resampling techniques; the minimization of divergence; estimation theory and regression, possibly under shape or other constraints or long memory; and iterative approximations when the optimal solution is difficult to achieve. It also investigates probability distributions with respect to their stability, heavy-tailedness, Fisher information and other aspects, both asymptotically and non-asymptotically. The book not only presents the latest mathematical and statistical methods and their extensions, but also offers solutions to real-world problems including option pricing. The selected, peer-reviewed contributions were originally presented at the workshop on Analytical Methods in Statistics, AMISTAT 2015, held in Prague, Czech Republic, November 10-13, 2015.

  15. [Pathogenesis of temporomandibular dysfunction. II. Statistical method].

    Science.gov (United States)

    Vágó, P

    1989-08-01

    The variables in epidemiologic assessments of the aetiology of temporomandibular joint dysfunction have generally been examined statistically in their pairwise relationships, possibly supplemented by multivariable linear regression. In this study, a newer statistical method, LISREL (Linear Structural Relationships), was employed to establish a linear, empirically tested model of the aetiology of temporomandibular joint dysfunction. An advantage of this approach is that the variables of the structural equations may be not only observed variables but also latent variables, which cannot be observed directly but are assumed to underlie the observed variables. The statistical method is described in closer detail in the article in connection with the construction of the aetiological model.

  16. Statistical methods for spatio-temporal systems

    CERN Document Server

    Finkenstadt, Barbel

    2006-01-01

    Statistical Methods for Spatio-Temporal Systems presents current statistical research issues on spatio-temporal data modeling and will promote advances in research and a greater understanding between the mechanistic and the statistical modeling communities.Contributed by leading researchers in the field, each self-contained chapter starts with an introduction of the topic and progresses to recent research results. Presenting specific examples of epidemic data of bovine tuberculosis, gastroenteric disease, and the U.K. foot-and-mouth outbreak, the first chapter uses stochastic models, such as point process models, to provide the probabilistic backbone that facilitates statistical inference from data. The next chapter discusses the critical issue of modeling random growth objects in diverse biological systems, such as bacteria colonies, tumors, and plant populations. The subsequent chapter examines data transformation tools using examples from ecology and air quality data, followed by a chapter on space-time co...

  17. Statistical Methods for Stochastic Differential Equations

    CERN Document Server

    Kessler, Mathieu; Sorensen, Michael

    2012-01-01

    The seventh volume in the SemStat series, Statistical Methods for Stochastic Differential Equations presents current research trends and recent developments in statistical methods for stochastic differential equations. Written to be accessible to both new students and seasoned researchers, each self-contained chapter starts with introductions to the topic at hand and builds gradually towards discussing recent research. The book covers Wiener-driven equations as well as stochastic differential equations with jumps, including continuous-time ARMA processes and COGARCH processes. It presents a sp

  18. Alternating event processes during lifetimes: population dynamics and statistical inference.

    Science.gov (United States)

    Shinohara, Russell T; Sun, Yifei; Wang, Mei-Cheng

    2017-08-07

    In the literature studying recurrent event data, a large amount of work has focused on univariate recurrent event processes where the occurrence of each event is treated as a single point in time. There are many applications, however, in which univariate recurrent events are insufficient to characterize the process because patients experience nontrivial durations associated with each event. This results in an alternating event process where the disease status of a patient alternates between exacerbations and remissions. In this paper, we consider the dynamics of a chronic disease and its associated exacerbation-remission process over two time scales: calendar time and time since onset. In particular, over calendar time, we explore population dynamics and the relationship between incidence, prevalence and duration for such alternating event processes. We provide nonparametric estimation techniques for characteristic quantities of the process. In some settings, exacerbation processes are observed from an onset time until death; to account for the relationship between the survival and alternating event processes, nonparametric approaches are developed for estimating the exacerbation process over the lifetime. By understanding the population dynamics and the within-process structure, the paper provides a new and general way to study alternating event processes.

  19. New Graphical Methods and Test Statistics for Testing Composite Normality

    Directory of Open Access Journals (Sweden)

    Marc S. Paolella

    2015-07-01

    Several graphical methods for testing univariate composite normality from an i.i.d. sample are presented. They are endowed with correct simultaneous error bounds and yield size-correct tests. As all are based on the empirical CDF, they are also consistent for all alternatives. For one test, called the modified stabilized probability test, or MSP, a highly simplified computational method is derived, which delivers the test statistic and also a highly accurate p-value approximation, essentially instantaneously. The MSP test is demonstrated to have higher power against asymmetric alternatives than the well-known and powerful Jarque-Bera test. A further size-correct test, based on combining two test statistics, is shown to have yet higher power. The methodology employed is fully general and can be applied to any i.i.d. univariate continuous distribution setting.
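
    The sketch below is not the MSP test of the paper, but it illustrates the same ingredients: a test of composite normality built on the empirical CDF whose null distribution is calibrated by Monte Carlo (the classic Lilliefors construction). Sample sizes and replication counts are arbitrary choices.

    ```python
    import numpy as np
    from scipy import stats

    def composite_normality_pvalue(x, n_sim=2000, seed=0):
        """Monte Carlo p-value for a KS test with estimated mean and sd."""
        rng = np.random.default_rng(seed)
        n = x.size
        d_obs = stats.kstest(x, "norm", args=(x.mean(), x.std(ddof=1))).statistic
        d_null = np.empty(n_sim)
        for b in range(n_sim):
            z = rng.standard_normal(n)        # a draw under the null hypothesis
            d_null[b] = stats.kstest(z, "norm", args=(z.mean(), z.std(ddof=1))).statistic
        return (1 + np.sum(d_null >= d_obs)) / (1 + n_sim)

    x = np.random.default_rng(42).standard_t(df=3, size=100)  # heavy-tailed alternative
    print(composite_normality_pvalue(x))                      # small p-value expected
    ```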

  20. Applying statistical methods to text steganography

    CERN Document Server

    Nechta, Ivan

    2011-01-01

    This paper presents a survey of text steganography methods used for hiding secret information inside some covertext. Widely known hiding techniques (such as translation-based steganography, text generation and syntactic embedding) and their detection are considered. It is shown that statistical analysis has an important role in text steganalysis.

  1. Statistical search methods for lotsizing problems

    NARCIS (Netherlands)

    M. Salomon (Marc); R. Kuik (Roelof); L.N. van Wassenhove (Luk)

    1993-01-01

    This paper reports on our experiments with statistical search methods for solving lotsizing problems in production planning. In lotsizing problems the main objective is to generate a minimum cost production and inventory schedule, such that (i) customer demand is satisfied, and (ii) capa

  2. Distributed Reconstruction via Alternating Direction Method

    Directory of Open Access Journals (Sweden)

    Linyuan Wang

    2013-01-01

    With the development of compressive sensing theory, image reconstruction from few-view projections has received considerable research attention in the field of computed tomography (CT). Total-variation (TV)-based CT image reconstruction has been shown to be experimentally capable of producing accurate reconstructions from sparse-view data. In this study, a distributed reconstruction algorithm based on TV minimization has been developed. The algorithm is very simple, as it uses the alternating direction method. The proposed method can accelerate the alternating direction total variation minimization (ADTVM) algorithm without losing accuracy.
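
    As a one-dimensional illustration of the alternating direction idea behind ADTVM (not the authors' distributed CT code), the sketch below denoises a piecewise-constant signal by solving min_x 0.5*||x - b||^2 + lam*||Dx||_1 with ADMM; the signal, lam, and rho are made up.

    ```python
    import numpy as np

    def soft(v, t):
        """Soft thresholding, the proximal operator of the l1 norm."""
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def tv_denoise_admm(b, lam=0.5, rho=1.0, n_iter=200):
        """Solve min_x 0.5*||x - b||^2 + lam*||Dx||_1 by ADMM (D = first difference)."""
        n = b.size
        D = np.diff(np.eye(n), axis=0)             # (n-1) x n difference operator
        A = np.eye(n) + rho * D.T @ D              # x-update system matrix
        z = np.zeros(n - 1)
        u = np.zeros(n - 1)
        for _ in range(n_iter):
            x = np.linalg.solve(A, b + rho * D.T @ (z - u))   # x-update
            z = soft(D @ x + u, lam / rho)                    # z-update
            u += D @ x - z                                    # dual update
        return x

    t = np.linspace(0.0, 1.0, 100)
    clean = (t > 0.5).astype(float)                # piecewise-constant signal
    noisy = clean + 0.1 * np.random.default_rng(0).standard_normal(100)
    print(np.round(tv_denoise_admm(noisy), 2))     # edges preserved, noise suppressed
    ```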

  3. Statistical properties of alternative national forest inventory area estimators

    Science.gov (United States)

    Francis Roesch; John Coulston; Andrew D. Hill

    2012-01-01

    The statistical properties of potential estimators of forest area for the USDA Forest Service's Forest Inventory and Analysis (FIA) program are presented and discussed. The current FIA area estimator is compared and contrasted with a weighted mean estimator and an estimator based on the Polya posterior, in the presence of nonresponse. Estimator optimality is...

  4. Alternative methods of ophthalmic treatment in Russia.

    Science.gov (United States)

    Vader, L

    1994-04-01

    Russian ophthalmic nurses and physicians are using alternative methods of treatment to supplement traditional eye care. As acupuncture and iridology become more popular in the United States, ophthalmic nurses need to be more knowledgeable about these treatments and the implications for patients.

  5. Statistical methods for assessing agreement between continuous measurements

    DEFF Research Database (Denmark)

    Sokolowski, Ineta; Hansen, Rikke Pilegaard; Vedsted, Peter

    ... concordance coefficient, Bland-Altman limits of agreement and percentage of agreement were used to assess the agreement between patient-reported delay and doctor-reported delay in the diagnosis of cancer in general practice. Key messages: the correct statistical approach is not obvious. Many studies give the product-moment correlation coefficient (r) between the results of the two measurement methods as an indicator of agreement, which is wrong. Several alternative methods have been proposed, which we describe together with the preconditions for their use.
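
    For reference, Bland-Altman limits of agreement are simple to compute: the mean difference between the two measurements plus or minus 1.96 standard deviations of the differences. The paired delays below are invented for illustration, not the study's data.

    ```python
    import numpy as np

    patient = np.array([30, 12, 45, 60, 21, 14, 90, 33], dtype=float)  # days, invented
    doctor = np.array([28, 15, 40, 66, 19, 10, 80, 35], dtype=float)

    diff = patient - doctor
    bias = diff.mean()                                  # mean difference (bias)
    sd = diff.std(ddof=1)
    lower, upper = bias - 1.96 * sd, bias + 1.96 * sd   # 95% limits of agreement
    print(f"bias = {bias:.1f} days, limits of agreement = ({lower:.1f}, {upper:.1f})")
    ```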

  6. Identifying Reflectors in Seismic Images via Statistic and Syntactic Methods

    Directory of Open Access Journals (Sweden)

    Carlos A. Perez

    2010-04-01

    In geologic interpretation of seismic reflection data, accurate identification of reflectors is the foremost step to ensure proper subsurface structural definition. Reflector information, along with other data sets, is a key factor in predicting the presence of hydrocarbons. In this work, mathematical and pattern-recognition theory was adapted to design two statistical and two syntactic algorithms which constitute a tool for semiautomatic reflector identification. The interpretive power of these four schemes was evaluated in terms of prediction accuracy and computational speed. Among these, the semblance method was confirmed to render the greatest accuracy and speed. Syntactic methods offer an interesting alternative due to their inherently structural search method.
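
    A minimal sketch of the semblance coefficient mentioned above: the ratio of stacked energy to total energy across M traces in a time window, which approaches 1 for a coherent reflector and 1/M for incoherent noise. The traces below are synthetic.

    ```python
    import numpy as np

    def semblance(traces):
        """Semblance of an (M, T) array: M traces over a T-sample window."""
        stacked_energy = np.sum(np.sum(traces, axis=0) ** 2)
        total_energy = traces.shape[0] * np.sum(traces ** 2)
        return stacked_energy / total_energy   # 1/M (noise) up to 1 (coherent)

    rng = np.random.default_rng(0)
    wavelet = np.sin(np.linspace(0.0, 2.0 * np.pi, 32))
    reflector = np.tile(wavelet, (8, 1)) + 0.1 * rng.standard_normal((8, 32))
    print(semblance(reflector))                     # near 1 for a coherent event
    print(semblance(rng.standard_normal((8, 32))))  # near 1/8 for pure noise
    ```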

  7. Statistical Methods for Estimating the Cumulative Risk of Screening Mammography Outcomes

    NARCIS (Netherlands)

    Hubbard, R.A.; Ripping, T.M.; Chubak, J.; Broeders, M.J.; Miglioretti, D.L.

    2016-01-01

    BACKGROUND: This study illustrates alternative statistical methods for estimating cumulative risk of screening mammography outcomes in longitudinal studies. METHODS: Data from the US Breast Cancer Surveillance Consortium (BCSC) and the Nijmegen Breast Cancer Screening Program in the Netherlands were

  8. The statistical process control methods - SPC

    Directory of Open Access Journals (Sweden)

    Floreková Ľubica

    1998-03-01

    Full Text Available Methods of statistical evaluation of quality – SPC (item 20 of the documentation system of quality control of ISO norm, series 900 of various processes, products and services belong amongst basic qualitative methods that enable us to analyse and compare data pertaining to various quantitative parameters. Also they enable, based on the latter, to propose suitable interventions with the aim of improving these processes, products and services. Theoretical basis and applicatibily of the principles of the: - diagnostics of a cause and effects, - Paret analysis and Lorentz curve, - number distribution and frequency curves of random variable distribution, - Shewhart regulation charts, are presented in the contribution.

  9. Statistical evaluation of alternative light sources for bloodstain photography.

    Science.gov (United States)

    Lee, Wee Chuen; Khoo, Bee Ee; Bin Abdullah, Ahmad Fahmi Lim; Abdul Aziz, Zalina Binti

    2013-05-01

    Bloodstain photography is important in forensic applications, especially for bloodstain pattern analysis. This study compares the enhancement effect of bloodstain photography using three different types of light source: fluorescent white light, near-ultraviolet (UV) light-emitting diode (LED) light, and 410 nm LED light. Randomized complete block designs were implemented to identify the lighting that would statistically produce the best enhancement results for bloodstains on different types of surfaces. Bloodstain samples were prepared on white cotton, brown carpet, tar road, and wood. These samples were photographed in darkroom conditions using a Canon EOS 50D digital SLR camera with a Canon EF-S 60 mm f/2.8 Macro USM lens. Two-way analysis of variance and Fisher's least significant difference test were used to analyze the contrast of the images. The statistical analysis showed that 410 nm light is the best among the tested lights for enhancing bloodstains on the tested surfaces, as the contrast of the bloodstains against the background was highest.
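
    A minimal sketch of this kind of analysis: a two-way ANOVA of image contrast by light source and surface (the blocking factor), run with statsmodels. The data frame, factor levels, and contrast values below are fabricated, not the study's measurements.

    ```python
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    df = pd.DataFrame({
        "light": ["white", "uv_led", "410nm"] * 4,
        "surface": ["cotton"] * 3 + ["carpet"] * 3 + ["tar"] * 3 + ["wood"] * 3,
        "contrast": [0.42, 0.55, 0.71, 0.30, 0.41, 0.66,
                     0.25, 0.38, 0.58, 0.35, 0.47, 0.69],   # fabricated values
    })
    model = ols("contrast ~ C(light) + C(surface)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))   # F-tests for the light factor and blocks
    ```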

  10. Classification image analysis: estimation and statistical inference for two-alternative forced-choice experiments

    Science.gov (United States)

    Abbey, Craig K.; Eckstein, Miguel P.

    2002-01-01

    We consider estimation and statistical hypothesis testing on classification images obtained from the two-alternative forced-choice experimental paradigm. We begin with a probabilistic model of task performance for simple forced-choice detection and discrimination tasks. Particular attention is paid to general linear filter models because these models lead to a direct interpretation of the classification image as an estimate of the filter weights. We then describe an estimation procedure for obtaining classification images from observer data. A number of statistical tests are presented for testing various hypotheses from classification images based on some more compact set of features derived from them. As an example of how the methods we describe can be used, we present a case study investigating detection of a Gaussian bump profile.

  11. Statistical methods for assessment of blend homogeneity

    DEFF Research Database (Denmark)

    Madsen, Camilla

    2002-01-01

    In this thesis the use of various statistical methods to address some of the problems related to assessment of the homogeneity of powder blends in tablet production is discussed. It is not straightforward to assess the homogeneity of a powder blend; the reason is partly that in bulk materials ... It is shown how to set up parametric acceptance criteria for the batch that give high confidence that future samples will, with a probability larger than a specified value, pass the USP three-class criteria. Properties and robustness of proposed changes to the USP test for content uniformity are investigated...

  12. Statistical process control methods for expert system performance monitoring.

    Science.gov (United States)

    Kahn, M G; Bailey, T C; Steib, S A; Fraser, V J; Dunagan, W C

    1996-01-01

    The literature on the performance evaluation of medical expert systems is extensive, yet most of the techniques used in the early stages of system development are inappropriate for deployed expert systems. Because extensive clinical and informatics expertise and resources are required to perform evaluations, efficient yet effective methods of monitoring performance during the long-term maintenance phase of the expert system life cycle must be devised. Statistical process control techniques provide a well-established methodology that can be used to define policies and procedures for continuous, concurrent performance evaluation. Although the field of statistical process control has been developed for monitoring industrial processes, its tools, techniques, and theory are easily transferred to the evaluation of expert systems. Statistical process control tools provide convenient visual methods and heuristic guidelines for detecting meaningful changes in expert system performance. The underlying statistical theory provides estimates of the detection capabilities of alternative evaluation strategies. This paper describes a set of statistical process control tools that can be used to monitor the performance of a number of deployed medical expert systems. It describes how p-charts are used in practice to monitor the GermWatcher expert system. The case volume and error rate of GermWatcher are then used to demonstrate how different inspection strategies would perform.
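
    A minimal sketch of the p-chart idea described above: each review period's error proportion is compared against a three-sigma upper control limit derived from the overall error rate. The counts below are invented, not GermWatcher data.

    ```python
    import numpy as np

    errors = np.array([4, 2, 5, 3, 14, 2, 4])             # flagged errors per period
    cases = np.array([120, 95, 130, 110, 125, 100, 115])  # cases reviewed per period

    p_bar = errors.sum() / cases.sum()                    # overall error rate
    p = errors / cases
    sigma = np.sqrt(p_bar * (1 - p_bar) / cases)          # per-period standard error
    ucl = p_bar + 3 * sigma                               # 3-sigma upper control limit
    for i, (pi, ui) in enumerate(zip(p, ucl)):
        flag = "  <-- investigate" if pi > ui else ""
        print(f"period {i}: p = {pi:.3f}, UCL = {ui:.3f}{flag}")
    ```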

  13. Statistical methods in credit risk management

    Directory of Open Access Journals (Sweden)

    Ljiljanka Kvesić

    2012-12-01

    Successful banks base their operations on the principles of liquidity, profitability and safety. Therefore, the correct assessment of the ability of a loan applicant to meet certain obligations is of crucial importance for the functioning of a bank. In the past few decades several credit scoring models have been developed to provide support to credit analysts in the assessment of a loan applicant. This paper presents three statistical methods that are used for this purpose in the area of credit risk management: logistic regression, discriminant analysis and survival analysis. Their implementation in the banking sector was motivated to a great extent by the development and application of information and communication technologies. This paper aims to point out the most important theoretical aspects of these methods, and also to underline the need for the development and application of credit scoring models in Croatian banking practice.
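
    A minimal sketch of the first of the three methods: a logistic-regression credit-scoring model fitted with scikit-learn. The applicant features and default labels are synthetic, and the feature interpretations are assumptions.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))            # stand-ins for income, debt ratio, age
    logit = 0.8 * X[:, 0] - 1.2 * X[:, 1] + 0.3 * X[:, 2]
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))   # 1 = default, simulated

    model = LogisticRegression().fit(X, y)
    print(model.coef_, model.intercept_)      # scorecard-style coefficients
    print(model.predict_proba(X[:3])[:, 1])   # estimated default probabilities
    ```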

  14. Statistical Methods in Phylogenetic and Evolutionary Inferences

    Directory of Open Access Journals (Sweden)

    Luigi Bertolotti

    2013-05-01

    Molecular instruments are the most accurate methods for organism identification and characterization. Biologists are often involved in studies whose main goal is to identify relationships among individuals. In this framework, it is very important to know and apply the most robust approaches to infer these relationships correctly, allowing the right conclusions about phylogeny. In this review, we introduce the reader to the statistical methods most used in phylogenetic analyses, the maximum likelihood and Bayesian approaches, considering for simplicity only analyses regarding DNA sequences. Several studies are shown as examples in order to demonstrate how correct phylogenetic inference can lead scientists to highlight very peculiar features in pathogen biology and evolution.

  15. Purification of Carbon Nanotubes: Alternative Methods

    Science.gov (United States)

    Files, Bradley; Scott, Carl; Gorelik, Olga; Nikolaev, Pasha; Hulse, Lou; Arepalli, Sivaram

    2000-01-01

    The traditional carbon nanotube purification process involves nitric acid refluxing and cross-flow filtration using the surfactant Triton X. This is believed to result in damage to nanotubes and surfactant residue on the nanotube surface. Alternative purification procedures involving solvent extraction, thermal zone refining and nitric acid refluxing are used in the current study. The effects of the duration and the type of solvent used to dissolve impurities, including fullerenes and PACs (polyaromatic compounds), are monitored by nuclear magnetic resonance, high performance liquid chromatography, and thermogravimetric analysis. Thermal zone refining yielded sample areas rich in nanotubes, as seen by scanning electron microscopy. Refluxing in boiling nitric acid seems to improve the nanotube content. Different procedural steps are needed to purify samples produced by the laser process compared to the arc process. These alternative methods of nanotube purification will be presented along with results from supporting analytical techniques.

  16. Application of pedagogy reflective in statistical methods course and practicum statistical methods

    Science.gov (United States)

    Julie, Hongki

    2017-08-01

    The subjects Elementary Statistics, Statistical Methods and Statistical Methods Practicum aim to equip Mathematics Education students with descriptive and inferential statistics. An understanding of descriptive and inferential statistics is important for students of the Mathematics Education Department, especially those whose final projects involve quantitative research. In quantitative research, students are required to present and describe quantitative data in an appropriate manner, to draw conclusions from their quantitative data, and to establish relationships between the independent and dependent variables defined in their research. In fact, when students carried out final projects involving quantitative research, it was not rare to find them making mistakes in the steps of drawing conclusions and errors in choosing the hypothesis-testing procedure; as a result, they reached incorrect conclusions. This is a fatal mistake for those doing quantitative research. The implementation of reflective pedagogy in the teaching-learning process of the Statistical Methods and Statistical Methods Practicum courses yielded the following: 1. Twenty-two students passed the course and one student did not. 2. The highest grade achieved was A, earned by 18 students. 3. According to all students, they could develop their critical stance and build care for one another through the learning process in this course. 4. All students agreed that, through the learning process they underwent in the course, they could build care for one another.

  17. On two methods of statistical image analysis

    NARCIS (Netherlands)

    Missimer, J; Knorr, U; Maguire, RP; Herzog, H; Seitz, RJ; Tellman, L; Leenders, KL

    1999-01-01

    The computerized brain atlas (CBA) and statistical parametric mapping (SPM) are two procedures for voxel-based statistical evaluation of PET activation studies. Each includes spatial standardization of image volumes, computation of a statistic, and evaluation of its significance. In addition, smooth

  18. The Monte Carlo method the method of statistical trials

    CERN Document Server

    Shreider, YuA

    1966-01-01

    The Monte Carlo Method: The Method of Statistical Trials is a systematic account of the fundamental concepts and techniques of the Monte Carlo method, together with its range of applications. Some of these applications include the computation of definite integrals, neutron physics, and in the investigation of servicing processes. This volume is comprised of seven chapters and begins with an overview of the basic features of the Monte Carlo method and typical examples of its application to simple problems in computational mathematics. The next chapter examines the computation of multi-dimensio
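
    The book's opening application, computing a definite integral by statistical trials, takes only a few lines; the sketch below estimates the integral of exp(-x^2) over [0, 1], whose exact value is (sqrt(pi)/2)*erf(1), so the result can be checked.

    ```python
    import numpy as np
    from math import erf, pi, sqrt

    rng = np.random.default_rng(0)
    n = 100_000
    x = rng.uniform(0.0, 1.0, size=n)     # statistical trials on [0, 1]
    estimate = np.exp(-x ** 2).mean()     # interval length is 1, so the mean suffices
    exact = sqrt(pi) / 2 * erf(1.0)
    print(estimate, exact)                # agreement to about three decimals
    ```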

  19. Statistical methods for astronomical data analysis

    CERN Document Server

    Chattopadhyay, Asis Kumar

    2014-01-01

    This book introduces “Astrostatistics” as a subject in its own right with rewarding examples, including work by the authors with galaxy and Gamma Ray Burst data to engage the reader. This includes a comprehensive blending of Astrophysics and Statistics. The first chapter’s coverage of preliminary concepts and terminologies for astronomical phenomenon will appeal to both Statistics and Astrophysics readers as helpful context. Statistics concepts covered in the book provide a methodological framework. A unique feature is the inclusion of different possible sources of astronomical data, as well as software packages for converting the raw data into appropriate forms for data analysis. Readers can then use the appropriate statistical packages for their particular data analysis needs. The ideas of statistical inference discussed in the book help readers determine how to apply statistical tests. The authors cover different applications of statistical techniques already developed or specifically introduced for ...

  20. 77 FR 43827 - International Workshop on Alternative Methods for Leptospira

    Science.gov (United States)

    2012-07-26

    ... Alternatives to Animal Testing (EURL ECVAM), the Japanese Center for the Validation of Alternative Methods, the... HUMAN SERVICES International Workshop on Alternative Methods for Leptospira Vaccine Potency Testing... Workshop on Alternative Methods for Leptospira Vaccine Potency Testing: State of the Science and the Way...

  1. Statistical Method of Estimating Nigerian Hydrocarbon Reserves

    Directory of Open Access Journals (Sweden)

    Jeffrey O. Oseh

    2015-01-01

    Hydrocarbon reserves are basic to planning and investment decisions in the petroleum industry, so their proper estimation is of considerable importance in oil and gas production. The estimation of hydrocarbon reserves in the Niger Delta region of Nigeria has been very popular, and very successful, in the Nigerian oil and gas industry for the past 50 years. In order to fully estimate the hydrocarbon potential of the Nigerian Niger Delta region, a clear understanding of the reservoir geology and production history is needed. Reserves estimation for most fields is often performed through material balance and volumetric methods; alternatively, a simple estimation model and least-squares regression may be useful or appropriate. The model used here is based on extrapolation of the additional reserves due to the exploratory drilling trend and of the additional-reserve factor due to revision of the existing fields. This estimation model, used alongside linear regression analysis in this study, gives improved estimates for the fields considered and hence can be used in other Nigerian fields with recent production history.
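
    A minimal sketch of the least-squares ingredient of such an approach: fitting a linear trend to a reserves series and extrapolating it forward. The series below is invented and is not Niger Delta data.

    ```python
    import numpy as np

    year = np.arange(2005, 2015)
    reserves = np.array([30.1, 31.0, 31.8, 32.9, 33.5,
                         34.6, 35.2, 36.4, 37.0, 37.9])   # invented, billion bbl

    slope, intercept = np.polyfit(year, reserves, deg=1)  # ordinary least squares
    forecast = slope * 2020 + intercept
    print(f"trend: {slope:.2f} bn bbl/yr, extrapolated 2020 reserves: {forecast:.1f} bn bbl")
    ```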

  2. Mathematical and statistical methods for actuarial sciences and finance

    CERN Document Server

    Sibillo, Marilena

    2014-01-01

    The interaction between mathematicians and statisticians working in the actuarial and financial fields is producing numerous meaningful scientific results. This volume, comprising a series of four-page papers, gathers new ideas relating to mathematical and statistical methods in the actuarial sciences and finance. The book covers a variety of topics of interest from both theoretical and applied perspectives, including: actuarial models; alternative testing approaches; behavioral finance; clustering techniques; coherent and non-coherent risk measures; credit-scoring approaches; data envelopment analysis; dynamic stochastic programming; financial contagion models; financial ratios; intelligent financial trading systems; mixture normality approaches; Monte Carlo-based methodologies; multicriteria methods; nonlinear parameter estimation techniques; nonlinear threshold models; particle swarm optimization; performance measures; portfolio optimization; pricing methods for structured and non-structured derivatives; r...

  3. Confirmatory Factor Analysis of the Structure of Statistics Anxiety Measure: An Examination of Four Alternative Models

    Directory of Open Access Journals (Sweden)

    Hossein Bevrani, PhD

    2011-09-01

    Objective: The aim of this study is to explore the confirmatory factor analysis results of the Persian adaptation of the Statistics Anxiety Measure (SAM) proposed by Earp. Method: The validity and reliability assessments of the scale were performed on 298 college students chosen randomly from Tabriz University in Iran. Confirmatory factor analysis (CFA) was carried out to determine the factor structure of the Persian adaptation of the SAM. Results: As expected, the second-order model provided a better fit to the data than the three alternative models. Conclusions: Hence, the SAM provides an equally valid measure for use among college students. The study both expands and adds support to the existing body of math anxiety literature.

  4. An Alternative Method to Project Wind Patterns

    Science.gov (United States)

    Fadillioglu, Cagla; Kiyisuren, I. Cagatay; Collu, Kamil; Turp, M. Tufan; Kurnaz, M. Levent; Ozturk, Tugba

    2016-04-01

    Wind energy is one of the major clean and sustainable energy sources. Besides its various advantages, wind energy has the downside that its performance cannot be projected very accurately in the long term. In this study, we offer an alternative method that can be used to determine the best location to install a wind turbine in a large area, aiming at maximum energy performance in the long run. For this purpose, a regional climate model (i.e., RegCM4.4) is combined with software called Winds on Critical Streamline Surfaces (WOCSS) in order to identify wind patterns for any domain, even in a changing climate. As a special case, the Çanakkale region is examined because its terrain profile has both coastal and mountainous features. The WOCSS program was run twice for each month of the sample years in a double-nested fashion, using provisional RegCM4.4 wind data for the years 2020 to 2040. The modified version of WOCSS provides terrain-following flow surfaces and, by processing those data, produces a wind profile output for heights specified by the user. The computational time of WOCSS is also within a reasonable range. Considering the lack of alternative methods for long-term wind performance projection, the model used in this study is a very good way of obtaining quick indications of wind performance while taking the impact of terrain effects into account. This research has been supported by Boğaziçi University Research Fund Grant Number 10421.

  5. Nonequilibrium relaxation method – An alternative simulation strategy

    Indian Academy of Sciences (India)

    Nobuyasu Ito

    2005-06-01

    One well-established simulation strategy to study the thermal phases and transitions of a given microscopic model system is the so-called equilibrium method, in which one first realizes the equilibrium ensemble of a finite system and then extrapolates the results to the infinite system. This equilibrium method follows the standard theory of thermal statistical mechanics and the idea of the thermodynamic limit. Recently, an alternative simulation strategy has been developed which analyzes the nonequilibrium relaxation (NER) process; it is called the NER method. The NER method has some advantages over the equilibrium method. It provides a simpler analysis procedure, which implies less of the systematic error that is inevitable in simulations, and allows efficient resource usage. The NER method easily treats not only the thermodynamic limit but also other limits, for example non-Gibbsian nonequilibrium steady states, so it is also relevant for new fields of statistical physics. Applications of the NER method have been expanding to various problems: from basic first- and second-order transitions to advanced and exotic phases like chiral, KT, spin-glass and quantum phases. These studies have provided not only better estimates of transition points and exponents, but also qualitative developments. For example, the universality class of a random system, the nature of two-dimensional melting and the scaling behavior of spin-glass aging phenomena have been clarified.

  6. Alternative method of removing otoliths from sturgeon

    Science.gov (United States)

    Chalupnicki, Marc A.; Dittman, Dawn E.

    2016-01-01

    Extracting the otoliths (ear bones) from fish that have very thick skulls can be difficult and very time consuming. The common practice of making a transverse vertical incision on the top of the skull with a hand or electrical saw may damage the otoliths if not performed correctly. The sturgeons (Acipenseridae) are one family in particular that have a very large and thick skull. A new laboratory method of entering the brain cavity from the ventral side of the fish to expose the otoliths proved easier than other otolith extraction methods found in the literature. The methods reviewed in the literature are designed for the field and are more efficient at processing large quantities of fish quickly. However, this new technique is better suited to a laboratory setting, where time is not pressing and successful extraction from each specimen is critical. The success rate of finding and removing otoliths using this technique is very high, and the technique does not compromise the structure in any manner. This alternative technique is applicable to extracting the otoliths of other, similar fish species.

  7. Development of a Research Methods and Statistics Concept Inventory

    Science.gov (United States)

    Veilleux, Jennifer C.; Chapman, Kate M.

    2017-01-01

    Research methods and statistics are core courses in the undergraduate psychology major. To assess learning outcomes, it would be useful to have a measure that assesses research methods and statistical literacy beyond course grades. In two studies, we developed and provided initial validation results for a research methods and statistical knowledge…

  8. Alternative method for assessing coking coal plasticity

    Energy Technology Data Exchange (ETDEWEB)

    Dzuy Nguyen; Susan Woodhouse; Merrick Mahoney [University of Adelaide (Australia). BHP Billiton Newcastle Technology Centre

    2008-07-15

    Traditional plasticity measurements for coal have a number of limitations associated with the reproducibility of the tests and their use in predicting coking behaviour. This report reviews alternative rheological methods for characterising the plastic behaviour of coking coals. It reviews the application of more fundamental rheological measurements to the coal system, as well as applications of rheology to other physical systems that may act as potential models for the application of fundamental rheological measurements to cokemaking. The systems considered were polymer melts, coal ash melts, lava, bread making and ice cream. These systems were chosen because they exhibit processes physically equivalent to those occurring during cokemaking, e.g., the generation of bubbles within a softened system that then resolidifies. A number of recommendations were made: that the steady and oscillatory shear and squeeze-flow techniques be further investigated to determine whether the measured rheological characteristics are related to transformations within the coke oven and to the characteristics of the resultant coke; and that modification of Gieseler plastometers for more fundamental rheology measurements not be attempted.

  9. Robust inference from multiple test statistics via permutations: a better alternative to the single test statistic approach for randomized trials.

    Science.gov (United States)

    Ganju, Jitendra; Yu, Xinxin; Ma, Guoguang Julie

    2013-01-01

    Formal inference in randomized clinical trials is based on controlling the type I error rate associated with a single pre-specified statistic. The deficiency of using just one method of analysis is that it depends on assumptions that may not be met. For robust inference, we propose pre-specifying multiple test statistics and relying on the minimum p-value for testing the null hypothesis of no treatment effect. The null hypothesis associated with the various test statistics is that the treatment groups are indistinguishable. The critical value for hypothesis testing comes from permutation distributions. Rejecting the null hypothesis when the smallest p-value is less than the critical value controls the type I error rate at its designated value. Even if one of the candidate test statistics has low power, the adverse effect on the power of the minimum p-value statistic is modest. Its use is illustrated with examples. We conclude that it is better to rely on the minimum p-value than on a single statistic, particularly when that single statistic is the logrank test, because of the cost and complexity of many survival trials.
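
    A minimal sketch of the proposal under simplified conditions: two pre-specified statistics (a t-test and the Wilcoxon rank-sum test), with the minimum p-value calibrated by permutation. Group sizes, effect size, and the candidate statistics are illustrative choices, not those of the paper's examples.

    ```python
    import numpy as np
    from scipy import stats

    def min_pvalue(a, b):
        """Minimum of the p-values of two pre-specified test statistics."""
        p_t = stats.ttest_ind(a, b).pvalue
        p_w = stats.mannwhitneyu(a, b, alternative="two-sided").pvalue
        return min(p_t, p_w)

    rng = np.random.default_rng(0)
    treat = rng.normal(0.5, 1.0, size=40)    # simulated treatment arm
    ctrl = rng.normal(0.0, 1.0, size=40)     # simulated control arm

    obs = min_pvalue(treat, ctrl)
    pooled = np.concatenate([treat, ctrl])
    n_perm, count = 2000, 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)       # relabel groups under the null
        if min_pvalue(perm[:40], perm[40:]) <= obs:
            count += 1
    print((1 + count) / (1 + n_perm))        # permutation p-value of the min-p statistic
    ```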

  10. METHODS TO RESTRUCTURE THE STATISTICAL COMMUNITIES

    Directory of Open Access Journals (Sweden)

    Emilia TITAN

    2005-12-01

    To understand the essence of phenomena it is necessary to perform statistical data-processing operations. This allows a shift from individual data to derived, synthetic indicators that highlight the essence of various phenomena. The high volume and diversity of processing operations presuppose developing plans for computerised data processing. To identify distinct and homogeneous groups and classes it is necessary to produce well-considered groupings and classifications that comply with the requirements presented in the article.

  11. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Huber -Carol, Catherine; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  12. From alternative methods to a new toxicology.

    Science.gov (United States)

    Hartung, Thomas

    2011-04-01

    Mechanistic toxicology has evolved by relying, to a large extent, on methodologies that substitute or complement traditional animal tests. The biotechnology and informatics revolutions of the last decades have made such technologies broadly available and useful, but regulatory toxicology has been slow to embrace these new approaches. Major validation efforts, however, have delivered the evidence that new approaches do not lower safety standards and can be integrated into regulatory safety assessments. Particularly in the EU, political pressures, such as the REACH legislation and the 7th Amendment to the cosmetics legislation, have prompted the need for new approaches. In the US, the NRC vision report calling for a toxicology for the 21st century (and its most recent adaptation by the EPA for their toxicity testing strategy) has initiated a debate about how to create a novel approach based on human cell cultures, lower species, high-throughput testing, and modeling. Lessons learned from the development, validation, and acceptance of alternative methods support the creation of a new approach based on identified toxicity pathways. Conceptual steering and an objective assessment of current practices by evidence-based toxicology (EBT) are required. EBT is modeled on evidence-based medicine, which has demonstrated that rigorous systematic reviews of current practices and meta-analyses of studies provide powerful tools to supply health care professionals and patients with the current best scientific evidence. Similarly, a portal for high-quality reviews of toxicological approaches and tools for the quantitative meta-analysis of data promise to serve as a door opener for a new regulatory toxicology.

  13. Alternate methods to teach history of anesthesia.

    Science.gov (United States)

    Desai, Manisha S; Desai, Sukumar P

    2014-02-01

    Residency programs in anesthesiology in the United States struggle to balance the conflicting needs of formal didactic sessions, clinical teaching, and clinical service obligations. As a consequence of the explosion in knowledge about basic and applied sciences related to our specialty, residents and fellows are expected to make substantial efforts to supplement formal lectures with self-study. There is strong evidence to suggest that members of the younger generation use nontraditional methods to acquire information. Although training programs are not required to include topics related to history of anesthesia (HOA) in the didactic curriculum, and despite the fact that such knowledge does not directly impact clinical care, many programs include such lectures and discussions. We describe and discuss our experience with 3 alternate modalities of teaching HOA. First, we provide brief descriptions of HOA-related historical narratives and novels within the domain of popular literature, rather than those that might be considered textbooks. Second, we analyze content in movies and videodiscs dealing with HOA and determine their utility as educational resources. Third, we describe HOA tours to sites in close proximity to our institutions, as well as those in locations elsewhere in the United States and abroad. We suggest that informal HOA teaching can be implemented by every residency program without much effort and without taking away from the traditional curriculum. Participating in this unique and enriching experience may be a means of academic advancement. It is our hope and expectation that graduates from programs that incorporate such exposure to HOA become advocates of history and may choose to devote a part of their academic career toward exploration of HOA.

  14. Statistical methods of estimating mining costs

    Science.gov (United States)

    Long, K.R.

    2011-01-01

    Until it was defunded in 1995, the U.S. Bureau of Mines maintained a Cost Estimating System (CES) for prefeasibility-type economic evaluations of mineral deposits and estimating costs at producing and non-producing mines. This system had a significant role in mineral resource assessments to estimate costs of developing and operating known mineral deposits and predicted undiscovered deposits. For legal reasons, the U.S. Geological Survey cannot update and maintain CES. Instead, statistical tools are under development to estimate mining costs from basic properties of mineral deposits such as tonnage, grade, mineralogy, depth, strip ratio, distance from infrastructure, rock strength, and work index. The first step was to reestimate "Taylor's Rule" which relates operating rate to available ore tonnage. The second step was to estimate statistical models of capital and operating costs for open pit porphyry copper mines with flotation concentrators. For a sample of 27 proposed porphyry copper projects, capital costs can be estimated from three variables: mineral processing rate, strip ratio, and distance from nearest railroad before mine construction began. Of all the variables tested, operating costs were found to be significantly correlated only with strip ratio.
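
    A minimal sketch of re-estimating a Taylor's-Rule-type relation, i.e. a power law rate = a * tonnage^b, by ordinary least squares on the log-log scale; the deposit data below are fabricated for illustration.

    ```python
    import numpy as np

    tonnage = np.array([2e6, 8e6, 3e7, 1e8, 4e8, 1.5e9])      # ore tonnage, t (fabricated)
    rate = np.array([9e2, 2.4e3, 7e3, 1.8e4, 5.5e4, 1.5e5])   # operating rate, t/day (fabricated)

    b, log_a = np.polyfit(np.log(tonnage), np.log(rate), deg=1)
    print(f"rate ~ {np.exp(log_a):.3g} * tonnage^{b:.2f}")    # exponent near 0.75 here
    ```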

  15. Methods for generating hydroelectric power development alternatives

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Shoou-yuh; Liaw, Shu-liang; Sale, M.J.; Railsback, S.F.

    1989-01-01

    Hydropower development on large rivers can result in a number of environmental impacts, including potential reductions in dissolved oxygen (DO) concentrations. This study presents a methodology for generating different hydropower development alternatives for evaluation. This methodology employs a Streeter-Phelps model to simulate DO, and the Bounded Implicit Enumeration algorithm to solve an optimization model formulated to maximize hydroelectric energy production subject to acceptable DO limits. The upper Ohio River basin was used to illustrate the use and characteristics of the methodology. The results indicate that several alternatives which meet the specified DO constraints can be generated efficiently, meeting both power and environmental objectives. 17 refs., 2 figs., 1 tab.
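
    A minimal sketch of the Streeter-Phelps dissolved-oxygen sag model that sits inside the methodology; the rate constants, initial conditions, and saturation DO are illustrative values, not those of the Ohio River study.

    ```python
    import numpy as np

    def do_deficit(t, L0, D0, kd, ka):
        """Streeter-Phelps DO deficit (mg/L) at travel time t (days).

        L0: initial BOD; D0: initial deficit; kd: deoxygenation rate;
        ka: reaeration rate (all assumed values below).
        """
        sag = (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t))
        return sag + D0 * np.exp(-ka * t)

    t = np.linspace(0.0, 10.0, 6)                     # days of travel downstream
    deficit = do_deficit(t, L0=12.0, D0=1.0, kd=0.35, ka=0.55)
    do_sat = 9.0                                      # assumed saturation DO, mg/L
    print(np.round(do_sat - deficit, 2))              # predicted DO along the reach
    ```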

  16. Innovative statistical methods for public health data

    CERN Document Server

    Wilson, Jeffrey

    2015-01-01

    The book brings together experts working in public health and multi-disciplinary areas to present recent issues in statistical methodological development and their applications. This timely book will impact model development and data analyses of public health research across a wide spectrum of analysis. Data and software used in the studies are available for the reader to replicate the models and outcomes. The fifteen chapters range in focus from techniques for dealing with missing data with Bayesian estimation, health surveillance and population definition and implications in applied latent class analysis, to multiple comparison and meta-analysis in public health data. Researchers in biomedical and public health research will find this book to be a useful reference, and it can be used in graduate level classes.

  17. Methods of contemporary mathematical statistical physics

    CERN Document Server

    2009-01-01

    This volume presents a collection of courses introducing the reader to the recent progress with attention being paid to laying solid grounds and developing various basic tools. An introductory chapter on lattice spin models is useful as a background for other lectures of the collection. The topics include new results on phase transitions for gradient lattice models (with introduction to the techniques of the reflection positivity), stochastic geometry reformulation of classical and quantum Ising models, the localization/delocalization transition for directed polymers. A general rigorous framework for theory of metastability is presented and particular applications in the context of Glauber and Kawasaki dynamics of lattice models are discussed. A pedagogical account of several recently discussed topics in nonequilibrium statistical mechanics with an emphasis on general principles is followed by a discussion of kinetically constrained spin models that are reflecting important peculiar features of glassy dynamic...

  18. Mathematical and statistical methods for multistatic imaging

    CERN Document Server

    Ammari, Habib; Jing, Wenjia; Kang, Hyeonbae; Lim, Mikyoung; Sølna, Knut; Wang, Han

    2013-01-01

    This book covers recent mathematical, numerical, and statistical approaches for multistatic imaging of targets with waves at single or multiple frequencies. The waves can be acoustic, elastic or electromagnetic. They are generated by point sources on a transmitter array and measured on a receiver array. An important problem in multistatic imaging is to quantify and understand the trade-offs between data size, computational complexity, signal-to-noise ratio, and resolution. Another fundamental problem is to have a shape representation well suited to solving target imaging problems from multistatic data. In this book the trade-off between resolution and stability when the data are noisy is addressed. Efficient imaging algorithms are provided and their resolution and stability with respect to noise in the measurements analyzed. It also shows that high-order polarization tensors provide an accurate representation of the target. Moreover, a dictionary-matching technique based on new invariants for the generalized ...

  19. Statistical methods for categorical data analysis

    CERN Document Server

    Powers, Daniel

    2008-01-01

    This book provides a comprehensive introduction to methods and models for categorical data analysis and their applications in social science research. Companion website also available, at https://webspace.utexas.edu/dpowers/www/

  20. Simple statistical methods for software engineering data and patterns

    CERN Document Server

    Pandian, C Ravindranath

    2015-01-01

    Although there are countless books on statistics, few are dedicated to the application of statistical methods to software engineering. Simple Statistical Methods for Software Engineering: Data and Patterns fills that void. Instead of delving into overly complex statistics, the book details simpler solutions that are just as effective and connect with the intuition of problem solvers. Sharing valuable insights into software engineering problems and solutions, the book not only explains the required statistical methods, but also provides many examples, review questions, and case studies that provide …

  1. Statistical methods and computing for big data

    Science.gov (United States)

    Wang, Chun; Chen, Ming-Hui; Schifano, Elizabeth; Wu, Jing

    2016-01-01

    Big data are data on a massive scale in terms of volume, intensity, and complexity that exceed the capacity of standard analytic tools. They present opportunities as well as challenges to statisticians. The role of computational statisticians in scientific discovery from big data analyses has been under-recognized even by peer statisticians. This article summarizes recent methodological and software developments in statistics that address the big data challenges. Methodologies are grouped into three classes: subsampling-based, divide and conquer, and online updating for stream data. As a new contribution, the online updating approach is extended to variable selection with commonly used criteria, and their performances are assessed in a simulation study with stream data. Software packages are summarized with a focus on open-source R packages, covering recent tools that help break the barriers of computer memory and computing power. Some of the tools are illustrated in a case study with a logistic regression for the chance of airline delay. PMID:27695593
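
    One simple instance of the online-updating idea is to accumulate sufficient statistics chunk by chunk, so that estimates can be refreshed as stream data arrive without ever holding the full data in memory. The sketch below does this for ordinary least squares; it is a generic illustration, not the article's extension to variable selection.

    ```python
    import numpy as np

    class OnlineLeastSquares:
        """Streaming OLS: accumulate X'X and X'y over data chunks so the
        coefficient estimate can be updated as new data arrive."""
        def __init__(self, p):
            self.xtx = np.zeros((p, p))
            self.xty = np.zeros(p)

        def update(self, X, y):
            self.xtx += X.T @ X
            self.xty += X.T @ y

        def coef(self):
            return np.linalg.solve(self.xtx, self.xty)

    rng = np.random.default_rng(0)
    beta = np.array([1.0, -2.0, 0.5])
    model = OnlineLeastSquares(p=3)
    for _ in range(100):                    # 100 chunks of streaming data
        X = rng.normal(size=(1000, 3))
        model.update(X, X @ beta + rng.normal(size=1000))
    print(model.coef())                     # close to [1.0, -2.0, 0.5]
    ```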

  2. Statistical methods for analysing complex genetic traits

    NARCIS (Netherlands)

    El Galta, Rachid

    2006-01-01

    Complex traits are caused by multiple genetic and environmental factors, and are therefore difficult to study compared with simple Mendelian diseases. The modes of inheritance of Mendelian diseases are often known. Methods to dissect such diseases are well described in the literature. For complex genetic …

  3. Sparse Inverse Covariance Selection via Alternating Linearization Methods

    CERN Document Server

    Scheinberg, Katya; Goldfarb, Donald

    2010-01-01

    Gaussian graphical models are of great interest in statistical learning. Because the conditional independencies between different nodes correspond to zero entries in the inverse covariance matrix of the Gaussian distribution, one can learn the structure of the graph by estimating a sparse inverse covariance matrix from sample data, by solving a convex maximum likelihood problem with an $\ell_1$-regularization term. In this paper, we propose a first-order method based on an alternating linearization technique that exploits the problem's special structure; in particular, the subproblems solved in each iteration have closed-form solutions. Moreover, our algorithm obtains an $\epsilon$-optimal solution in $O(1/\epsilon)$ iterations. Numerical experiments on both synthetic and real data from gene association networks show that a practical version of this algorithm outperforms other competitive algorithms.
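
    The $\ell_1$-penalized maximum-likelihood problem described here is the one commonly solved under the name graphical lasso. The sketch below fits it with scikit-learn's coordinate-descent solver on synthetic chain-graph data; this illustrates the estimation problem itself, not the authors' alternating linearization algorithm.

    ```python
    import numpy as np
    from sklearn.covariance import GraphicalLasso

    # Sample from a Gaussian whose precision matrix is a sparse chain graph
    rng = np.random.default_rng(1)
    p = 5
    prec = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
    X = rng.multivariate_normal(np.zeros(p), np.linalg.inv(prec), size=2000)

    model = GraphicalLasso(alpha=0.05).fit(X)   # alpha = l1 penalty weight
    print(np.round(model.precision_, 2))        # off-chain entries shrink to ~0
    ```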

  4. Deferring Elimination of Design Alternatives in Object Oriented Methods

    NARCIS (Netherlands)

    Aksit, Mehmet; Marcelloni, Francesco

    2001-01-01

    While developing systems, software engineers generally have to deal with a large number of design alternatives. Current object-oriented methods aim to eliminate design alternatives whenever they are generated. Alternatives, however, should be eliminated only when sufficient information to take such …

  5. Analysis of Statistical Methods Currently used in Toxicology Journals

    OpenAIRE

    Na, Jihye; Yang, Hyeri; Bae, SeungJin; Lim, Kyung-Min

    2014-01-01

    Statistical methods are frequently used in toxicology, yet it is not clear whether the methods employed by the studies are used consistently and conducted based on sound statistical grounds. The purpose of this paper is to describe statistical methods used in top toxicology journals. More specifically, we sampled 30 papers published in 2014 from Toxicology and Applied Pharmacology, Archives of Toxicology, and Toxicological Science and described methodologies used to provide descriptive and in...

  6. Problems and Recommendations for Rural Statistics and Survey Methods

    Institute of Scientific and Technical Information of China (English)

    Chengjun ZHANG

    2014-01-01

    With the steady deepening of reform and opening-up, the national economic system has shifted from a planned economy to a market economy, while rural surveys and statistics remain in a difficult transition period. In this period, China needs to transform its original statistical mode to match the market economic system, and governments at all levels must report and submit a large and growing volume of statistical information. At the same time, townships, villages and counties face both old and new conflicts. These conflicts hamper the implementation of rural statistics and surveys and the development of rural statistical work, and they have prompted research into, and rethinking of, the reform of rural statistical and survey methods.

  7. Statistical Methods Used in Gifted Education Journals, 2006-2010

    Science.gov (United States)

    Warne, Russell T.; Lazo, Maria; Ramos, Tammy; Ritter, Nicola

    2012-01-01

    This article describes the statistical methods used in quantitative and mixed methods articles between 2006 and 2010 in five gifted education research journals. Results indicate that the most commonly used statistical methods are means (85.9% of articles), standard deviations (77.8%), Pearson's "r" (47.8%), χ² (32.2%), ANOVA (30.7%),…

  8. Statistical methods for assessment of blend homogeneity

    DEFF Research Database (Denmark)

    Madsen, Camilla

    2002-01-01

    … as powder blends there is no natural unit or amount that defines a sample from the blend, and partly that current technology does not provide a method for universally collecting small representative samples from large static powder beds. In the thesis a number of methods to assess (in)homogeneity are presented … of internal factors to the blend, e.g. the particle size distribution. The relation between particle size distribution and the variation in drug content in blend and tablet samples is discussed. A central problem is to develop acceptance criteria for blends and tablet batches to decide whether the blend … blend or batch. In the thesis it is shown how to link sampling results and acceptance criteria to the actual quality (homogeneity) of the blend or tablet batch. It is also discussed how the assurance related to a specific acceptance criterion can be obtained from the corresponding OC-curve. Further …
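
    To illustrate how assurance can be read off an OC-curve, the sketch below computes the operating characteristic of a simple attribute acceptance plan (accept the batch if at most c of n sampled units are nonconforming). The plan parameters are invented and are not the thesis's content-uniformity criteria.

    ```python
    from scipy.stats import binom

    # Operating characteristic (OC) curve of a hypothetical sampling plan:
    # sample n tablets, accept if at most c fall outside the content limits.
    n, c = 30, 1
    for p in (0.01, 0.05, 0.10, 0.20):      # true fraction nonconforming
        print(f"p = {p:.2f}  P(accept) = {binom.cdf(c, n, p):.3f}")
    ```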

  9. Statistical methods for handling incomplete data

    CERN Document Server

    Kim, Jae Kwang

    2013-01-01

    ""… this book nicely blends the theoretical material and its application through examples, and will be of interest to students and researchers as a textbook or a reference book. Extensive coverage of recent advances in handling missing data provides resources and guidelines for researchers and practitioners in implementing the methods in new settings. … I plan to use this as a textbook for my teaching and highly recommend it.""-Biometrics, September 2014

  10. Alternative parameter determination methods for a PMSG

    DEFF Research Database (Denmark)

    Kalogiannis, Theodoros; Malz, Elena; Llano, Enrique Muller

    2014-01-01

    One of the fundamental requirements for testing and analysing a Permanent Magnet Synchronous Generator (PMSG) is obtaining its electrical and mechanical parameters. This paper describes the test set-up and the procedure for obtaining them. Stator resistance and flux linkage measurements follow IEEE … standards. On the other hand, a new approach to alternative stator inductance and inertia measurement is analysed. More precisely, the former is obtained through laboratory work based on the locked-rotor test, and the latter through CAD software based on a 3D model. In order to assess and validate …

  11. Review of robust multivariate statistical methods in high dimension.

    Science.gov (United States)

    Filzmoser, Peter; Todorov, Valentin

    2011-10-31

    General ideas of robust statistics, and specifically robust statistical methods for calibration and dimension reduction are discussed. The emphasis is on analyzing high-dimensional data. The discussed methods are applied using the packages chemometrics and rrcov of the statistical software environment R. It is demonstrated how the functions can be applied to real high-dimensional data from chemometrics, and how the results can be interpreted.
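
    The article works in R with the chemometrics and rrcov packages; as a rough Python analogue of the robustness idea, the sketch below contrasts the classical mean with the minimum covariance determinant (MCD) location estimate on data containing gross outliers.

    ```python
    import numpy as np
    from sklearn.covariance import EmpiricalCovariance, MinCovDet

    rng = np.random.default_rng(7)
    X = rng.normal(size=(200, 3))
    X[:20] += 8                    # contaminate 10% of rows with gross outliers

    print("classical mean:", np.round(EmpiricalCovariance().fit(X).location_, 2))
    print("robust MCD:    ", np.round(MinCovDet(random_state=0).fit(X).location_, 2))
    ```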

  12. Scientific Method, Statistical Method and the Speed of Light

    OpenAIRE

    MacKay, R. J.; Oldford, R.W.

    2000-01-01

    What is “statistical method”? Is it the same as “scientific method”? This paper answers the first question by specifying the elements and procedures common to all statistical investigations and organizing these into a single structure. This structure is illustrated by careful examination of the first scientific study on the speed of light carried out by A. A. Michelson in 1879. Our answer to the second question is negative. To understand this a history on the speed of light ...

  13. Constructivist Pedagogy and Alternative Teaching Methods for Intercultural Education

    Directory of Open Access Journals (Sweden)

    Ramona Lupu

    2014-05-01

    Full Text Available Our research underlines the role of constructivist pedagogy in achieving the formative objectives of intercultural education, starting from the premise that intuitive learning and the use of active-participative teaching-learning methods better cover this discipline's cognitive, affective and psychomotor dimensions. The research design comprises two purposively selected homogeneous groups of 70 and 60 students. Quantitative and qualitative research methods were used: a structured questionnaire, semi-structured interviews, the focus-group method, and evaluative and statistical techniques applied over a two-year period. The discovery theory applied in teaching and learning increases awareness of the existence of ethnic groups and of possible discriminatory actions; the mediator role assumed by the professor stimulates the formation and consolidation of the attitudes envisaged by intercultural education, respecting with great fidelity the principle of conscious appropriation of knowledge; the use of alternative evaluative methods shows an increase in the students' efficaciousness in this discipline and, at the same time, in their enthusiasm; and the principles underlined by constructivist pedagogy apply with great success in intercultural education.

  14. An Overview of Short-term Statistical Forecasting Methods

    DEFF Research Database (Denmark)

    Elias, Russell J.; Montgomery, Douglas C.; Kulahci, Murat

    2006-01-01

    An overview of statistical forecasting methodology is given, focusing on techniques appropriate to short- and medium-term forecasts. Topics include basic definitions and terminology, smoothing methods, ARIMA models, regression methods, dynamic regression models, and transfer functions. Techniques...
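
    Of the techniques listed, simple exponential smoothing is the easiest to state: the level is an exponentially weighted average of past observations, and the one-step-ahead forecast equals the current level. A minimal sketch with invented demand data:

    ```python
    def simple_exponential_smoothing(series, alpha=0.3):
        """level_t = alpha * x_t + (1 - alpha) * level_{t-1};
        the one-step-ahead forecast is the final level."""
        level = series[0]
        for x in series[1:]:
            level = alpha * x + (1 - alpha) * level
        return level

    demand = [112, 118, 132, 129, 121, 135, 148, 148, 136, 119]
    print(f"next-period forecast ~ {simple_exponential_smoothing(demand):.1f}")
    ```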

  15. Online Statistics Labs in MSW Research Methods Courses: Reducing Reluctance toward Statistics

    Science.gov (United States)

    Elliott, William; Choi, Eunhee; Friedline, Terri

    2013-01-01

    This article presents results from an evaluation of an online statistics lab as part of a foundations research methods course for master's-level social work students. The article discusses factors that contribute to an environment in social work that fosters attitudes of reluctance toward learning and teaching statistics in research methods…

  16. The use of Statistical Methods in Mechanical Engineering

    Directory of Open Access Journals (Sweden)

    Iram Saleem

    2013-03-01

    Full Text Available Statistics is an important tool for handling the vast data of the present era, since statistical methods can extract many conclusions from the available information. The aim of this study is to examine the use of statistical methods in Mechanical Engineering (ME); we therefore selected research papers published in 2010 in well-reputed ME journals published by Taylor and Francis. More than 350 research papers were downloaded from journals such as Inverse Problems in Science and Engineering (IPSE), Machining Science and Technology (MST), Materials and Manufacturing Processes (MMP), Particulate Science and Technology (PST) and Research in Nondestructive Evaluation (RNE). We recorded the statistical techniques/methods used in each research paper. In this study, we present the frequency distribution of descriptive statistics and advanced statistical methods used in these five ME journals in 2010.

  17. Alternatives to radioimmunoassay: labels and methods

    Energy Technology Data Exchange (ETDEWEB)

    Schall, R.F. Jr.; Tenoso, H.J.

    1981-07-01

    The following labels used as substitutes for radioisotopes in immunoassay systems are reviewed: bacteriophages, chemiluminescence precursors, fluorochromes, fluorogens, fluorescence quenchers, enzymes, coenzymes, inhibitors, substrates, various particulates, metal atoms, and stable free radicals. New methods for performing immunoassays with these labels are described where appropriate. Methods that require no separation steps and offer special promise for easy automation are noted. 69 references cited.

  18. Alternative Testing Methods for Predicting Health Risk from Environmental Exposures

    Directory of Open Access Journals (Sweden)

    Annamaria Colacci

    2014-08-01

    Full Text Available Alternative methods to animal testing are considered promising tools to support the prediction of toxicological risks from environmental exposure. Among the alternative testing methods, the cell transformation assay (CTA) appears to be one of the most appropriate approaches to predict the carcinogenic properties of single chemicals, complex mixtures and environmental pollutants. The BALB/c 3T3 CTA shows a good degree of concordance with the in vivo rodent carcinogenesis tests. Whole-genome transcriptomic profiling is performed to identify genes that are transcriptionally regulated by different kinds of exposures. Its use in cell models representative of target organs may help in understanding the mode of action and in predicting the risk for human health. Aiming to associate environmental exposure with adverse health outcomes, we used an integrated approach including the 3T3 CTA and transcriptomics on target cells in order to evaluate the effects of airborne particulate matter (PM) on complex toxicological endpoints. Organic extracts obtained from PM2.5 and PM1 samples were evaluated in the 3T3 CTA in order to identify effects possibly associated with different aerodynamic diameters or airborne chemical components. The effects of the PM2.5 extracts on human health were assessed using whole-genome 44 K oligo-microarray slides. Statistical analysis with GeneSpring GX identified genes whose expression was modulated in response to the cell treatment. Modulated genes were then associated with pathways, biological processes and diseases through an extensive biological analysis. Data derived from in vitro methods and omics techniques could be valuable for monitoring exposure to toxicants, understanding modes of action via exposure-associated gene expression patterns, and highlighting the role of genes in key events related to adversity.

  1. Alternative Inspection Methods for Single Shell Tanks

    Energy Technology Data Exchange (ETDEWEB)

    Peters, Timothy J.; Alzheimer, James M.; Hurley, David E.

    2010-01-19

    This document was prepared to provide evaluations and recommendations regarding nondestructive evaluation methods that might be used to detect cracks and bowing in the ceilings of waste storage tanks at the Hanford site. The goal was to detect cracks as narrow as 1/16 in. in the ceiling, and bowing as small as 0.25 in. This report describes digital video camera methods that can be used to detect a crack in the ceiling of the dome, and methods for determining the surface topography of the ceiling in the waste storage tanks in order to detect localized movements of the surface. The study comprised a literature search combined with laboratory testing.

  2. PID Techniques: Alternatives to RICH Methods

    CERN Document Server

    Va'vra, Jerry

    2016-01-01

    In this review article we discuss new updates on PID techniques, other than the Cherenkov method. In particular, we discuss recent efforts to develop high resolution timing, placing an emphasis on small scale test results.

  3. Alternating Krylov subspace image restoration methods

    National Research Council Canada - National Science Library

    Abad, J.O; Morigi, S; Reichel, L; Sgallari, F

    2012-01-01

    … of the Krylov subspace used. However, our solution methods, suitably modified, can also be applied when no bound for the norm of the noise η^δ is known. We determine an approximation of the desired image û by so…

  4. Alternative Therapy of Animals – Homeopathy and Other Alternative Methods of Therapy

    Directory of Open Access Journals (Sweden)

    Løken Torleiv

    2002-03-01

    Full Text Available Alternative therapy of animals is described, in the sense of alternatives to the veterinary therapy traditionally accepted by veterinary faculties and schools and included in their curricula. Alternative therapy comprises different disciplines, of which homeopathy is emphasised in this presentation. Information is given on the use of, and interest in, such therapy among veterinarians and animal owners. Homeopathy, like other alternative therapies, may offer great advances, if it has any effect at all. Some of the disciplines are based on scientifically accepted documentation. Others, homeopathy in particular, lack such documentation of effect. The justification for including alternative therapy in the treatment of animals is discussed. Research into alternative therapy of animals is greatly needed, in particular to evaluate therapeutic methods that are in extensive use without any documented effect. An ongoing research project in Norway on the effect of homeopathic treatment of mastitis in cows is briefly presented.

  5. Proximal Alternating Direction Method with Relaxed Proximal Parameters for the Least Squares Covariance Adjustment Problem

    Directory of Open Access Journals (Sweden)

    Minghua Xu

    2014-01-01

    Full Text Available We consider the problem of seeking a symmetric positive semidefinite matrix in a closed convex set to approximate a given matrix. This problem may arise in several areas of numerical linear algebra, or it may come from the finance industry or statistics, and it thus has many applications. For solving this class of matrix optimization problems, many methods have been proposed in the literature. The proximal alternating direction method is one of those methods and can be easily applied to solve these matrix optimization problems. Generally, the proximal parameters of the proximal alternating direction method are greater than zero. In this paper, we conclude that the restriction on the proximal parameters can be relaxed for solving this kind of matrix optimization problem. Numerical experiments also show that the proximal alternating direction method with relaxed proximal parameters is convergent and generally performs better than the classical proximal alternating direction method.
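
    The core subproblem in such methods is the Frobenius-norm projection onto the positive semidefinite cone, which has a closed form via eigenvalue clipping. The sketch below shows that projection applied to an inconsistent "correlation" matrix; it illustrates the building block only, not the paper's proximal alternating direction method.

    ```python
    import numpy as np

    def nearest_psd(a):
        """Frobenius-norm projection of a symmetric matrix onto the PSD cone:
        symmetrize, then clip negative eigenvalues to zero."""
        sym = (a + a.T) / 2
        w, v = np.linalg.eigh(sym)
        return v @ np.diag(np.clip(w, 0, None)) @ v.T

    A = np.array([[ 1.0,  0.9, -0.9],
                  [ 0.9,  1.0,  0.9],
                  [-0.9,  0.9,  1.0]])   # inconsistent correlations, not PSD
    P = nearest_psd(A)
    print(np.linalg.eigvalsh(P).min() >= -1e-12)   # True: P is PSD
    ```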

  6. Statistical methods in longitudinal research principles and structuring change

    CERN Document Server

    von Eye, Alexander

    1991-01-01

    These edited volumes present new statistical methods in a way that bridges the gap between theoretical and applied statistics. The volumes cover general problems and issues and more specific topics concerning the structuring of change, the analysis of time series, and the analysis of categorical longitudinal data. The book targets students of development and change in a variety of fields - psychology, sociology, anthropology, education, medicine, psychiatry, economics, behavioural sciences, developmental psychology, ecology, plant physiology, and biometry - with basic training in statistics an

  7. Development and testing of improved statistical wind power forecasting methods.

    Energy Technology Data Exchange (ETDEWEB)

    Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J. (Decision and Information Sciences); (INESC Porto)

    2011-12-06

    Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that are used to convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty. Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios
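
    On the uncertainty-representation side, quantile forecasts are typically trained and scored with the pinball (quantile) loss rather than MSE. A minimal sketch with invented normalized power values follows; it illustrates the quantile route only, not the report's information theoretic learning criteria.

    ```python
    import numpy as np

    def pinball_loss(y, q_pred, tau):
        """Average quantile (pinball) loss; minimizing it over q_pred
        trains a tau-quantile forecaster."""
        diff = y - q_pred
        return np.mean(np.maximum(tau * diff, (tau - 1) * diff))

    y = np.array([0.42, 0.55, 0.38, 0.61])    # observed power (normalized)
    q90 = np.array([0.60, 0.70, 0.55, 0.75])  # hypothetical 90% quantile forecasts
    print(f"pinball loss at tau=0.9: {pinball_loss(y, q90, 0.9):.4f}")
    ```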

  8. HERBAL MEDICINE AMONG COMPLEMENTARY AND ALTERNATIVE MEDICINE METHODS

    OpenAIRE

    A. Ruban; Rodioniva, T.

    2012-01-01

    Alternative medicine methods may incorporate or base themselves on traditional medicine [1], folk knowledge [2], spiritual beliefs, or newly conceived approaches to healing. The major complementary and alternative medicine systems have many common characteristics, treating the whole person, including a focus on individualizing treatments, promoting self-care and self-healing, and recognizing the spiritual nature of each individual. Complementary and alternative medicine often lacks or has onl...

  9. The estimation of the measurement results with using statistical methods

    Science.gov (United States)

    Velychko, O.; Gordiyenko, T.

    2015-02-01

    A range of international standards and guides describes statistical methods that are applied to the management, control and improvement of processes, with the purpose of analysing technical measurement results. The paper analyses international standards and guides on statistical methods for the estimation of measurement results and gives recommendations for their application in laboratories. For the analysis of the standards and guides, cause-and-effect Ishikawa diagrams concerning the application of statistical methods to the estimation of measurement results are constructed.

  10. Alternative methods for characterization of extracellular vesicles

    Directory of Open Access Journals (Sweden)

    Fatemeh Momen-Heravi

    2012-09-01

    Full Text Available Extracellular vesicles are nano-sized vesicles released by all cells in vitro as well as in vivo. Their role has been implicated mainly in cell-cell communication, but also as disease biomarkers and, more recently, in gene delivery. They represent a snapshot of the cell status at the moment of release and carry bioreactive macromolecules such as nucleic acids, proteins and lipids. A major limitation in this emerging field is the availability and awareness of techniques to isolate and properly characterize extracellular vesicles. The lack of gold standards makes comparing different studies very difficult and may potentially hinder some extracellular vesicle-specific evidence. Characterization of extracellular vesicles has also recently seen many advances with the use of Nanoparticle Tracking Analysis (NTA), flow cytometry, cryo-EM instruments and proteomic technologies. In this review, we discuss the latest developments in translational technologies involving characterization methods, including the facts in their support and the challenges they face.

  11. Grade-Average Method: A Statistical Approach for Estimating Missing Value for Continuous Assessment Marks

    African Journals Online (AJOL)

    Grade-Average Method: A Statistical Approach for Estimating Missing Value for Continuous Assessment Marks. Published in the Journal of the Nigerian Association of Mathematical Physics.

  12. Methods of quantum field theory in statistical physics

    CERN Document Server

    Abrikosov, A A; Gorkov, L P; Silverman, Richard A

    1975-01-01

    This comprehensive introduction to many-body theory was written by three renowned physicists and acclaimed by American Scientist as "a classic text on field theoretic methods in statistical physics."

  13. Steganalytic method based on short and repeated sequence distance statistics

    Institute of Scientific and Technical Information of China (English)

    WANG GuoXin; PING XiJian; XU ManKun; ZHANG Tao; BAO XiRui

    2008-01-01

    According to the distribution characteristics of short and repeated sequences (SRS), a steganalytic method based on the correlation of image bit planes is proposed. Firstly, we introduce the concept of SRS distance statistics and derive their statistical distribution. Because the SRS distance statistics effectively reflect the correlation of a sequence, the SRS exhibit characteristic statistics when the bit-plane sequence length equals the image width. Using this characteristic, the steganalytic method is realized as a significance test based on the Poisson distribution. Experimental results show good performance in detecting LSB-matching steganography in still images. Moreover, the proposed method is not designed for a specific steganographic algorithm and therefore generalizes well.

  14. Longitudinal data analysis a handbook of modern statistical methods

    CERN Document Server

    Fitzmaurice, Garrett; Verbeke, Geert; Molenberghs, Geert

    2008-01-01

    Although many books currently available describe statistical models and methods for analyzing longitudinal data, they do not highlight connections between various research threads in the statistical literature. Responding to this void, Longitudinal Data Analysis provides a clear, comprehensive, and unified overview of state-of-the-art theory and applications. It also focuses on the assorted challenges that arise in analyzing longitudinal data. After discussing historical aspects, leading researchers explore four broad themes: parametric modeling, nonparametric and semiparametric methods, joint

  15. Complex Data Modeling and Computationally Intensive Statistical Methods

    CERN Document Server

    Mantovan, Pietro

    2010-01-01

    Recent years have seen the advent and development of many devices able to record and store an ever-increasing amount of complex and high-dimensional data: 3D images generated by medical scanners or satellite remote sensing, DNA microarrays, real-time financial data, system control datasets. The analysis of these data poses new challenging problems and requires the development of novel statistical models and computational methods, fueling many fascinating and fast-growing research areas of modern statistics. The book offers a wide variety of statistical methods and is addressed to statisticians …

  16. Method for statistical data analysis of multivariate observations

    CERN Document Server

    Gnanadesikan, R

    1997-01-01

    A practical guide to multivariate statistical techniques, now updated and revised. In recent years, innovations in computer technology and statistical methodologies have dramatically altered the landscape of multivariate data analysis. This new edition of Methods for Statistical Data Analysis of Multivariate Observations explores current multivariate concepts and techniques while retaining the same practical focus of its predecessor. It integrates methods and data-based interpretations relevant to multivariate analysis in a way that addresses real-world problems arising in many areas of interest …

  17. A Circular Statistical Method for Extracting Rotation Measures

    Indian Academy of Sciences (India)

    S. Sarala; Pankaj Jain

    2002-03-01

    We propose a new method for the extraction of Rotation Measures from spectral polarization data. The method is based on maximum likelihood analysis and takes into account the circular nature of the polarization data. The method is unbiased and statistically more efficient than the standard χ² procedure.

  18. Analysis of Miami Blue Sampling Method and Potential Alternatives

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — Alternatives are provided regarding methods to monitor the population of butterflies given limited staffing and what levels are needed to assess population trends.

  19. A quantitative method for evaluating alternatives. [aid to decision making

    Science.gov (United States)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional benefits are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
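
    A plain (non-hierarchical) weighted average of criterion scores already captures the basic idea; the hierarchical variant nests criteria into groups and averages level by level. The weights and scores below are invented purely for illustration.

    ```python
    # Minimal sketch of weighted-average evaluation of design alternatives;
    # criteria, weights and scores are hypothetical.
    weights = {"cost": 0.4, "performance": 0.35, "risk": 0.25}
    alternatives = {
        "design A": {"cost": 7, "performance": 8, "risk": 6},
        "design B": {"cost": 9, "performance": 6, "risk": 7},
    }
    for name, scores in alternatives.items():
        total = sum(weights[c] * scores[c] for c in weights)
        print(f"{name}: {total:.2f}")   # higher total = preferred alternative
    ```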

  20. Statistical Methods for Single-Particle Electron Cryomicroscopy

    DEFF Research Database (Denmark)

    Jensen, Katrine Hommelhoff

    … from the noisy, randomly oriented projection images. Many statistical approaches to SPR have been proposed in the past. Typically, due to the computational time complexity, they rely on an approximated maximum likelihood (ML) or maximum a posteriori (MAP) estimate of the structure. All methods presented … a MAP approach for estimating the protein structure. The resulting method is statistically optimal under the assumption of a uniform prior on the space of rotations. The marginal posterior is constructed by integrating over the view orientations and maximised by the expectation-maximisation (EM) … in this thesis attempt to solve a specific part of the reconstruction problem in a statistically sound manner. Firstly, we propose two methods for solving problems (1) and (2). They can ultimately be extended and combined into a statistically sound solution to the full SPR problem. We use Bayesian …

  1. [Evaluation of using statistical methods in selected national medical journals].

    Science.gov (United States)

    Sych, Z

    1996-01-01

    The paper evaluates the frequency with which statistical methods were applied in papers published in six selected national medical journals in the years 1988-1992. The following journals were chosen: Klinika Oczna, Medycyna Pracy, Pediatria Polska, Polski Tygodnik Lekarski, Roczniki Państwowego Zakładu Higieny and Zdrowie Publiczne. From the respective volumes of Pol. Tyg. Lek., a number of papers corresponding to the average in the remaining journals was randomly selected. The analysis did not include papers in which no statistical analysis was implemented, whether national or international; review papers, case reports, reviews of books, handbooks and monographs, reports from scientific congresses, and papers on historical topics were also excluded. The number of papers was determined in each volume. Next, the way a suitable sample was chosen in each study was analysed, distinguishing two categories: random and targeted selection. Attention was also paid to the presence of a control sample in the individual studies, and the characterization of the sample was recorded in three categories: complete, partial and lacking. An effort was made to present the results in tables and figures (Tab. 1, 3). The rate of employing statistical methods in the analysed papers in the relevant volumes of the six journals for 1988-1992 was determined, together with the number of papers in which no statistical methods were used. Concurrently, the frequency of applying individual statistical methods was analysed, with prominence given to fundamental methods of descriptive statistics (measures of position, measures of dispersion) as well as …

  2. A methodology to quantify the differences between alternative methods of heart rate variability measurement.

    Science.gov (United States)

    García-González, M A; Fernández-Chimeno, M; Guede-Fernández, F; Ferrer-Mileo, V; Argelagós-Palau, A; Álvarez-Gómez, L; Parrado, E; Moreno, J; Capdevila, L; Ramos-Castro, J

    2016-01-01

    This work proposes a systematic procedure to report the differences between heart rate variability time series obtained from alternative measurements reporting the spread and mean of the differences as well as the agreement between measuring procedures and quantifying how stationary, random and normal the differences between alternative measurements are. A description of the complete automatic procedure to obtain a differences time series (DTS) from two alternative methods, a proposal of a battery of statistical tests, and a set of statistical indicators to better describe the differences in RR interval estimation are also provided. Results show that the spread and agreement depend on the choice of alternative measurements and that the DTS cannot be considered generally as a white or as a normally distributed process. Nevertheless, in controlled measurements the DTS can be considered as a stationary process.
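
    A minimal sketch of the first two ingredients, the mean and spread of the differences time series plus one test from such a battery, is given below with invented RR intervals; the paper's full procedure (stationarity, randomness and normality checks, and agreement indices) is considerably more extensive.

    ```python
    import numpy as np
    from scipy import stats

    rr_a = np.array([812, 798, 805, 821, 830, 818, 809, 801])  # ms, method A
    rr_b = np.array([810, 801, 803, 824, 828, 815, 812, 799])  # ms, method B

    d = rr_a - rr_b                    # differences time series (DTS)
    bias, sd = d.mean(), d.std(ddof=1)
    print(f"bias = {bias:.1f} ms, 95% limits of agreement = "
          f"({bias - 1.96 * sd:.1f}, {bias + 1.96 * sd:.1f}) ms")
    print("normality p-value:", stats.shapiro(d).pvalue)  # one test of the battery
    ```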

  3. Analysis of Statistical Methods Currently used in Toxicology Journals.

    Science.gov (United States)

    Na, Jihye; Yang, Hyeri; Bae, SeungJin; Lim, Kyung-Min

    2014-09-01

    Statistical methods are frequently used in toxicology, yet it is not clear whether the methods employed by the studies are used consistently and conducted on sound statistical grounds. The purpose of this paper is to describe statistical methods used in top toxicology journals. More specifically, we sampled 30 papers published in 2014 from Toxicology and Applied Pharmacology, Archives of Toxicology, and Toxicological Science and described the methodologies used to provide descriptive and inferential statistics. One hundred thirteen endpoints were observed in those 30 papers, and most studies had sample sizes of less than 10, with the median and the mode being 6 and 3 & 6, respectively. The mean (105/113, 93%) was dominantly used to measure central tendency, and the standard error of the mean (64/113, 57%) and standard deviation (39/113, 34%) were used to measure dispersion, while few studies provided justification for why those methods were selected. Inferential statistics were frequently conducted (93/113, 82%), with one-way ANOVA being most popular (52/93, 56%), yet few studies conducted either normality or equal variance tests. These results suggest that more consistent and appropriate use of statistical methods is necessary, which may enhance the role of toxicology in public health.
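
    As a worked example of the practice the survey recommends, the sketch below runs the assumption checks (normality, equal variance) before the one-way ANOVA that dominated the sampled papers; the dose-group data are invented.

    ```python
    from scipy import stats

    control = [4.2, 3.9, 4.5, 4.1, 4.3, 4.0]
    low     = [4.8, 5.1, 4.6, 5.0, 4.9, 5.2]
    high    = [6.1, 5.8, 6.4, 6.0, 6.3, 5.9]
    groups = [control, low, high]

    # Assumption checks the surveyed papers rarely report
    print("normality p:", [round(stats.shapiro(g).pvalue, 3) for g in groups])
    print("equal variance p:", round(stats.levene(*groups).pvalue, 3))

    # One-way ANOVA, the most common inferential test in the survey
    f, p = stats.f_oneway(*groups)
    print(f"F = {f:.2f}, p = {p:.2g}")
    ```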

  4. Alternating Renewal Process Models for Behavioral Observation: Simulation Methods, Software, and Validity Illustrations

    Science.gov (United States)

    Pustejovsky, James E.; Runyon, Christopher

    2014-01-01

    Direct observation recording procedures produce reductive summary measurements of an underlying stream of behavior. Previous methodological studies of these recording procedures have employed simulation methods for generating random behavior streams, many of which amount to special cases of a statistical model known as the alternating renewal…
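
    A minimal sketch of generating such a random behavior stream, assuming exponentially distributed episode and inter-episode durations (the simplest alternating renewal case, not the full family the article studies):

    ```python
    import numpy as np

    def simulate_alternating_renewal(mu_on, mu_off, t_end, rng):
        """Simulate an on/off behavior stream with exponential episode and
        inter-episode durations; return (start, stop) times up to t_end."""
        t, events = 0.0, []
        while t < t_end:
            start = t + rng.exponential(mu_off)    # wait until behavior starts
            stop = start + rng.exponential(mu_on)  # behavior episode length
            events.append((start, min(stop, t_end)))
            t = stop
        return [(a, b) for a, b in events if a < t_end]

    rng = np.random.default_rng(42)
    events = simulate_alternating_renewal(mu_on=10, mu_off=30, t_end=600, rng=rng)
    prevalence = sum(b - a for a, b in events) / 600
    print(f"{len(events)} episodes, prevalence ~ {prevalence:.2f}")  # ~ 10/(10+30)
    ```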

  5. Oxygen Abundance Methods in SDSS: View from Modern Statistics

    Indian Academy of Sciences (India)

    Fei Shi; Gang Zhao; James Wicker

    2010-09-01

    Our purpose is to find which is the most reliable one among various oxygen abundance determination methods. We will test the validity of several different oxygen abundance determination methods using methods of modern statistics. These methods include Bayesian analysis and information scoring. We will analyze a sample of ∼ 6000 HII galaxies from the Sloan Digital Sky Survey (SDSS) spectroscopic observations data release four. All methods that we used drew the same conclusion that the method is a more reliable oxygen abundance determination method than the Bayesian metallicity method under the existing telescope ability. The ratios of the likelihoods between the different kinds of methods tell us that the , , and 32 methods are consistent with each other because the and 32 methods are calibrated by method. The Bayesian and 23 methods are consistent with each other because both are calibrated by a galaxy model. In either case, the 2 method is an unreliable method.

  6. Brief guidelines for methods and statistics in medical research

    CERN Document Server

    Ab Rahman, Jamalludin

    2015-01-01

    This book serves as a practical guide to methods and statistics in medical research. It includes step-by-step instructions on using SPSS software for statistical analysis, as well as relevant examples to help those readers who are new to research in health and medical fields. Simple texts and diagrams are provided to help explain the concepts covered, and print screens for the statistical steps and the SPSS outputs are provided, together with interpretations and examples of how to report on findings. Brief Guidelines for Methods and Statistics in Medical Research offers a valuable quick reference guide for healthcare students and practitioners conducting research in health related fields, written in an accessible style.

  7. Statistical Methods for Characterizing Variability in Stellar Spectra

    Science.gov (United States)

    Cisewski, Jessi; Yale Astrostatistics

    2017-01-01

    Recent years have seen a proliferation in the number of exoplanets discovered. One technique for uncovering exoplanets relies on the detection of subtle shifts in the stellar spectra due to the Doppler effect caused by an orbiting object. However, stellar activity can cause distortions in the spectra that mimic the imprint of an orbiting exoplanet. The collection of stellar spectra potentially contains more information than is traditionally used for estimating its radial velocity curve. I will discuss some statistical methods that can be used for characterizing the sources of variability in the spectra. Statistical assessment of stellar spectra is a focus of the Statistical and Applied Mathematical Sciences Institute (SAMSI)'s yearlong program on Statistical, Mathematical and Computational Methods for Astronomy's Working Group IV (Astrophysical Populations).

  8. Fundamentals of modern statistical methods substantially improving power and accuracy

    CERN Document Server

    Wilcox, Rand R

    2001-01-01

    Conventional statistical methods have a very serious flaw: they routinely miss differences among groups or associations among variables that are detected by more modern techniques, even under very small departures from normality. Hundreds of journal articles have described the reasons standard techniques can be unsatisfactory, but simple, intuitive explanations are generally unavailable. Improved methods have been derived, but they are far from obvious or intuitive based on the training most researchers receive. Situations arise where even highly nonsignificant results become significant when analyzed with more modern methods. Without assuming any prior training in statistics, Part I of this book describes basic statistical principles from a point of view that makes their shortcomings intuitive and easy to understand. The emphasis is on verbal and graphical descriptions of concepts. Part II describes modern methods that address the problems covered in Part I. Using data from actual studies, many examples are included …

  9. Complexity of software trustworthiness and its dynamical statistical analysis methods

    Institute of Scientific and Technical Information of China (English)

    ZHENG ZhiMing; MA ShiLong; LI Wei; JIANG Xin; WEI Wei; MA LiLi; TANG ShaoTing

    2009-01-01

    Developing trusted software has become an important trend and a natural choice in the development of software technology and applications. At present, methods for the measurement and assessment of software trustworthiness cannot guarantee safe and reliable operation of software systems completely and effectively. Based on the study of dynamical systems, this paper interprets the behavioral characteristics of software systems and the basic scientific problems of software trustworthiness complexity, analyzes the characteristics of that complexity, and proposes to study software trustworthiness measurement in terms of it. Using dynamical statistical analysis methods, the paper advances an invariant-measure-based assessment of software trustworthiness via statistical indices, and thereby provides a dynamical criterion for the untrustworthiness of software systems. By way of an example, the feasibility of the proposed dynamical statistical analysis method for software trustworthiness measurement is demonstrated using numerical simulations and theoretical analysis.

  10. Statistical Methods for Quantitatively Detecting Fungal Disease from Fruits’ Images

    OpenAIRE

    Jagadeesh D. Pujari; Yakkundimath, Rajesh Siddaramayya; Byadgi, Abdulmunaf Syedhusain

    2013-01-01

    In this paper we propose statistical methods for detecting fungal disease and classifying it by disease severity level. Most fruit diseases are caused by bacteria, fungi, viruses, etc., of which fungi are responsible for a large number of diseases in fruits. In this study, images of fruits affected by different fungal symptoms are collected and categorized based on disease severity. Statistical features such as block-wise features, the gray level co-occurrence matrix (GLCM), gray level run-length matr…

  11. Hierarchical modelling for the environmental sciences statistical methods and applications

    CERN Document Server

    Clark, James S

    2006-01-01

    New statistical tools are changing the way in which scientists analyze and interpret data and models. Hierarchical Bayes and Markov Chain Monte Carlo methods for analysis provide a consistent framework for inference and prediction where information is heterogeneous and uncertain, processes are complicated, and responses depend on scale. Nowhere are these methods more promising than in the environmental sciences.

  12. The Metropolis Monte Carlo Method in Statistical Physics

    Science.gov (United States)

    Landau, David P.

    2003-11-01

    A brief overview is given of some of the advances in statistical physics that have been made using the Metropolis Monte Carlo method. By complementing theory and experiment, these have increased our understanding of phase transitions and other phenomena in condensed matter systems. A brief description of a new method, commonly known as "Wang-Landau sampling," will also be presented.
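
    For reference, the Metropolis acceptance rule min(1, e^(-βΔE)) applied to single-spin flips of the 2D Ising model looks as follows; lattice size, coupling and sweep count are arbitrary illustrative choices.

    ```python
    import numpy as np

    # Minimal Metropolis sampler for the 2D Ising model, periodic boundaries
    rng = np.random.default_rng(0)
    L, beta, sweeps = 16, 0.5, 500     # lattice side, inverse temperature, sweeps
    s = rng.choice([-1, 1], size=(L, L))

    for _ in range(sweeps * L * L):
        i, j = rng.integers(L, size=2)
        # Energy change for flipping spin (i, j): dE = 2 * s_ij * sum(neighbors)
        nb = s[(i+1) % L, j] + s[(i-1) % L, j] + s[i, (j+1) % L] + s[i, (j-1) % L]
        dE = 2 * s[i, j] * nb
        if dE <= 0 or rng.random() < np.exp(-beta * dE):   # Metropolis rule
            s[i, j] *= -1

    print(f"mean magnetization |m| = {abs(s.mean()):.3f}")
    ```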

  13. Nucleic acid amplification: Alternative methods of polymerase chain reaction.

    Science.gov (United States)

    Fakruddin, Md; Mannan, Khanjada Shahnewaj Bin; Chowdhury, Abhijit; Mazumdar, Reaz Mohammad; Hossain, Md Nur; Islam, Sumaiya; Chowdhury, Md Alimuddin

    2013-10-01

    Nucleic acid amplification is a valuable molecular tool not only in basic research but also in application-oriented fields such as clinical medicine, infectious disease diagnosis, gene cloning and industrial quality control. A comprehensive review of the literature on the principles, applications, challenges and prospects of different alternative methods to the polymerase chain reaction (PCR) was performed. PCR was the first nucleic acid amplification method. With the advancement of research, a number of alternative nucleic acid amplification methods have been developed, such as loop-mediated isothermal amplification, nucleic acid sequence-based amplification, strand displacement amplification and multiple displacement amplification. Most of the alternative methods are isothermal, obviating the need for thermal cyclers. Though the principles of most of the alternative methods are more complex than that of PCR, they offer better applicability and sensitivity in cases where PCR has limitations. Most of the alternative methods still have to prove themselves through extensive validation studies and are not available in commercial form; nevertheless, they have the potential to serve as replacements for PCR. Continuous research is going on in different parts of the world to make these methods technically and economically viable.

  14. Descriptive and inferential statistical methods used in burns research.

    Science.gov (United States)

    Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars

    2010-05-01

    Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007; letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11 (22%) were randomised controlled trials, 18 (35%) were cohort studies, 11 (22%) were case control studies and 11 (22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49 (96%) articles. Data dispersion was calculated by standard deviation in 30 (59%), and standard error of the mean was quoted in 19 (37%). The statistical software product was named in 33 (65%). Of the 49 articles that used inferential statistics, the tests were named in 47 (96%). The six most common tests used (Student's t-test (53%), analysis of variance/covariance (33%), χ² test (27%), Wilcoxon and Mann-Whitney tests (22%), and Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A significance level was specified in 43 (88%) and exact significance levels were reported in 28 (57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. Advice should be sought from professionals …

  15. Alternate Location Method of a Robot Team in Unknown Environment

    Institute of Scientific and Technical Information of China (English)

    WANG Jian-zhong; LIU Jing-jing

    2008-01-01

    An alternate location method for a robot team is proposed. Three of the robots, not always the same ones, are kept still as beacon robots, while the others operate as mobile robots. The mobile robots alternately measure the distance between themselves and the three beacon robots with an ultrasonic measurement module. The distance data are combined with dead-reckoning information using an iterated extended Kalman filter (IEKF) to realize an optimal estimate of each robot's position. Subject to the condition that the future beacon robots' positions should be the desired ones, an objective function and nonlinear constraint equations are set up and used by a nonlinear optimization algorithm to estimate the positions of the future beacon robots. By alternately changing the robots' roles as active beacons, alternate location in an unknown environment can be realized. The process and results of a simulation test are given, and the position estimation error is within ±10 mm, which proves the validity of this method.

  16. Alternative methods in toxicity testing: the current approach

    OpenAIRE

    Araújo, Gabrielle Luck de; Campos, Maria Augusta Amaral; Valente, Maria Anete Santana; Silva, Sarah Cristina Teixeira; França, Flávia Dayrell; Chaves, Miriam Martins; Tagliati, Carlos Alberto

    2014-01-01

    Alternative methods are being developed to reduce, refine, and replace (3Rs) animals used in experiments, with the aim of protecting animal welfare. The present study reports alternative tests that are based on the principles of the 3Rs and the efforts made to validate them. In Europe, several methodologies have already been implemented, such as tests of irritability, cell viability, and phototoxicity, as well as in vitro mathematical models together with the use of in silico tools. This is a…

  17. Academic Training Lecture: Statistical Methods for Particle Physics

    CERN Multimedia

    PH Department

    2012-01-01

    2, 3, 4 and 5 April 2012 Academic Training Lecture  Regular Programme from 11:00 to 12:00 -  Bldg. 222-R-001 - Filtration Plant Statistical Methods for Particle Physics by Glen Cowan (Royal Holloway) The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena.  Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties.  The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.

  18. Three Methods for Occupation Coding Based on Statistical Learning

    Directory of Open Access Journals (Sweden)

    Gweon Hyukjun

    2017-03-01

    Full Text Available Occupation coding, an important task in official statistics, refers to coding a respondent's text answer into one of many hundreds of occupation codes. To date, occupation coding is still at least partially conducted manually, at great expense. We propose three methods for automatic coding: combining separate models for the detailed occupation codes and for aggregate occupation codes, a hybrid method that combines a duplicate-based approach with a statistical learning algorithm, and a modified nearest neighbor approach. Using data from the German General Social Survey (ALLBUS), we show that the proposed methods improve on both the coding accuracy of the underlying statistical learning algorithm and the coding accuracy of duplicates where duplicates exist. Further, we find that defining duplicates based on n-gram variables (a concept from text mining) is preferable to defining them based on exact string matches.
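
    The character-n-gram representation and the nearest-neighbor idea can be sketched with off-the-shelf tools; the job texts and codes below are invented, and this is not the authors' exact pipeline (which works on ALLBUS data and combines several models).

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline

    # Toy training data: free-text job answers -> occupation codes (invented)
    texts = ["software developer", "develops software", "primary school teacher",
             "teaches primary school", "truck driver", "drives delivery trucks"]
    codes = ["2512", "2512", "2341", "2341", "8332", "8332"]

    clf = make_pipeline(
        TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),  # char n-grams
        KNeighborsClassifier(n_neighbors=1),
    )
    clf.fit(texts, codes)
    print(clf.predict(["schoolteacher", "lorry driver"]))  # likely 2341, 8332
    ```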

  19. Alternative methods for top quark mass measurements at the CMS

    CERN Document Server

    Kim, Ji Hyun

    2016-01-01

    The top quark mass is a fundamental parameter of the standard model, and together with the W boson mass and the Higgs boson mass it provides a strong self-consistency check of the electroweak theory. Recently, several new measurements of the top quark mass using alternative observables and reconstruction methods have been performed by the CMS collaboration at the CERN LHC. Alternative methods can give additional insight by providing different systematic sensitivities, while the standard ones are currently limited by jet energy uncertainties. We present various results from new methods, including one using a charmed meson, which are found to be consistent with those obtained in standard measurements.

  20. Statistical Methods for Particle Physics (4/4)

    CERN Document Server

    CERN. Geneva

    2012-01-01

    The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena. Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties. The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.

  1. Statistical Methods for Particle Physics (2/4)

    CERN Document Server

    CERN. Geneva

    2012-01-01

    The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena. Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties. The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.

  2. Statistical Methods for Particle Physics (1/4)

    CERN Document Server

    CERN. Geneva

    2012-01-01

    The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena. Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties. The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.

  3. Statistical Methods for Particle Physics (3/4)

    CERN Document Server

    CERN. Geneva

    2012-01-01

    The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena. Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties. The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.

  4. Understanding common statistical methods, Part I: descriptive methods, probability, and continuous data.

    Science.gov (United States)

    Skinner, Carl G; Patel, Manish M; Thomas, Jerry D; Miller, Michael A

    2011-01-01

    Statistical methods are pervasive in medical research and general medical literature. Understanding general statistical concepts will enhance our ability to critically appraise the current literature and ultimately improve the delivery of patient care. This article intends to provide an overview of the common statistical methods relevant to medicine.

  5. A novel statistical method for classifying habitat generalists and specialists

    DEFF Research Database (Denmark)

    Chazdon, Robin L; Chao, Anne; Colwell, Robert K

    2011-01-01

    We develop a novel statistical approach for classifying generalists and specialists in two distinct habitats. Using a multinomial model based on estimated species relative abundance in two habitats, our method minimizes bias due to differences in sampling intensities between two habitat types...... as well as bias due to insufficient sampling within each habitat. The method permits a robust statistical classification of habitat specialists and generalists, without excluding rare species a priori. Based on a user-defined specialization threshold, the model classifies species into one of four groups...... fraction (57.7%) of bird species with statistical confidence. Based on a conservative specialization threshold and adjustment for multiple comparisons, 64.4% of tree species in the full sample were too rare to classify with confidence. Among the species classified, OG specialists constituted the largest...

  6. Urban Fire Risk Clustering Method Based on Fire Statistics

    Institute of Scientific and Technical Information of China (English)

    WU Lizhi; REN Aizhu

    2008-01-01

    Fire statistics and fire analysis have become important ways to understand the patterns of fire, prevent the occurrence of fire, and improve the ability to control fire. Based on existing fire statistics, a weighted fire risk calculation method characterized by the number of fire occurrences, direct economic losses, and fire casualties was put forward. On the basis of this method, and with an improved K-means clustering algorithm, this paper establishes a fire risk K-means clustering model, which better resolves the problem of automatically classifying fire risk. Fire risk clusters should be classified by the absolute distance to the target instead of the relative distance used in the traditional clustering algorithm. Finally, to apply the established model, this paper carries out fire risk clustering on fire statistics from January 2000 to December 2004 for Shenyang, China. This research provides technical support for urban fire management.
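
    A minimal sketch of the overall pipeline, assuming scikit-learn: a weighted risk score is formed from the three indicators named in the abstract and then clustered. The district data and indicator weights are invented for illustration; the paper's improved K-means variant is not reproduced here.

        import numpy as np
        from sklearn.cluster import KMeans

        # Toy district-level statistics: [fires per year, direct loss, casualties].
        stats = np.array([
            [120, 300, 2],
            [ 45,  80, 0],
            [200, 650, 5],
            [ 60, 150, 1],
            [ 15,  20, 0],
        ], dtype=float)

        # Illustrative weights for the three indicators (not the paper's values).
        weights = np.array([0.4, 0.4, 0.2])

        # Normalize each indicator to [0, 1], then form a weighted risk score.
        normalized = stats / stats.max(axis=0)
        risk = normalized @ weights

        # Cluster the scores into low / medium / high risk groups.
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(risk.reshape(-1, 1))
        print(dict(zip(risk.round(3), labels)))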

  7. Statistical methods with applications to demography and life insurance

    CERN Document Server

    Khmaladze, Estáte V

    2013-01-01

    Suitable for statisticians, mathematicians, actuaries, and students interested in the problems of insurance and analysis of lifetimes, Statistical Methods with Applications to Demography and Life Insurance presents contemporary statistical techniques for analyzing life distributions and life insurance problems. It not only contains traditional material but also incorporates new problems and techniques not discussed in existing actuarial literature. The book mainly focuses on the analysis of an individual life and describes statistical methods based on empirical and related processes. Coverage ranges from analyzing the tails of distributions of lifetimes to modeling population dynamics with migrations. To help readers understand the technical points, the text covers topics such as the Stieltjes, Wiener, and Itô integrals. It also introduces other themes of interest in demography, including mixtures of distributions, analysis of longevity and extreme value theory, and the age structure of a population. In addi...

  8. Landslide Susceptibility Statistical Methods: A Critical and Systematic Literature Review

    Science.gov (United States)

    Mihir, Monika; Malamud, Bruce; Rossi, Mauro; Reichenbach, Paola; Ardizzone, Francesca

    2014-05-01

    Landslide susceptibility assessment, the subject of this systematic review, is aimed at understanding the spatial probability of slope failures under a set of geomorphological and environmental conditions. It is estimated that about 375 landslides that occur globally each year are fatal, with around 4600 people killed per year. Past studies have highlighted the increasing cost of landslide damage, which can primarily be attributed to human occupation and increased human activity in vulnerable environments. Many scientists, to evaluate and reduce landslide risk, have made an effort to efficiently map landslide susceptibility using different statistical methods. In this paper, we carry out a critical and systematic review of the landslide susceptibility literature, in terms of the different statistical methods used. For each of a broad set of studies reviewed we note: (i) study geographic region and areal extent, (ii) landslide types, (iii) inventory type and temporal period covered, (iv) mapping technique, (v) thematic variables used, (vi) statistical models, (vii) assessment of model skill, (viii) uncertainty assessment methods, (ix) validation methods. We then pull out broad trends within our review of landslide susceptibility, particularly regarding the statistical methods. We find that the most common statistical methods used in the study of landslide susceptibility include logistic regression, artificial neural networks, discriminant analysis and weight of evidence. Although most of the studies we reviewed assessed model skill, very few assessed model uncertainty. In terms of geographic extent, the largest number of landslide susceptibility zonations were in Turkey, Korea, Spain, Italy and Malaysia. However, there are also many landslides and fatalities in other localities, particularly India, China, the Philippines, Nepal, Indonesia, Guatemala, and Pakistan, where there are far fewer landslide susceptibility studies available in the peer-reviewed literature. This

  9. Investigating salt frost scaling by using statistical methods

    DEFF Research Database (Denmark)

    Hasholt, Marianne Tange; Clemmensen, Line Katrine Harder

    2010-01-01

    A large data set comprising data for 118 concrete mixes on mix design, air void structure, and the outcome of freeze/thaw testing according to SS 13 72 44 has been analysed by use of statistical methods. The results show that with regard to mix composition, the most important parameter...

  10. Statistical methods for cosmological parameter selection and estimation

    CERN Document Server

    Liddle, Andrew R

    2009-01-01

    The estimation of cosmological parameters from precision observables is an important industry with crucial ramifications for particle physics. This article discusses the statistical methods presently used in cosmological data analysis, highlighting the main assumptions and uncertainties. The topics covered are parameter estimation, model selection, multi-model inference, and experimental design, all primarily from a Bayesian perspective.

  11. Kansas's forests, 2005: statistics, methods, and quality assurance

    Science.gov (United States)

    Patrick D. Miles; W. Keith Moser; Charles J. Barnett

    2011-01-01

    The first full annual inventory of Kansas's forests was completed in 2005 after 8,868 plots were selected and 468 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of Kansas inventory is presented...

  12. Optimization of statistical methods impact on quantitative proteomics data

    NARCIS (Netherlands)

    Pursiheimo, A.; Vehmas, A.P.; Afzal, S.; Suomi, T.; Chand, T.; Strauss, L.; Poutanen, M.; Rokka, A.; Corthals, G.L.; Elo, L.L.

    2015-01-01

    As tools for quantitative label-free mass spectrometry (MS) rapidly develop, a consensus about the best practices is not apparent. In the work described here we compared popular statistical methods for detecting differential protein expression from quantitative MS data using both controlled

  13. Application of statistical methods at copper wire manufacturing

    Directory of Open Access Journals (Sweden)

    Z. Hajduová

    2009-01-01

    Full Text Available Six Sigma is a method of management that strives for near perfection. The Six Sigma methodology uses data and rigorous statistical analysis to identify defects in a process or product, reduce variability and achieve as close to zero defects as possible. The paper presents the basic information on this methodology.

  14. Peer-Assisted Learning in Research Methods and Statistics

    Science.gov (United States)

    Stone, Anna; Meade, Claire; Watling, Rosamond

    2012-01-01

    Feedback from students on a Level 1 Research Methods and Statistics module, studied as a core part of a BSc Psychology programme, highlighted demand for additional tutorials to help them to understand basic concepts. Students in their final year of study commonly request work experience to enhance their employability. All students on the Level 1…

  15. Investigating salt frost scaling by using statistical methods

    DEFF Research Database (Denmark)

    Hasholt, Marianne Tange; Clemmensen, Line Katrine Harder

    2010-01-01

    A large data set comprising data for 118 concrete mixes on mix design, air void structure, and the outcome of freeze/thaw testing according to SS 13 72 44 has been analysed by use of statistical methods. The results show that with regard to mix composition, the most important parameter is the equ...

  16. Recent development on statistical methods for personalized medicine discovery.

    Science.gov (United States)

    Zhao, Yingqi; Zeng, Donglin

    2013-03-01

    It is well documented that patients can show significant heterogeneous responses to treatments so the best treatment strategies may require adaptation over individuals and time. Recently, a number of new statistical methods have been developed to tackle the important problem of estimating personalized treatment rules using single-stage or multiple-stage clinical data. In this paper, we provide an overview of these methods and list a number of challenges.

  17. Statistical inference methods for two crossing survival curves: a comparison of methods.

    Science.gov (United States)

    Li, Huimin; Han, Dong; Hou, Yawen; Chen, Huilin; Chen, Zheng

    2015-01-01

    A common problem that is encountered in medical applications is testing the overall homogeneity of survival distributions when two survival curves cross each other. A survey demonstrated that under this condition, which is an obvious violation of the assumption of proportional hazard rates, the log-rank test was still used in 70% of studies. Several statistical methods have been proposed to solve this problem. However, in many applications, it is difficult to specify the types of survival differences and choose an appropriate method prior to analysis. Thus, we conducted an extensive series of Monte Carlo simulations to investigate the power and type I error rate of these procedures under various patterns of crossing survival curves with different censoring rates and distribution parameters. Our objective was to evaluate the strengths and weaknesses of tests in different situations and for various censoring rates and to recommend an appropriate test that will not fail for a wide range of applications. Simulation studies demonstrated that adaptive Neyman's smooth tests and the two-stage procedure offer higher power and greater stability than other methods when the survival distributions cross at early, middle or late times. Even for proportional hazards, both methods maintain acceptable power compared with the log-rank test. In terms of the type I error rate, Rényi and Cramér-von Mises tests are relatively conservative, whereas the statistics of the Lin-Xu test exhibit apparent inflation as the censoring rate increases. Other tests produce results close to the nominal 0.05 level. In conclusion, adaptive Neyman's smooth tests and the two-stage procedure are found to be the most stable and feasible approaches for a variety of situations and censoring rates. Therefore, they are applicable to a wider spectrum of alternatives compared with other tests.
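
    The kind of Monte Carlo check described above can be sketched in a few lines. The following assumes the lifelines package; the crossing-hazards scenario (exponential vs. Weibull), sample sizes, and censoring time are illustrative choices, not the paper's settings.

        import numpy as np
        from lifelines.statistics import logrank_test

        rng = np.random.default_rng(0)
        n, reps, cens_time = 100, 500, 2.0
        rejections = 0

        for _ in range(reps):
            # Group A: constant hazard; Group B: increasing Weibull hazard,
            # so the hazard (and survival) curves cross during follow-up.
            t_a = rng.exponential(1.0, n)
            t_b = rng.weibull(2.0, n) * 1.3
            obs_a, obs_b = t_a < cens_time, t_b < cens_time
            res = logrank_test(np.minimum(t_a, cens_time), np.minimum(t_b, cens_time),
                               event_observed_A=obs_a, event_observed_B=obs_b)
            rejections += res.p_value < 0.05

        print(f"log-rank rejection rate under crossing hazards: {rejections / reps:.2f}")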

  18. A new statistical method for mapping QTLs underlying endosperm traits

    Institute of Scientific and Technical Information of China (English)

    HU Zhiqiu; XU Chenwu

    2005-01-01

    Genetic expression for an endosperm trait in seeds of cereal crops may be controlled simultaneously by the triploid endosperm genotypes and the diploid maternal genotypes. However, current statistical methods for mapping quantitative trait loci (QTLs) underlying endosperm traits have not been effective in dealing with the putative maternal genetic effects. Combining the quantitative genetic model for diploid maternal traits with triploid endosperm traits, here we propose a new statistical method for mapping QTLs controlling endosperm traits with maternal genetic effects. This method uses both the DNA marker genotypes of each plant in a segregating population and the quantitative observations of single endosperms from each plant to map QTLs. The maximum likelihood method, implemented via the expectation-maximization (EM) algorithm, was used to estimate the parameters of a putative QTL. Since this method involves the maternal effect that may contribute to endosperm traits, it is more congruent with the genetics of endosperm traits and may help increase the precision of QTL mapping. Simulation results show that the proposed method provides accurate estimates of QTL effects and locations with high statistical power.
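
    The paper's full likelihood involves triploid endosperm and diploid maternal genotypes; as a hedged illustration of the EM machinery only, the sketch below fits a two-component Gaussian mixture, the simplest mixture of QTL genotype classes. All numbers are synthetic.

        import numpy as np

        rng = np.random.default_rng(1)
        # Toy phenotypes: an equal mixture of two QTL genotype classes.
        y = np.concatenate([rng.normal(10.0, 1.0, 150), rng.normal(12.5, 1.0, 150)])

        # Starting values: mixing proportion, the two means, and a common variance.
        pi, mu1, mu2, var = 0.5, y.min(), y.max(), y.var()

        for _ in range(100):
            # E-step: posterior probability that each observation is in class 1.
            d1 = pi * np.exp(-(y - mu1) ** 2 / (2 * var))
            d2 = (1 - pi) * np.exp(-(y - mu2) ** 2 / (2 * var))
            w = d1 / (d1 + d2)
            # M-step: re-estimate the parameters from the weighted data.
            pi = w.mean()
            mu1 = np.average(y, weights=w)
            mu2 = np.average(y, weights=1 - w)
            var = np.mean(w * (y - mu1) ** 2 + (1 - w) * (y - mu2) ** 2)

        print(f"pi = {pi:.2f}, mu1 = {mu1:.2f}, mu2 = {mu2:.2f}")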

  19. Using statistical methods of quality management in logistics processes

    Directory of Open Access Journals (Sweden)

    Tkachenko Alla

    2016-04-01

    Full Text Available The purpose of the paper is to study the application of statistical methods of quality management to logistics processes at a large industrial enterprise and to test the theoretical findings. An analysis of the literature shows that a significant number of works by both Ukrainian and foreign authors have been dedicated to quality management, while statistical methods of quality management have been thoroughly analyzed by only a small number of researchers, since these methods are regarded as classical, that is, well known and not requiring special attention from modern scholars. In the authors' opinion, the logistics process is a process of transformation and movement of material and accompanying flows, ensuring freedom of management under conditions of sequential interdependencies, standardization, synchronization, information sharing, and consistency of incentives, using innovative methods and models. The study shows that the management of logistics processes should use such statistical methods of quality management as descriptive statistics, design of experiments, hypothesis testing, measurement analysis, process capability analysis, regression analysis, reliability analysis, sampling, modeling, statistical process control charts, statistical tolerance specification, and time series analysis. The proposed statistical methods for managing logistics process quality have been tested at the large industrial enterprise JSC "Dniepropetrovsk Aggregate Plant", which specializes in manufacturing hydraulic control valves. The findings suggest that the main goal in the sphere of logistics process quality is continuous improvement of the quality of mining equipment production through the use of innovative processes, advanced management systems, and information technology. This will enable the enterprise to meet the requirements and expectations of its customers. It has been proved that the

  20. An Alternative Surgical Method for Treatment of Osteoid Osteoma.

    Science.gov (United States)

    Gökalp, Mehmet Ata; Gözen, Abdurrahim; Ünsal, Seyyid Şerif; Önder, Haci; Güner, Savaş

    2016-02-22

    BACKGROUND An osteoid osteoma is a benign bone tumor. It can be treated with various conservative and surgical methods, but these have some risks and difficulties. The purpose of the present study was to present an alternative treatment method for osteoid osteoma and the results we obtained. MATERIAL AND METHODS In the period from 2010 to 2014, 10 patients with osteoid osteoma underwent nidus excision by using a safe alternative method in an operating room (OR) with no computed tomography (CT). The localization of the tumor was determined by use of a CT-guided Kirschner wire in the radiology unit; then, in the OR, the surgical intervention was performed without removing the Kirschner wire. RESULTS Following the alternative intervention, all the patients were completely relieved of pain. In the follow-up, no recurrence or complication occurred. CONCLUSIONS The presented alternative method for treating osteoid osteoma is an efficient and practical procedure for surgeons working in clinics that lack specialized equipment.

  1. An Alternative Surgical Method for Treatment of Osteoid Osteoma

    Science.gov (United States)

    Gökalp, Mehmet Ata; Gözen, Abdurrahim; Ünsal, Seyyid Şerif; Önder, Haci; Güner, Savaş

    2016-01-01

    Background An osteoid osteoma is a benign bone tumor. It can be treated with various conservative and surgical methods, but these have some risks and difficulties. The purpose of the present study was to present an alternative treatment method for osteoid osteoma and the results we obtained. Material/Methods In the period from 2010 to 2014, 10 patients with osteoid osteoma underwent nidus excision by using a safe alternative method in an operating room (OR) with no computed tomography (CT). The localization of the tumor was determined by use of a CT-guided Kirschner wire in the radiology unit; then, in the OR, the surgical intervention was performed without removing the Kirschner wire. Results Following the alternative intervention, all the patients were completely relieved of pain. In the follow-up, no recurrence or complication occurred. Conclusions The presented alternative method for treating osteoid osteoma is an efficient and practical procedure for surgeons working in clinics that lack specialized equipment. PMID:26898923

  2. Statistical Properties of Fluctuations: A Method to Check Market Behavior

    CERN Document Server

    Panigrahi, Prasanta K; Manimaran, P; Ahalpara, Dilip P

    2009-01-01

    We analyze the Bombay Stock Exchange (BSE) price index over the period of the last 12 years. Keeping in mind the large fluctuations in the last few years, we carefully identify the transient, non-statistical and locally structured variations. For that purpose, we make use of Daubechies wavelets and characterize the fractal behavior of the returns using a recently developed wavelet-based fluctuation analysis method. The returns show a fat-tailed distribution as well as weak non-statistical behavior. We have also carried out continuous wavelet and Fourier power spectral analyses to characterize the periodic nature and correlation properties of the time series.

  3. System and method for statistically monitoring and analyzing sensed conditions

    Science.gov (United States)

    Pebay, Philippe P.; Brandt, James M.; Gentile, Ann C.; Marzouk, Youssef M.; Hale, Darrian J.; Thompson, David C.

    2010-07-13

    A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.

  4. From Microphysics to Macrophysics Methods and Applications of Statistical Physics

    CERN Document Server

    Balian, Roger

    2007-01-01

    This text not only provides a thorough introduction to statistical physics and thermodynamics but also exhibits the universality of the chain of ideas that leads from the laws of microphysics to the macroscopic behaviour of matter. A wide range of applications teaches students how to make use of the concepts, and many exercises will help to deepen their understanding. Drawing on both quantum mechanics and classical physics, the book follows modern research in statistical physics. Volume I discusses in detail the probabilistic description of quantum or classical systems, the Boltzmann-Gibbs distributions, the conservation laws, and the interpretation of entropy as missing information. Thermodynamics and electromagnetism in matter are dealt with, as well as applications to gases, both dilute and condensed, and to phase transitions. Volume II applies statistical methods to systems governed by quantum effects, in particular to solid state physics, explaining properties due to the crystal structure or to the latti...

  5. Applied statistical methods in agriculture, health and life sciences

    CERN Document Server

    Lawal, Bayo

    2014-01-01

    This textbook teaches crucial statistical methods to answer research questions using a unique range of statistical software programs, including MINITAB and R. This textbook is developed for undergraduate students in agriculture, nursing, biology and biomedical research. Graduate students will also find it to be a useful way to refresh their statistics skills and to reference software options. The unique combination of examples is approached using MINITAB and R for their individual strengths. Subjects covered include among others data description, probability distributions, experimental design, regression analysis, randomized design and biological assay. Unlike other biostatistics textbooks, this text also includes outliers, influential observations in regression and an introduction to survival analysis. Material is taken from the author's extensive teaching and research in Africa, USA and the UK. Sample problems, references and electronic supplementary material accompany each chapter.

  6. Predicting recreational water quality advisories: A comparison of statistical methods

    Science.gov (United States)

    Brooks, Wesley R.; Corsi, Steven R.; Fienen, Michael N.; Carvin, Rebecca B.

    2016-01-01

    Epidemiological studies indicate that fecal indicator bacteria (FIB) in beach water are associated with illnesses among people having contact with the water. In order to mitigate public health impacts, many beaches are posted with an advisory when the concentration of FIB exceeds a beach action value. The most commonly used method of measuring FIB concentration takes 18–24 h before returning a result. In order to avoid the 24 h lag, it has become common to "nowcast" the FIB concentration using statistical regressions on environmental surrogate variables. Most commonly, nowcast models are estimated using ordinary least squares regression, but other regression methods from the statistical and machine learning literature are sometimes used. This study compares 14 regression methods across 7 Wisconsin beaches to identify which consistently produces the most accurate predictions. A random forest model is identified as the most accurate, followed by multiple regression fit using the adaptive LASSO.
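
    A toy version of such a model comparison, assuming scikit-learn (which offers plain cross-validated LASSO rather than the adaptive LASSO used in the study); the surrogate variables and response below are synthetic.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.linear_model import LassoCV, LinearRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(2)
        # Toy surrogates: turbidity, rainfall, wind speed, water temperature.
        X = rng.normal(size=(300, 4))
        # Log FIB concentration with a nonlinear rainfall effect plus noise.
        y = 0.8 * X[:, 0] + np.maximum(X[:, 1], 0) ** 2 + 0.3 * rng.normal(size=300)

        models = {
            "OLS": LinearRegression(),
            "LASSO (CV)": LassoCV(cv=5),
            "Random forest": RandomForestRegressor(n_estimators=200, random_state=0),
        }
        for name, model in models.items():
            score = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
            print(f"{name:14s} mean CV R^2 = {score:.2f}")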

  7. Statistical disclosure control for microdata methods and applications in R

    CERN Document Server

    Templ, Matthias

    2017-01-01

    This book on statistical disclosure control presents the theory, applications and software implementation of the traditional approach to (micro)data anonymization, including data perturbation methods, disclosure risk, data utility, information loss and methods for simulating synthetic data. Introducing readers to the R packages sdcMicro and simPop, the book also features numerous examples and exercises with solutions, as well as case studies with real-world data, accompanied by the underlying R code to allow readers to reproduce all results. The demand for and volume of data from surveys, registers or other sources containing sensitive information on persons or enterprises have increased significantly over the last several years. At the same time, privacy protection principles and regulations have imposed restrictions on the access and use of individual data. Proper and secure microdata dissemination calls for the application of statistical disclosure control methods to the data before release. This book is in...

  8. An alternative method for assessing early mortality in contemporary populations.

    Science.gov (United States)

    Wiley, A S; Pike, I L

    1998-11-01

    Biological anthropologists are interested in a population's early mortality rates for a variety of reasons. Early mortality (infant or juvenile) is of obvious importance to those interested in demography, but early mortality statistics are useful for life history analysis, paleodemography, and human adaptability studies, among others. In general, the form of mortality statistics is derived from demography, where chronological age is the gold standard for statistical calculation and comparison. However, there are numerous problems associated with the collection, analysis, and interpretation of early mortality statistics based on age, particularly for anthropological research, which is often conducted in small or non-calendrical-age numerate populations. The infant mortality rate (IMR), for example, is notoriously difficult to determine in populations where accurate accounting of age is not routine, and yet it is widely used in demography, public health, medicine, and social science research. Here we offer an alternative to age-based early mortality statistics that makes use of human biologists' interest in, and skill at, assessing human growth and development. Our proposal is to use developmental stages of juveniles instead of relying exclusively on age as the basis for mortality statistics. Death or survival according to a developmental stage (such as crawling or weaning) may provide more accurate data that are also more closely related to the cause of death. Developmental stages have the added advantage of putting infants and children back at the center of the discussion of early mortality by focusing on their activities in relation to their environment. A case study from the Turkana population of Kenya illustrates the use of developmental stages in describing early mortality.

  9. 27 CFR 24.22 - Alternate method or procedure.

    Science.gov (United States)

    2010-04-01

    Title 27, Alcohol, Tobacco Products and Firearms (2010-04-01 edition); Section 24.22, Alternate method or procedure. Alcohol and Tobacco Tax and Trade Bureau, Department of the Treasury; Liquors; Wine; Administrative and Miscellaneous Provisions...

  10. A fast alternating projection method for complex frequency estimation

    CERN Document Server

    Andersson, Fredrik; Ivert, Per-Anders

    2011-01-01

    The problem of approximating a sampled function using sums of a fixed number of complex exponentials is considered. We use alternating projections between fixed-rank matrices and Hankel matrices to obtain such an approximation. Convergence, convergence rates and error estimates for this technique are proven, and fast algorithms are developed. We compare the numerical results obtained with the MUSIC and ESPRIT methods.
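
    A bare-bones sketch of the alternating projection idea (in the spirit of Cadzow iterations), assuming NumPy/SciPy: project the signal's Hankel matrix onto rank-r matrices via a truncated SVD, then back onto Hankel matrices by anti-diagonal averaging. The paper's fast algorithms and error estimates are not reproduced.

        import numpy as np
        from scipy.linalg import hankel

        def rank_projection(H, r):
            """Project onto matrices of rank at most r via a truncated SVD."""
            U, s, Vt = np.linalg.svd(H, full_matrices=False)
            return (U[:, :r] * s[:r]) @ Vt[:r]

        def hankel_projection(H):
            """Project onto Hankel matrices by averaging each anti-diagonal."""
            m, n = H.shape
            f = np.zeros(m + n - 1, dtype=H.dtype)
            counts = np.zeros(m + n - 1)
            for i in range(m):
                for j in range(n):
                    f[i + j] += H[i, j]
                    counts[i + j] += 1
            f /= counts
            return hankel(f[:m], f[m - 1:])

        # Noisy sum of two complex exponentials, approximated with rank r = 2.
        n, r = 128, 2
        t = np.arange(n)
        rng = np.random.default_rng(3)
        signal = np.exp((-0.01 + 0.4j) * t) + np.exp((-0.02 + 1.1j) * t)
        noisy = signal + 0.1 * (rng.normal(size=n) + 1j * rng.normal(size=n))

        H = hankel(noisy[: n // 2], noisy[n // 2 - 1 :])
        for _ in range(30):
            H = hankel_projection(rank_projection(H, r))

        denoised = np.concatenate([H[:, 0], H[-1, 1:]])
        print("relative error:", np.linalg.norm(denoised - signal) / np.linalg.norm(signal))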

  11. Innovative Teaching Practice: Traditional and Alternative Methods (Challenges and Implications)

    Science.gov (United States)

    Nurutdinova, Aida R.; Perchatkina, Veronika G.; Zinatullina, Liliya M.; Zubkova, Guzel I.; Galeeva, Farida T.

    2016-01-01

    The relevance of the present issue is caused by the strong need for alternative methods of learning a foreign language and the need for language training and retraining of modern professionals. The aim of the article is to identify the basic techniques and skills in using various modern techniques in the context of modern educational tasks. The…

  12. 40 CFR 35.6315 - Alternative methods for obtaining property.

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment (2010-07-01 edition); Section 35.6315, Alternative methods for obtaining property. Environmental Protection Agency; Grants and Other Federal Assistance; State and Local Assistance; Cooperative Agreements and Superfund State Contracts for Superfund Response Actions...

  13. Alternative method of retesting UF{sub 6} cylinders

    Energy Technology Data Exchange (ETDEWEB)

    Christ, R. [Nuclear Cargo + Service GmbH, Hanau (Germany)]

    1991-12-31

    The paper describes an alternative method to perform the periodic inspection of UF{sub 6} cylinders. The hydraulic test is replaced by ultrasonic checking of wall thickness and by magnetic particle testing of all the weld seams. Information about the legal background, the air leak test and the qualification of inspectors is also given.

  14. Alternating Anderson-Richardson method: An efficient alternative to preconditioned Krylov methods for large, sparse linear systems

    CERN Document Server

    Suryanarayana, Phanish; Pask, John E

    2016-01-01

    We generalize the recently proposed Alternating Anderson-Jacobi (AAJ) method (Pratapa et al., J. Comput. Phys. (2016), 306, 43--54) to include preconditioning, and demonstrate its efficiency and scaling in the solution of large, sparse linear systems on parallel computers. The resulting preconditioned Alternating Anderson-Richardson (AAR) method reduces to the AAJ method for a particular choice of preconditioner. The AAR method employs Anderson extrapolation at periodic intervals within a preconditioned Richardson iteration to accelerate convergence. In this work, we develop a version of the method that is particularly well suited for scalable high-performance computing. In applications to Helmholtz and Poisson equations, we show that the strong and weak parallel scaling of AAR is superior to both Generalized Minimal Residual (GMRES) and Conjugate Gradient (CG) methods, using the same preconditioning, in large-scale parallel calculations employing up to 110,592 computational cores. Moreover, we find that the ...
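
    A hedged, unpreconditioned sketch of the idea, assuming NumPy: plain Richardson steps with an Anderson least-squares extrapolation every p-th iteration. The function name, parameters, and toy system are illustrative; the published AAR method adds preconditioning and careful parameter choices.

        import numpy as np

        def aar(A, b, omega, m=5, p=4, tol=1e-8, maxit=1000):
            """Richardson iteration with Anderson extrapolation every p-th step."""
            x = np.zeros(len(b))
            X, F = [], []                        # histories of iterates / residuals
            for k in range(maxit):
                f = b - A @ x                    # residual of the current iterate
                if np.linalg.norm(f) < tol * np.linalg.norm(b):
                    return x, k
                X.append(x.copy()); F.append(f.copy())
                X, F = X[-m:], F[-m:]            # keep a bounded history
                if (k + 1) % p == 0 and len(F) > 1:
                    # Anderson step: least-squares mix of recent residual differences.
                    dF = np.diff(np.array(F), axis=0).T
                    dX = np.diff(np.array(X), axis=0).T
                    gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
                    x = x + omega * f - (dX + omega * dF) @ gamma
                else:
                    x = x + omega * f            # plain Richardson step
            return x, maxit

        rng = np.random.default_rng(4)
        M = rng.normal(size=(50, 50))
        A = M @ M.T + 50 * np.eye(50)            # toy SPD system
        b = rng.normal(size=50)
        omega = 1.0 / np.linalg.eigvalsh(A).max()  # Richardson needs omega < 2 / lam_max
        x, iters = aar(A, b, omega)
        print("iterations:", iters, " residual norm:", np.linalg.norm(b - A @ x))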

  15. Statistical methods of SNP data analysis with applications

    CERN Document Server

    Bulinski, Alexander; Shashkin, Alexey; Yaskov, Pavel

    2011-01-01

    Various statistical methods important for genetic analysis are considered and developed. Namely, we concentrate on the multifactor dimensionality reduction, logic regression, random forests and stochastic gradient boosting. These methods and their new modifications, e.g., the MDR method with "independent rule", are used to study the risk of complex diseases such as cardiovascular ones. The roles of certain combinations of single nucleotide polymorphisms and external risk factors are examined. To perform the data analysis concerning the ischemic heart disease and myocardial infarction the supercomputer SKIF "Chebyshev" of the Lomonosov Moscow State University was employed.

  16. SOLVING PROBLEMS OF STATISTICS WITH THE METHODS OF INFORMATION THEORY

    Directory of Open Access Journals (Sweden)

    Lutsenko Y. V.

    2015-02-01

    Full Text Available The article presents a theoretical substantiation, numerical calculation methods, and a software implementation for solving problems of statistics, in particular the study of statistical distributions, with the methods of information theory. On the basis of empirical data, we have determined by calculation the number of observations used for the analysis of statistical distributions. The proposed method of calculating the amount of information is not based on assumptions about the independence of observations or the normality of the distribution, i.e., it is non-parametric and ensures correct modeling of nonlinear systems; it also allows comparable processing of heterogeneous data (measured on scales of different types, of numeric and non-numeric nature, and in different units). Thus, ASC-analysis and the "Eidos" system constitute a modern, ready-for-implementation technology for solving problems of statistics with the methods of information theory. This article can be used as a description of laboratory work in the following disciplines: intelligent systems; knowledge engineering and intelligent systems; intelligent technologies and knowledge representation; knowledge representation in intelligent systems; foundations of intelligent systems; introduction to neuromathematics and neural network methods; fundamentals of artificial intelligence; intelligent technologies in science and education; knowledge management; and automated system-cognitive analysis and the "Eidos" intelligent system, which the author is currently developing. It is also relevant to other disciplines associated with transforming data into information, transforming information into knowledge, and applying this knowledge to solve problems of identification, forecasting, decision making, and research of the simulated subject area (which covers virtually all subjects in all fields of science

  17. International Harmonization and Cooperation in the Validation of Alternative Methods.

    Science.gov (United States)

    Barroso, João; Ahn, Il Young; Caldeira, Cristiane; Carmichael, Paul L; Casey, Warren; Coecke, Sandra; Curren, Rodger; Desprez, Bertrand; Eskes, Chantra; Griesinger, Claudius; Guo, Jiabin; Hill, Erin; Roi, Annett Janusch; Kojima, Hajime; Li, Jin; Lim, Chae Hyung; Moura, Wlamir; Nishikawa, Akiyoshi; Park, HyeKyung; Peng, Shuangqing; Presgrave, Octavio; Singer, Tim; Sohn, Soo Jung; Westmoreland, Carl; Whelan, Maurice; Yang, Xingfen; Yang, Ying; Zuang, Valérie

    The development and validation of scientific alternatives to animal testing is important not only from an ethical perspective (implementation of 3Rs), but also to improve safety assessment decision making with the use of mechanistic information of higher relevance to humans. To be effective in these efforts, it is however imperative that validation centres, industry, regulatory bodies, academia and other interested parties ensure a strong international cooperation, cross-sector collaboration and intense communication in the design, execution, and peer review of validation studies. Such an approach is critical to achieve harmonized and more transparent approaches to method validation, peer-review and recommendation, which will ultimately expedite the international acceptance of valid alternative methods or strategies by regulatory authorities and their implementation and use by stakeholders. It also allows greater efficiency and effectiveness to be achieved by avoiding duplication of effort and leveraging limited resources. In view of achieving these goals, the International Cooperation on Alternative Test Methods (ICATM) was established in 2009 by validation centres from Europe, USA, Canada and Japan. ICATM was later joined by Korea in 2011 and currently also counts Brazil and China as observers. This chapter describes the existing differences across world regions and major efforts carried out for achieving consistent international cooperation and harmonization in the validation and adoption of alternative approaches to animal testing.

  18. Applied systems ecology: models, data, and statistical methods

    Energy Technology Data Exchange (ETDEWEB)

    Eberhardt, L L

    1976-01-01

    In this report, systems ecology is largely equated to mathematical or computer simulation modelling. The need for models in ecology stems from the necessity to have an integrative device for the diversity of ecological data, much of which is observational, rather than experimental, as well as from the present lack of a theoretical structure for ecology. Different objectives in applied studies require specialized methods. The best predictive devices may be regression equations, often non-linear in form, extracted from much more detailed models. A variety of statistical aspects of modelling, including sampling, are discussed. Several aspects of population dynamics and food-chain kinetics are described, and it is suggested that the two presently separated approaches should be combined into a single theoretical framework. It is concluded that future efforts in systems ecology should emphasize actual data and statistical methods, as well as modelling.

  19. Statistical methods for detecting differentially methylated loci and regions

    Directory of Open Access Journals (Sweden)

    Mark D Robinson

    2014-09-01

    Full Text Available DNA methylation, the reversible addition of methyl groups at CpG dinucleotides, represents an important regulatory layer associated with gene expression. Changed methylation status has been noted across diverse pathological states, including cancer. The rapid development and uptake of microarrays and large scale DNA sequencing has prompted an explosion of data analytic methods for processing and discovering changes in DNA methylation across varied data types. In this mini-review, we present a compact and accessible discussion of many of the salient challenges, such as experimental design, statistical methods for differential methylation detection, critical considerations such as cell type composition and the potential confounding that can arise from batch effects. From a statistical perspective, our main interests include the use of empirical Bayes or hierarchical models, which have proved immensely powerful in genomics, and the procedures by which false discovery control is achieved.

  20. Multivariate Statistical Process Control Process Monitoring Methods and Applications

    CERN Document Server

    Ge, Zhiqiang

    2013-01-01

      Given their key position in the process control industry, process monitoring techniques have been extensively investigated by industrial practitioners and academic control researchers. Multivariate statistical process control (MSPC) is one of the most popular data-based methods for process monitoring and is widely used in various industrial areas. Effective routines for process monitoring can help operators run industrial processes efficiently at the same time as maintaining high product quality. Multivariate Statistical Process Control reviews the developments and improvements that have been made to MSPC over the last decade, and goes on to propose a series of new MSPC-based approaches for complex process monitoring. These new methods are demonstrated in several case studies from the chemical, biological, and semiconductor industrial areas.   Control and process engineers, and academic researchers in the process monitoring, process control and fault detection and isolation (FDI) disciplines will be inter...

  1. Multivariate methods and forecasting with IBM SPSS statistics

    CERN Document Server

    Aljandali, Abdulkader

    2017-01-01

    This is the second of a two-part guide to quantitative analysis using the IBM SPSS Statistics software package; this volume focuses on multivariate statistical methods and advanced forecasting techniques. More often than not, regression models involve more than one independent variable. For example, forecasting methods are commonly applied to aggregates such as inflation rates, unemployment, exchange rates, etc., that have complex relationships with determining variables. This book introduces multivariate regression models and provides examples to help understand theory underpinning the model. The book presents the fundamentals of multivariate regression and then moves on to examine several related techniques that have application in business-orientated fields such as logistic and multinomial regression. Forecasting tools such as the Box-Jenkins approach to time series modeling are introduced, as well as exponential smoothing and naïve techniques. This part also covers hot topics such as Factor Analysis, Dis...

  2. Statistical Methods for Thermonuclear Reaction Rates and Nucleosynthesis Simulations

    CERN Document Server

    Iliadis, Christian; Coc, Alain; Timmes, F X; Champagne, Art E

    2014-01-01

    Rigorous statistical methods for estimating thermonuclear reaction rates and nucleosynthesis are becoming increasingly established in nuclear astrophysics. The main challenge being faced is that experimental reaction rates are highly complex quantities derived from a multitude of different measured nuclear parameters (e.g., astrophysical S-factors, resonance energies and strengths, particle and gamma-ray partial widths). We discuss the application of the Monte Carlo method to two distinct, but related, questions. First, given a set of measured nuclear parameters, how can one best estimate the resulting thermonuclear reaction rates and associated uncertainties? Second, given a set of appropriate reaction rates, how can one best estimate the abundances from nucleosynthesis (i.e., reaction network) calculations? The techniques described here provide probability density functions that can be used to derive statistically meaningful reaction rates and final abundances for any desired coverage probability. Examples ...
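
    For the first question, a toy Monte Carlo propagation for a single narrow resonance might look as follows. The parameter values are invented, only the proportional form of the rate is used, and the constant comes from Er/kT = 11.605 Er[MeV]/T9 for temperatures in GK.

        import numpy as np

        rng = np.random.default_rng(5)

        # Illustrative parameters for one narrow resonance: strength wg with a
        # lognormal (factor) uncertainty, resonance energy Er with a Gaussian one.
        wg_median, wg_factor = 1.0e-9, 1.3
        Er_mean, Er_sigma = 0.2, 0.01          # MeV
        T9 = 0.5                               # temperature in GK

        n = 10_000
        wg = wg_median * np.exp(np.log(wg_factor) * rng.normal(size=n))
        Er = rng.normal(Er_mean, Er_sigma, size=n)

        # Narrow-resonance rate (up to constants): wg * exp(-Er / kT).
        rate = wg * np.exp(-11.605 * Er / T9)

        lo, med, hi = np.percentile(rate, [16, 50, 84])
        print(f"median rate {med:.3e}, 68% interval [{lo:.3e}, {hi:.3e}] (arb. units)")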

  3. Statistical methods for longitudinal data with agricultural applications

    DEFF Research Database (Denmark)

    Anantharama Ankinakatte, Smitha

    The PhD study focuses on modeling two kinds of longitudinal data arising in agricultural applications: continuous time series data and discrete longitudinal data. Firstly, two statistical methods, neural networks and generalized additive models, are applied to predict mastitis using multivariate...... algorithm. This was found to compare favourably with the algorithm implemented in the well-known Beagle software. Finally, an R package to apply APFA models, developed as part of the PhD project, is described...

  4. Diametral creep prediction of pressure tube using statistical regression methods

    Energy Technology Data Exchange (ETDEWEB)

    Kim, D. [Korea Advanced Inst. of Science and Technology, Daejeon (Korea, Republic of); Lee, J.Y. [Korea Electric Power Research Inst., Daejeon (Korea, Republic of); Na, M.G. [Chosun Univ., Gwangju (Korea, Republic of); Jang, C. [Korea Advanced Inst. of Science and Technology, Daejeon (Korea, Republic of)

    2010-07-01

    Diametral creep prediction of pressure tubes in CANDU reactors is an important factor in ROPT calculations. In this study, pressure tube diametral creep prediction models were developed using statistical regression methods, such as the linear mixed model for longitudinal data analysis. Inspection and operating condition data from Wolsong units 1 and 2 were used. Serial correlation and random coefficient models were developed for pressure tube diameter prediction. The random coefficient model provided more accurate results than the serial correlation model. (author)
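
    A minimal random coefficient model of the sort described can be fitted with statsmodels; in this sketch the data are synthetic, and the covariate name efpy (effective full-power years) is an assumption, not the study's variable.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(6)
        # Toy longitudinal data: diameter growth of 20 tubes inspected over time,
        # each tube having its own random intercept and creep-rate slope.
        times = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
        rows = []
        for tube in range(20):
            slope = 0.05 + rng.normal(0, 0.01)        # tube-specific creep rate
            intercept = 103.4 + rng.normal(0, 0.05)   # nominal inner diameter (mm)
            for t in times:
                rows.append((tube, t, intercept + slope * t + rng.normal(0, 0.02)))
        df = pd.DataFrame(rows, columns=["tube", "efpy", "diameter"])

        # Random coefficient model: fixed overall trend, random intercept and
        # slope per tube.
        model = smf.mixedlm("diameter ~ efpy", df, groups=df["tube"], re_formula="~efpy")
        fit = model.fit()
        print(fit.summary())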

  5. Application of Method of Multicriteria Alternatives for Land Assessment

    Directory of Open Access Journals (Sweden)

    Pavel V. Grigorev

    2017-06-01

    Full Text Available This article discusses the use of the multicriteria alternatives method for the assessment of a real estate object, taking into account the system of standards, rules, and requirements in the field of valuation activities, including international valuation standards. The main work items and the costs associated with the allotment and development of the built-up area are indicated. In the study, four sites are assessed with respect to three parameters: the driving distance from the construction site to the city center; the cost of 1 ha of land for each of the plots; and the deterioration of the centralized heat supply networks. The results show that the method of multicriteria alternatives is objective and optimal when comparing land sites on criteria with different units of measurement. The advantage of this method is the possibility of applying it to valuation in different areas of the economy.
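
    A simple normalized weighted-scoring sketch of such a comparison, assuming NumPy; the site values, units, and criterion weights are invented, and the paper's exact multicriteria procedure may differ.

        import numpy as np

        # Toy data for four candidate sites; columns (with assumed units):
        # distance to center (km), land cost (million per ha), heat-network wear (%).
        sites = np.array([
            [12.0, 3.5, 40.0],
            [ 8.0, 5.0, 25.0],
            [20.0, 2.0, 60.0],
            [15.0, 2.8, 35.0],
        ])
        weights = np.array([0.3, 0.4, 0.3])   # illustrative criterion weights

        # All three criteria are "lower is better": normalize each column to [0, 1]
        # so criteria with different units become comparable, then invert.
        mins, maxs = sites.min(axis=0), sites.max(axis=0)
        scores = 1.0 - (sites - mins) / (maxs - mins)

        overall = scores @ weights
        print("scores:", overall.round(3), " best site:", int(overall.argmax()) + 1)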

  6. Statistical method for detecting structural change in the growth process.

    Science.gov (United States)

    Ninomiya, Yoshiyuki; Yoshimoto, Atsushi

    2008-03-01

    Due to competition among individual trees and other exogenous factors that change the growth environment, each tree grows following its own growth trend with some structural changes in growth over time. In the present article, a new method is proposed to detect a structural change in the growth process. We formulate the method as a simple statistical test for signal detection without constructing any specific model for the structural change. To evaluate the p-value of the test, the tube method is developed because the regular distribution theory is insufficient. Using two sets of tree diameter growth data sampled from planted forest stands of Cryptomeria japonica in Japan, we conduct an analysis of identifying the effect of thinning on the growth process as a structural change. Our results demonstrate that the proposed method is useful to identify the structural change caused by thinning. We also provide the properties of the method in terms of the size and power of the test.

  7. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  8. Alternative methods for the control of postharvest citrus diseases.

    Science.gov (United States)

    Talibi, I; Boubaker, H; Boudyach, E H; Ait Ben Aoumar, A

    2014-07-01

    The postharvest diseases of citrus fruit cause considerable losses during storage and transportation. These diseases are managed principally by the application of synthetic fungicides. However, the increasing concern for health hazards and environmental pollution due to chemical use has required the development of alternative strategies for the control of postharvest citrus diseases. Management of postharvest diseases using microbial antagonists, natural plant-derived products and Generally Recognized As Safe compounds has been demonstrated to be most suitable to replace the synthetic fungicides, which are either being banned or recommended for limited use. However, application of these alternatives by themselves may not always provide a commercially acceptable level of control of postharvest citrus diseases comparable to that obtained with synthetic fungicides. To provide more effective disease control, a multifaceted approach based on the combination of different postharvest treatments has been adopted. Actually, despite the distinctive features of these alternative methods, several reasons hinder the commercial use of such treatments. Consequently, research should emphasize the development of appropriate tools to effectively implement these alternative methods to commercial citrus production.

  9. AN ALTERNATIVE GREEN SCREEN KEYING METHOD FOR FILM VISUAL EFFECTS

    Directory of Open Access Journals (Sweden)

    Jin Zhi

    2015-04-01

    Full Text Available This study focuses on a green screen keying method developed especially for film visual effects. There are a number of ways of using existing tools to create mattes from green or blue screen plates. However, it is still a time-consuming process, and the results vary, especially when it comes to retaining tiny details such as hair and fur. This paper introduces an alternative concept and method for retaining the edge details of characters on a green screen plate, and a number of related mathematical equations are explored. At the end of this study, a simplified process for applying this method in real productions is also tested.

  10. Literature in Focus: Statistical Methods in Experimental Physics

    CERN Multimedia

    2007-01-01

    Frederick James was a high-energy physicist who became the CERN "expert" on statistics and is now well-known around the world, in part for this famous text. The first edition of Statistical Methods in Experimental Physics was originally co-written with four other authors and was published in 1971 by North Holland (now an imprint of Elsevier). It became such an important text that demand for it has continued for more than 30 years. Fred has updated it and it was released in a second edition by World Scientific in 2006. It is still a top seller and there is no exaggeration in calling it «the» reference on the subject. A full review of the title appeared in the October CERN Courier.Come and meet the author to hear more about how this book has flourished during its 35-year lifetime. Frederick James Statistical Methods in Experimental Physics Monday, 26th of November, 4 p.m. Council Chamber (Bldg. 503-1-001) The author will be introduced...

  11. Testing alternative ground water models using cross-validation and other methods

    Science.gov (United States)

    Foglia, L.; Mehl, S.W.; Hill, M.C.; Perona, P.; Burlando, P.

    2007-01-01

    Many methods can be used to test alternative ground water models. Of concern in this work are methods able to (1) rank alternative models (also called model discrimination) and (2) identify observations important to parameter estimates and predictions (equivalent to the purpose served by some types of sensitivity analysis). Some of the measures investigated are computationally efficient; others are computationally demanding. The latter are generally needed to account for model nonlinearity. The efficient model discrimination methods investigated include the information criteria: the corrected Akaike information criterion, Bayesian information criterion, and generalized cross-validation. The efficient sensitivity analysis measures used are dimensionless scaled sensitivity (DSS), composite scaled sensitivity, and parameter correlation coefficient (PCC); the other statistics are DFBETAS, Cook's D, and observation-prediction statistic. Acronyms are explained in the introduction. Cross-validation (CV) is a computationally intensive nonlinear method that is used for both model discrimination and sensitivity analysis. The methods are tested using up to five alternative parsimoniously constructed models of the ground water system of the Maggia Valley in southern Switzerland. The alternative models differ in their representation of hydraulic conductivity. A new method for graphically representing CV and sensitivity analysis results for complex models is presented and used to evaluate the utility of the efficient statistics. The results indicate that for model selection, the information criteria produce similar results at much smaller computational cost than CV. For identifying important observations, the only obviously inferior linear measure is DSS; the poor performance was expected because DSS does not include the effects of parameter correlation and PCC reveals large parameter correlations. ?? 2007 National Ground Water Association.
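
    For the information criteria, the least-squares forms are easy to compute once each model's sum of squared errors is known. The sketch below assumes Gaussian errors; the model names and numbers are invented for illustration.

        import numpy as np

        def information_criteria(sse, n, k):
            """AICc and BIC for a least-squares model: k parameters, n observations."""
            aic = n * np.log(sse / n) + 2 * k
            aicc = aic + 2 * k * (k + 1) / (n - k - 1)
            bic = n * np.log(sse / n) + k * np.log(n)
            return aicc, bic

        # Toy comparison: three alternative models of one data set, differing in
        # fit quality (sse) and number of parameters (k).
        n = 60
        candidates = {"homogeneous K": (4.8, 2), "two-zone K": (3.1, 4),
                      "five-zone K": (2.9, 8)}
        for name, (sse, k) in candidates.items():
            aicc, bic = information_criteria(sse, n, k)
            print(f"{name:14s} AICc = {aicc:7.2f}   BIC = {bic:7.2f}")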

  12. Fragment Identification and Statistics Method of Hypervelocity Impact SPH Simulation

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xiaotian; JIA Guanghui; HUANG Hai

    2011-01-01

    A comprehensive treatment of fragment identification and statistics for smoothed particle hydrodynamics (SPH) simulations of hypervelocity impact is presented. The computation is performed with the SPH method combined with the finite element method (FEM). Fragments are identified by a new pre- and post-processing algorithm and then converted into a binary graph. The number of fragments and their attached SPH particles are determined by counting the connected domains in the binary graph. The size, velocity vector, and mass of each fragment are calculated by summation and weighted averaging over the particles. The dependence of this method on finite element edge length and simulation end time is discussed. An example of tungsten rods impacting steel plates is given for calibration. The computed results match experiments well and demonstrate the effectiveness of this method.
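
    The counting step, identifying connected domains in a binary structure, can be illustrated with SciPy's generic connected-component labeling on a toy occupancy grid; the paper's algorithm operates on SPH particle adjacency rather than a pixel grid.

        import numpy as np
        from scipy import ndimage

        # Toy binary grid standing in for the "binary graph" of SPH particles:
        # 1 marks a cell containing material, 0 marks empty space.
        grid = np.array([
            [1, 1, 0, 0, 0, 1],
            [1, 0, 0, 1, 0, 1],
            [0, 0, 1, 1, 0, 0],
            [0, 0, 1, 0, 0, 1],
        ])

        labels, n_fragments = ndimage.label(grid)   # 4-connectivity by default
        sizes = ndimage.sum(grid, labels, index=range(1, n_fragments + 1))

        print("fragments:", n_fragments)
        print("particles per fragment:", sizes.astype(int))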

  13. Quantitative EEG Applying the Statistical Recognition Pattern Method

    DEFF Research Database (Denmark)

    Engedal, Knut; Snaedal, Jon; Hoegh, Peter

    2015-01-01

    BACKGROUND/AIM: The aim of this study was to examine the discriminatory power of quantitative EEG (qEEG) applying the statistical pattern recognition (SPR) method to separate Alzheimer's disease (AD) patients from elderly individuals without dementia and from other dementia patients. METHODS...... accepted criteria by at least 2 clinicians. EEGs were recorded in a standardized way and analyzed independently of the clinical diagnoses, using the SPR method. RESULTS: In receiver operating characteristic curve analyses, the qEEGs separated AD patients from healthy elderly individuals with an area under...... the curve (AUC) of 0.90, representing a sensitivity of 84% and a specificity of 81%. The qEEGs further separated patients with Lewy body dementia or Parkinson's disease dementia from AD patients with an AUC of 0.9, a sensitivity of 85% and a specificity of 87%. CONCLUSION: qEEG using the SPR method could...

  14. A review of statistical methods for preprocessing oligonucleotide microarrays.

    Science.gov (United States)

    Wu, Zhijin

    2009-12-01

    Microarrays have become an indispensable tool in biomedical research. This powerful technology not only makes it possible to quantify a large number of nucleic acid molecules simultaneously, but also produces data with many sources of noise. A number of preprocessing steps are therefore necessary to convert the raw data, usually in the form of hybridisation images, to measures of biological meaning that can be used in further statistical analysis. Preprocessing of oligonucleotide arrays includes image processing, background adjustment, data normalisation/transformation and sometimes summarisation when multiple probes are used to target one genomic unit. In this article, we review the issues encountered in each preprocessing step and introduce the statistical models and methods in preprocessing.
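
    As a concrete example of the normalisation step, the sketch below applies quantile normalisation, one common choice, to a small probes-by-arrays matrix; the data and function are illustrative, and real pipelines add background adjustment and summarisation around it.

        import numpy as np

        def quantile_normalize(x):
            """Quantile normalisation of a probes-by-arrays matrix (rows =
            probes, columns = arrays). A textbook sketch; ties are handled
            crudely and production implementations are more careful."""
            ranks = np.argsort(np.argsort(x, axis=0), axis=0)  # per-array ranks
            mean_quantiles = np.sort(x, axis=0).mean(axis=1)   # reference distribution
            return mean_quantiles[ranks]

        x = np.array([[5.0, 4.0, 3.0],
                      [2.0, 1.0, 4.0],
                      [3.0, 4.0, 6.0],
                      [4.0, 2.0, 8.0]])
        print(quantile_normalize(x))  # every array now shares one distribution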

  15. Top-quark mass measurements at the LHC: alternative methods

    CERN Document Server

    Vos, Marcel

    2016-01-01

    Alternative top quark mass determinations can provide inputs to the world average with orthogonal systematic uncertainties and may help to refine the interpretation of the standard method. Among a number of recent results I focus on the extractions by ATLAS and CMS of the top quark pole mass from the top quark pair and tt + 1 jet production cross-section, which have now reached a precision of 1%.

  17. ALTERNATING DIRECTION FINITE ELEMENT METHOD FOR SOME REACTION DIFFUSION MODELS

    Institute of Scientific and Technical Information of China (English)

    江成顺; 刘蕴贤; 沈永明

    2004-01-01

    This paper is concerned with some nonlinear reaction-diffusion models. To solve this kind of model, the modified Laplace finite element scheme and the alternating direction finite element scheme are established for the system of partial differential equations. In addition, the finite difference method is utilized for the ordinary differential equation in the models. Moreover, by the theory and techniques of a priori estimates for differential equations, the convergence analyses and the optimal L2-norm error estimates are demonstrated.

  18. Integrated Parasite Management for Livestock - Alternative control methods

    Directory of Open Access Journals (Sweden)

    Souvik Paul

    Full Text Available Internal parasites are considered by some to be one of the most economically important constraints in raising livestock. The growing concern about the resistance of internal parasites to all classes of dewormers has caused people to look for alternatives. As dewormers lose their effectiveness, the livestock community fears increasing economic losses from worms. There is no one thing that can be given or done to replace chemical dewormers. It will take a combination of extremely good management techniques and possibly some alternative therapies. It is not wise to think that one can just stop deworming animals with chemical dewormers. It is something one will need to change gradually, observing and testing animals and soil, in order to monitor the progress. Alternative parasite control is an area that is receiving a lot of interest and attention. Programs and research will continue in the pursuit of parasite control, using alternative and more management-intensive methods. [Veterinary World 2010; 3(9): 431-435]

  19. Surveying immigrants without sampling frames - evaluating the success of alternative field methods.

    Science.gov (United States)

    Reichel, David; Morales, Laura

    2017-01-01

    This paper evaluates the sampling methods of an international survey, the Immigrant Citizens Survey, which aimed at surveying immigrants from outside the European Union (EU) in 15 cities in seven EU countries. In five countries, no sample frame was available for the target population. Consequently, alternative ways to obtain a representative sample had to be found. In three countries 'location sampling' was employed, while in two countries traditional methods were used with adaptations to reach the target population. The paper assesses the main methodological challenges of carrying out a survey among a group of immigrants for whom no sampling frame exists. The samples of the survey in these five countries are compared to results of official statistics in order to assess the accuracy of the samples obtained through the different sampling methods. It can be shown that alternative sampling methods can provide meaningful results in terms of core demographic characteristics although some estimates differ to some extent from the census results.

  20. Evolutionary Computation Methods and their applications in Statistics

    Directory of Open Access Journals (Sweden)

    Francesco Battaglia

    2013-05-01

    Full Text Available A brief discussion of the genesis of evolutionary computation methods, their relationship to artificial intelligence, and the contribution of genetics and Darwin’s theory of natural evolution is provided. Then, the main evolutionary computation methods are illustrated: evolution strategies, genetic algorithms, estimation of distribution algorithms, differential evolution, and a brief description of some evolutionary behaviour methods such as ant colony and particle swarm optimization. We also discuss the role of the genetic algorithm in random generation from multivariate probability distributions, rather than as a function optimizer. Finally, some relevant applications of genetic algorithms to statistical problems are reviewed: selection of variables in regression, time series model building, outlier identification, cluster analysis, design of experiments.
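
    To make the first of those applications concrete, the sketch below runs a toy genetic algorithm for variable selection in regression, with BIC as the fitness; the data, population size and operator settings are illustrative assumptions, not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        # Toy data: 10 candidate regressors, only 3 truly in the model.
        n, p = 200, 10
        X = rng.normal(size=(n, p))
        y = X[:, [1, 4, 7]] @ [2.0, -1.5, 1.0] + rng.normal(size=n)

        def bic(mask):
            """Fitness: BIC of the OLS fit on the variables selected by mask."""
            k = int(mask.sum())
            if k == 0:
                return np.inf
            beta, *_ = np.linalg.lstsq(X[:, mask], y, rcond=None)
            sse = ((y - X[:, mask] @ beta) ** 2).sum()
            return n * np.log(sse / n) + k * np.log(n)

        pop = rng.random((30, p)) < 0.5                 # random initial chromosomes
        for generation in range(50):
            fitness = np.array([bic(m) for m in pop])
            parents = pop[np.argsort(fitness)[:10]]     # selection: keep the fittest
            children = []
            for _ in range(len(pop) - len(parents)):
                a, b = parents[rng.integers(10, size=2)]
                cut = rng.integers(1, p)                # one-point crossover
                child = np.concatenate([a[:cut], b[cut:]])
                children.append(child ^ (rng.random(p) < 0.05))  # mutation
            pop = np.vstack([parents] + children)

        best = pop[np.argmin([bic(m) for m in pop])]
        print(np.flatnonzero(best))   # indices of the selected variables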

  1. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  2. Bayesian statistical methods and their application in probabilistic simulation models

    Directory of Open Access Journals (Sweden)

    Sergio Iannazzo

    2007-03-01

    Full Text Available Bayesian statistical methods are facing a rapidly growing level of interest and acceptance in the field of health economics. The reasons for this success are probably to be found in the theoretical foundations of the discipline, which make these techniques more appealing to decision analysis. To this should be added modern IT progress, which has produced flexible and powerful statistical software frameworks. Among them, probably one of the most notable is the BUGS language project and its standalone application for MS Windows, WinBUGS. The scope of this paper is to introduce the subject and to show some interesting applications of WinBUGS in developing complex economic models based on Markov chains. The advantages of this approach reside in the elegance of the code produced and in its capability to easily develop probabilistic simulations. Moreover, an example of the integration of Bayesian inference models in a Markov model is shown. This last feature lets the analyst conduct statistical analyses on the available sources of evidence and exploit them directly as inputs to the economic model.
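
    For readers unfamiliar with the model class, the sketch below steps a hypothetical three-state Markov cohort model through 20 cycles; the states, transition probabilities and costs are invented for illustration, and in a probabilistic version they would be drawn on each run from (for example, WinBUGS-estimated) posterior distributions.

        import numpy as np

        # Hypothetical 3-state cohort model (Well, Sick, Dead); made-up numbers.
        P = np.array([[0.85, 0.10, 0.05],
                      [0.00, 0.70, 0.30],
                      [0.00, 0.00, 1.00]])      # row-stochastic transition matrix
        cost = np.array([100.0, 1500.0, 0.0])   # cost per cycle in each state
        state = np.array([1.0, 0.0, 0.0])       # whole cohort starts in Well

        total_cost = 0.0
        for cycle in range(20):
            total_cost += state @ cost          # cost accrued this cycle
            state = state @ P                   # advance the cohort one cycle
        print(round(total_cost, 2))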

  3. Application of Statistical Process Control Methods for IDS

    Directory of Open Access Journals (Sweden)

    Muhammad Sadiq Ali Khan

    2012-11-01

    Full Text Available As technology improves, attackers are trying to gain access to network system resources by many means. Open loopholes in the network allow them to penetrate it more easily; statistical methods are of great importance in the area of computer and network security for detecting the malfunctioning of the network system. Internet security solutions need to be developed to protect the system and to withstand prolonged and diverse attacks. In this paper a statistical approach is used: statistical control charts, conventionally applied to quality characteristics, are applied to intrusion detection, where abnormal access can be easily detected once an appropriate control limit is established. Two different charts were investigated, and the Shewhart chart based on averages produced better accuracy. The approach used here classifies a data packet as an attack if it is drastically different from normal variation; in other words, such a variation may be due to some special cause. If these causes are investigated, natural variation and abnormal variation can be distinguished, and this distinction can be used to characterize the behaviour of the system.
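
    A minimal sketch of the idea, assuming a packet-rate time series and 3-sigma Shewhart limits estimated from attack-free baseline traffic; the traffic model, burst and thresholds are our assumptions, not the paper's data.

        import numpy as np

        rng = np.random.default_rng(1)
        baseline = rng.poisson(lam=120, size=200)   # packets/s, normal traffic
        traffic = rng.poisson(lam=120, size=60)
        traffic[40:45] += 90                        # injected burst (simulated attack)

        mu, sigma = baseline.mean(), baseline.std(ddof=1)
        ucl, lcl = mu + 3 * sigma, mu - 3 * sigma   # Shewhart 3-sigma control limits

        alarms = np.flatnonzero((traffic > ucl) | (traffic < lcl))
        print(alarms)   # time indices flagged as abnormal (potential intrusions)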

  4. Statistical and Mathematical Methods for Synoptic Time Domain Surveys

    Science.gov (United States)

    Mahabal, Ashish A.; SAMSI Synoptic Surveys Time Domain Working Group

    2017-01-01

    Recent advances in detector technology, electronics, data storage, and computation have enabled astronomers to collect larger and larger datasets and, moreover, to pose interesting questions to answer with those data. The complexity of the data allows data science techniques to be used, but these have to be grounded in sound methodology. Identifying interesting mathematical and statistical challenges and working on their solutions is one of the aims of the year-long ‘Statistical, Mathematical and Computational Methods for Astronomy (ASTRO)’ program of SAMSI. Of the many working groups that have been formed, one is on Synoptic Time Domain Surveys. Within this we have various subgroups discussing topics such as Designing Statistical Features for Optimal Classification, Scheduling Observations, Incorporating Unstructured Information, Detecting Outliers, Lightcurve Decomposition and Interpolation, Domain Adaptation, and also Designing a Data Challenge. We will briefly highlight some of the work going on in these subgroups along with their interconnections, and the plans for the near future. We will also highlight the overlaps with the other SAMSI working groups and indicate how the wider astronomy community can both participate in and benefit from the activities.

  5. Statistical analysis of the precision of the Match method

    Directory of Open Access Journals (Sweden)

    R. Lehmann

    2005-05-01

    Full Text Available The Match method quantifies chemical ozone loss in the polar stratosphere. The basic idea consists in calculating the forward trajectory of an air parcel that has been probed by an ozone measurement (e.g., by an ozone sonde or satellite) and finding a second ozone measurement close to this trajectory. Such an event is called a ''match''. A rate of chemical ozone destruction can be obtained by a statistical analysis of several tens of such match events. Information on the uncertainty of the calculated rate can be inferred from the scatter of the ozone mixing ratio difference (second measurement minus first measurement) associated with individual matches. A standard analysis would assume that the errors of these differences are statistically independent. However, this assumption may be violated because different matches can share a common ozone measurement, so that the errors associated with these match events become statistically dependent. Taking this effect into account, we present an analysis of the uncertainty of the final Match result. It has been applied to Match data from the Arctic winters 1995, 1996, 2000, and 2003. For these ozone-sonde Match studies the effect of the error correlation on the uncertainty estimates is rather small: compared to a standard error analysis, the uncertainty estimates increase by 15% on average. However, the effect is more pronounced for typical satellite Match analyses: for an Antarctic satellite Match study (2003), the uncertainty estimates increase by 60% on average.
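
    The effect being corrected for can be shown in a few lines: with correlated errors, the standard error of the mean difference follows from the full error covariance matrix rather than from the variances alone. The numbers and the correlation pattern below are invented for illustration.

        import numpy as np

        def se_of_mean(diffs, cov):
            """Standard error of the mean ozone difference, given the error
            covariance matrix of the individual match differences."""
            w = np.full(len(diffs), 1.0 / len(diffs))
            return float(np.sqrt(w @ cov @ w))

        diffs = np.array([-0.12, -0.08, -0.15, -0.10])  # toy mixing-ratio differences
        var = 0.03 ** 2
        cov = np.diag(np.full(4, var))
        cov[1, 2] = cov[2, 1] = 0.5 * var  # matches 1 and 2 share one sonde profile
        print(se_of_mean(diffs, np.diag(np.full(4, var))))  # independent-error SE
        print(se_of_mean(diffs, cov))                       # correlated errors: larger SE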

  6. Hybrid perturbation methods based on statistical time series models

    Science.gov (United States)

    San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario

    2016-04-01

    In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies derived from the fact that, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered, not to mention the fact that mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing missing dynamics in the previously integrated approximation. This combination results in the precision improvement of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators formed by the combination of three different orders of approximation of an analytical theory and a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem, a first-order and a second-order analytical theories, whereas the prediction technique is the same in the three cases, namely an additive Holt-Winters method.
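
    The prediction component named in the abstract, an additive Holt-Winters model, can be sketched with statsmodels; the synthetic "residual dynamics" series, its period and the train/forecast split below are our stand-ins for the paper's propagator errors.

        import numpy as np
        from statsmodels.tsa.holtwinters import ExponentialSmoothing

        # Hypothetical stand-in for the analytical theory's position error:
        # a periodic signal (period ~ one revolution) plus drift and noise.
        rng = np.random.default_rng(2)
        t = np.arange(400)
        period = 50
        resid = 0.02 * t + np.sin(2 * np.pi * t / period) + 0.1 * rng.normal(size=400)

        # Additive Holt-Winters model of the residual dynamics; settings are
        # illustrative, not the paper's.
        fit = ExponentialSmoothing(resid[:300], trend="add",
                                   seasonal="add", seasonal_periods=period).fit()
        forecast = fit.forecast(100)   # predicted correction for future epochs
        print(np.abs(forecast - resid[300:]).mean())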

  7. Classification of Specialized Farms Applying Multivariate Statistical Methods

    Directory of Open Access Journals (Sweden)

    Zuzana Hloušková

    2017-01-01

    Full Text Available The paper is aimed at the application of advanced multivariate statistical methods to classifying cattle breeding farming enterprises by their economic size. An advantage of the model is its ability to use a few selected indicators, compared to the complex methodology of the current classification model, which requires knowledge of the detailed structure of the herd turnover and the structure of cultivated crops. The output of the paper is intended to be applied within farm structure research focused on the future development of Czech agriculture. As the data source, the farming enterprises database for 2014 from the FADN CZ system has been used. The predictive model proposed exploits knowledge of the actual size classes of the farms tested. Outcomes of the linear discriminant analysis multifactor classification method support classifying farming enterprises into the group of Small farms (98% classified correctly) and the Large and Very Large enterprises (100% classified correctly). The Medium Size farms were correctly classified at only 58.11%. Partial shortcomings of the presented process were found when discriminating between Medium and Small farms.
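
    A minimal sketch of the classification step, assuming three synthetic size classes described by three made-up indicators; it uses scikit-learn's LinearDiscriminantAnalysis with cross-validation as a stand-in for the paper's FADN-based workflow.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(3)
        # Hypothetical indicators: herd size, utilised area (ha), annual output.
        X = np.vstack([
            rng.normal([30, 50, 100], [8, 15, 30], size=(40, 3)),      # small
            rng.normal([90, 150, 400], [20, 40, 90], size=(40, 3)),    # medium
            rng.normal([250, 500, 1500], [60, 120, 300], size=(40, 3)) # large
        ])
        y = np.repeat(["small", "medium", "large"], 40)

        lda = LinearDiscriminantAnalysis()
        print(cross_val_score(lda, X, y, cv=5).mean())  # share classified correctly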

  8. Hypoplastic thumb type IIIB: An alternative method for surgical repair

    Directory of Open Access Journals (Sweden)

    Salih Onur Basat

    2014-08-01

    Full Text Available Hypoplastic thumb is the second most common congenital deformity of the thumb. Thumb hypoplasia is characterized by diminished thumb size, metacarpal adduction, metacarpophalangeal joint instability, and thenar muscle hypoplasia. In the literature, different classification types of hypoplastic thumb have been used and different treatment methods described. In this case we presented an alternative palliative treatment method for a ten-year-old patient with modified Blauth's classification type IIIB hypoplastic thumb and one-year follow-up results. [Hand Microsurg 2014; 3(2): 59-61]

  9. Optimization of Statistical Methods Impact on Quantitative Proteomics Data.

    Science.gov (United States)

    Pursiheimo, Anna; Vehmas, Anni P; Afzal, Saira; Suomi, Tomi; Chand, Thaman; Strauss, Leena; Poutanen, Matti; Rokka, Anne; Corthals, Garry L; Elo, Laura L

    2015-10-02

    As tools for quantitative label-free mass spectrometry (MS) rapidly develop, a consensus about the best practices is not apparent. In the work described here we compared popular statistical methods for detecting differential protein expression from quantitative MS data using both controlled experiments with known quantitative differences for specific proteins used as standards as well as "real" experiments where differences in protein abundance are not known a priori. Our results suggest that data-driven reproducibility-optimization can consistently produce reliable differential expression rankings for label-free proteome tools and are straightforward in their application.

  10. Estimated Accuracy of Three Common Trajectory Statistical Methods

    Science.gov (United States)

    Kabashnikov, Vitaliy P.; Chaikovsky, Anatoli P.; Kucsera, Tom L.; Metelskaya, Natalia S.

    2011-01-01

    Three well-known trajectory statistical methods (TSMs), namely concentration field (CF), concentration weighted trajectory (CWT), and potential source contribution function (PSCF) methods were tested using known sources and artificially generated data sets to determine the ability of TSMs to reproduce spatial distribution of the sources. In the works by other authors, the accuracy of the trajectory statistical methods was estimated for particular species and at specified receptor locations. We have obtained a more general statistical estimation of the accuracy of source reconstruction and have found optimum conditions to reconstruct source distributions of atmospheric trace substances. Only virtual pollutants of the primary type were considered. In real world experiments, TSMs are intended for application to a priori unknown sources. Therefore, the accuracy of TSMs has to be tested with all possible spatial distributions of sources. An ensemble of geographical distributions of virtual sources was generated. Spearman's rank order correlation coefficient between spatial distributions of the known virtual and the reconstructed sources was taken to be a quantitative measure of the accuracy. Statistical estimates of the mean correlation coefficient and a range of the most probable values of correlation coefficients were obtained. All the TSMs that were considered here showed similar close results. The maximum of the ratio of the mean correlation to the width of the correlation interval containing the most probable correlation values determines the optimum conditions for reconstruction. An optimal geographical domain roughly coincides with the area supplying most of the substance to the receptor. The optimal domain's size is dependent on the substance decay time. Under optimum reconstruction conditions, the mean correlation coefficients can reach 0.70-0.75. The boundaries of the interval with the most probable correlation values are 0.6-0.9 for the decay time of 240 h.
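
    Of the three TSMs, the PSCF is the simplest to sketch: each grid cell receives the ratio m_ij/n_ij of "high-concentration" trajectory endpoints to all endpoints falling in that cell. The grid size, endpoint data and source pattern below are synthetic assumptions.

        import numpy as np

        def pscf(traj_cells, high_mask, grid_shape):
            """Potential source contribution function on a lat-lon grid.
            traj_cells : (n_endpoints, 2) integer grid indices of endpoints
            high_mask  : True where the trajectory arrived when the measured
                         concentration exceeded a chosen threshold
            Returns m_ij / n_ij per cell (NaN where no endpoint fell)."""
            n = np.zeros(grid_shape)
            m = np.zeros(grid_shape)
            np.add.at(n, (traj_cells[:, 0], traj_cells[:, 1]), 1)
            np.add.at(m, (traj_cells[:, 0], traj_cells[:, 1]), high_mask)
            with np.errstate(invalid="ignore"):
                return np.where(n > 0, m / n, np.nan)

        rng = np.random.default_rng(4)
        cells = rng.integers(0, 10, size=(5000, 2))
        high = rng.random(5000) < (cells[:, 0] / 10)  # a "source" in high-row cells
        print(np.nanmax(pscf(cells, high, (10, 10))))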

  11. Concepts and methods in modern theoretical chemistry statistical mechanics

    CERN Document Server

    Ghosh, Swapan Kumar

    2013-01-01

    Concepts and Methods in Modern Theoretical Chemistry: Statistical Mechanics, the second book in a two-volume set, focuses on the dynamics of systems and phenomena. A new addition to the series Atoms, Molecules, and Clusters, this book offers chapters written by experts in their fields. It enables readers to learn how concepts from ab initio quantum chemistry and density functional theory (DFT) can be used to describe, understand, and predict chemical dynamics. This book covers a wide range of subjects, including discussions on the following topics: Time-dependent DFT Quantum fluid dynamics (QF

  12. Prediction of skin sensitizers using alternative methods to animal experimentation.

    Science.gov (United States)

    Johansson, Henrik; Lindstedt, Malin

    2014-07-01

    Regulatory frameworks within the European Union demand that chemical substances are investigated for their ability to induce sensitization, an adverse health effect caused by the human immune system in response to chemical exposure. A recent ban on the use of animal tests within the cosmetics industry has led to an urgent need for alternative animal-free test methods that can be used for assessment of chemical sensitizers. To date, no such alternative assay has yet completed formal validation. However, a number of assays are in development and the understanding of the biological mechanisms of chemical sensitization has greatly increased during the last decade. In this MiniReview, we aim to summarize and give our view on the recent progress of method development for alternative assessment of chemical sensitizers. We propose that integrated testing strategies should comprise complementary assays, providing measurements of a wide range of mechanistic events, to perform well-educated risk assessments based on weight of evidence. © 2014 Nordic Association for the Publication of BCPT (former Nordic Pharmacological Society).

  13. Osmotic shock as alternative method to control Acanthaster planci

    Institute of Scientific and Technical Information of China (English)

    Jairo Rivera-Posada; Leigh Owens

    2014-01-01

    Objective: To test six osmotic stressors as alternative methods to control Acanthaster planci (A. planci) outbreaks by exploiting their incapacity to tolerate drastic changes in osmolarity. Finding more effective ways to control A. planci outbreaks is one of the most immediate means by which to reverse rapid declines in the abundance of live coral cover in the Indo-Pacific. Methods: A total of 10 mL of each of the following chemicals: sodium chloride, ethylenediaminetetraacetic acid, sodium carbonate, sodium cholate, sodium deoxycholate, urea and mannitol were injected into individual healthy sea stars to examine which chemicals induced disease and death. Results: Four out of six chemicals used in this study induced disease. Sodium chloride, sodium cholate, sodium deoxycholate and ethylenediaminetetraacetic acid are capable of inducing death in injected sea stars, offering an alternative option to control A. planci outbreaks. Conclusions: Hyperosmotic stress is a viable alternative to control A. planci outbreaks, as massive cell death results when acute hypertonicity exceeds a certain level.

  14. Methods in probability and statistical inference. Progress report, June 15, 1976--June 14, 1977. [Dept. of Statistics, Univ. of Chicago

    Energy Technology Data Exchange (ETDEWEB)

    Perlman, M D

    1977-03-01

    Research activities of the Department of Statistics, University of Chicago, during the period 15 June 1976 to 14 June 1977 are reviewed. Individual projects were carried out in the following eight areas: statistical computing--approximations to statistical tables and functions; numerical computation of boundary-crossing probabilities for Brownian motion and related stochastic processes; probabilistic methods in statistical mechanics; combining independent tests of significance; small-sample efficiencies of tests and estimates; improved procedures for simultaneous estimation and testing of many correlations; statistical computing and improved regression methods; and comparison of several populations. Brief summaries of these projects are given, along with other administrative information. (RWR)

  15. Tips and Tricks for Successful Application of Statistical Methods to Biological Data.

    Science.gov (United States)

    Schlenker, Evelyn

    2016-01-01

    This chapter discusses experimental design and the use of statistics to describe characteristics of data (descriptive statistics) and inferential statistics that test the hypothesis posed by the investigator. Inferential statistics, based on probability distributions, depend upon the type and distribution of the data. For data that are continuous, randomly and independently selected, and normally distributed, more powerful parametric tests such as Student's t test and analysis of variance (ANOVA) can be used. For non-normally distributed or skewed data, transformation of the data (using logarithms) may normalize the data, allowing use of parametric tests. Alternatively, with skewed data nonparametric tests can be utilized, some of which rely on data that are ranked prior to statistical analysis. Experimental designs and analyses need to balance between committing type 1 errors (false positives) and type 2 errors (false negatives). For a variety of clinical studies that determine risk or benefit, relative risk ratios (random clinical trials and cohort studies) or odds ratios (case-control studies) are utilized. Although both use 2 × 2 tables, their premise and calculations differ. Finally, special statistical methods are applied to microarray and proteomics data, since the large number of genes or proteins evaluated increases the likelihood of false discoveries. Additional studies in separate samples are used to verify microarray and proteomic data. Examples in this chapter and references are available to help continued investigation of experimental designs and appropriate data analysis.
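
    A minimal sketch of the two routes described for skewed data, using made-up lognormal samples: log-transform then a parametric t test, or a rank-based nonparametric test on the raw values.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        control = rng.lognormal(mean=1.0, sigma=0.5, size=12)  # right-skewed data
        treated = rng.lognormal(mean=1.4, sigma=0.5, size=12)

        # Route 1: log-transform to near-normality, then a parametric test.
        t, p_param = stats.ttest_ind(np.log(control), np.log(treated))
        # Route 2: keep the raw scale and use a rank-based nonparametric test.
        u, p_nonpar = stats.mannwhitneyu(control, treated)
        print(p_param, p_nonpar)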

  16. Hysterectomy—Current Methods and Alternatives for Benign Indications

    Directory of Open Access Journals (Sweden)

    Michail S. Papadopoulos

    2010-01-01

    Full Text Available Hysterectomy is the commonest gynecologic operation, performed not only for malignant disease but also for many benign conditions such as fibroids, endometrial hyperplasia, adenomyosis, uterine prolapse, dysfunctional uterine bleeding, and cervical intraepithelial neoplasia. There are many approaches to hysterectomy for benign disease: abdominal hysterectomy, vaginal hysterectomy, laparoscopic-assisted vaginal hysterectomy (LAVH, where a vaginal hysterectomy is assisted by laparoscopic procedures that do not include uterine artery ligation), total laparoscopic hysterectomy (TLH, where the laparoscopic procedures include uterine artery ligation), and subtotal laparoscopic hysterectomy (STLH, where there is no vaginal component and the uterine body is removed using a morcellator). In the last decades, many new techniques alternative to hysterectomy, with conservation of the uterus, have been developed. They use modern technologies, and their results are promising and in many cases comparable with hysterectomy. This paper is a review of all the existing hysterectomy techniques and the alternative methods for benign indications.

  17. Alternative Methods for Measuring Obesity in African American Women

    Science.gov (United States)

    Clark, Ashley E.; Taylor, Jacquelyn Y.; Wu, Chun Yi; Smith, Jennifer A.

    2013-01-01

    The use of body mass index (BMI) may not be the most appropriate measurement tool in determining obesity in diverse populations. We studied a convenience sample of 108 African American (AA) women to determine the best method for measuring obesity in this at-risk population. The purpose of this study was to determine if percent body fat (PBF) and percent body water (PBW) could be used as alternatives to BMI in predicting obesity and risk for hypertension (HTN) among AA women. After accounting for age, BMI, and the use of anti-hypertensive medication, PBF (p = 0.0125) and PBW (p = 0.0297) were significantly associated with systolic blood pressure, while BMI was not. Likewise, PBF (p = 0.0316) was significantly associated with diastolic blood pressure, while PBW and BMI were not. Thus, health care practitioners should consider alternative anthropometric measurements such as PBF when assessing obesity in AA women. PMID:23483836

  18. Visualization methods for statistical analysis of microarray clusters

    Directory of Open Access Journals (Sweden)

    Li Kai

    2005-05-01

    Full Text Available Abstract Background The most common method of identifying groups of functionally related genes in microarray data is to apply a clustering algorithm. However, it is impossible to determine which clustering algorithm is most appropriate to apply, and it is difficult to verify the results of any algorithm due to the lack of a gold-standard. Appropriate data visualization tools can aid this analysis process, but existing visualization methods do not specifically address this issue. Results We present several visualization techniques that incorporate meaningful statistics that are noise-robust for the purpose of analyzing the results of clustering algorithms on microarray data. This includes a rank-based visualization method that is more robust to noise, a difference display method to aid assessments of cluster quality and detection of outliers, and a projection of high dimensional data into a three dimensional space in order to examine relationships between clusters. Our methods are interactive and are dynamically linked together for comprehensive analysis. Further, our approach applies to both protein and gene expression microarrays, and our architecture is scalable for use on both desktop/laptop screens and large-scale display devices. This methodology is implemented in GeneVAnD (Genomic Visual ANalysis of Datasets) and is available at http://function.princeton.edu/GeneVAnD. Conclusion Incorporating relevant statistical information into data visualizations is key for analysis of large biological datasets, particularly because of high levels of noise and the lack of a gold-standard for comparisons. We developed several new visualization techniques and demonstrated their effectiveness for evaluating cluster quality and relationships between clusters.

  19. A method for statistically comparing spatial distribution maps

    Directory of Open Access Journals (Sweden)

    Reynolds Mary G

    2009-01-01

    Full Text Available Abstract Background Ecological niche modeling is a method for estimation of species distributions based on certain ecological parameters. Thus far, empirical determination of significant differences between independently generated distribution maps for a single species (maps which are created through equivalent processes, but with different ecological input parameters) has been challenging. Results We describe a method for comparing model outcomes, which allows a statistical evaluation of whether the strength of prediction and breadth of predicted areas is measurably different between projected distributions. To create ecological niche models for statistical comparison, we utilized GARP (Genetic Algorithm for Rule-Set Production) software to generate ecological niche models of human monkeypox in Africa. We created several models, keeping constant the case location input records for each model but varying the ecological input data. In order to assess the relative importance of each ecological parameter included in the development of the individual predicted distributions, we performed pixel-to-pixel comparisons between model outcomes and calculated the mean difference in pixel scores. We used a two-sample Student's t-test (assuming as null hypothesis that both maps were identical to each other regardless of which input parameters were used) to examine whether the mean difference in corresponding pixel scores from one map to another was greater than would be expected by chance alone. We also utilized weighted kappa statistics, frequency distributions, and percent difference to look at the disparities in pixel scores. Multiple independent statistical tests indicated precipitation as the single most important independent ecological parameter in the niche model for human monkeypox disease. Conclusion In addition to improving our understanding of the natural factors influencing the distribution of human monkeypox disease, such pixel-to-pixel comparison
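
    The pixel-to-pixel idea can be sketched as follows; the two synthetic score maps, the 0-10 score range, and the use of a one-sample test on the paired pixel differences (a close stand-in for the two-sample test described) are our assumptions.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)
        # Two hypothetical GARP-style suitability maps built from the same
        # cases but different ecological input layers (scores 0-10).
        map_a = rng.integers(0, 11, size=(50, 50))
        map_b = np.clip(map_a + rng.integers(-2, 3, size=(50, 50)), 0, 10)

        diff = (map_a - map_b).ravel()
        t, p = stats.ttest_1samp(diff, popmean=0)  # H0: maps identical on average
        print(diff.mean(), p)
        print(np.mean(map_a.ravel() != map_b.ravel()))  # fraction of differing pixels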

  20. Validation of Alternative In Vitro Methods to Animal Testing: Concepts, Challenges, Processes and Tools.

    Science.gov (United States)

    Griesinger, Claudius; Desprez, Bertrand; Coecke, Sandra; Casey, Warren; Zuang, Valérie

    …test method for a given purpose. Relevance encapsulates the scientific basis of the test method, its capacity to predict adverse effects in the "target system" (i.e. human health or the environment) as well as its applicability for the intended purpose. In this chapter we focus on the validation of non-animal in vitro alternative testing methods and review the concepts, challenges, processes and tools fundamental to the validation of in vitro methods intended for hazard testing of chemicals. We explore major challenges and peculiarities of validation in this area. Based on the notion that validation per se is a scientific endeavour that needs to adhere to key scientific principles, namely objectivity and appropriate choice of methodology, we examine basic aspects of study design and management, and provide illustrations of statistical approaches to describe predictive performance of validated test methods as well as their reliability.

  1. FOREWORD: Special issue on Statistical and Probabilistic Methods for Metrology

    Science.gov (United States)

    Bich, Walter; Cox, Maurice G.

    2006-08-01

    This special issue of Metrologia is the first that is not devoted to units, or constants, or measurement techniques in some specific field of metrology, but to the generic topic of statistical and probabilistic methods for metrology. The number of papers on this subject in measurement journals, and in Metrologia in particular, has continued to increase over the years, driven by the publication of the Guide to the Expression of Uncertainty in Measurement (GUM) [1] and the Mutual Recognition Arrangement (MRA) of the CIPM [2]. The former stimulated metrologists to think in greater depth about the appropriate modelling of their measurements, in order to provide uncertainty evaluations associated with measurement results. The latter obliged the metrological community to investigate reliable measures for assessing the calibration and measurement capabilities declared by the national metrology institutes (NMIs). Furthermore, statistical analysis of measurement data became even more important than hitherto, with the need, on the one hand, to treat the greater quantities of data provided by sophisticated measurement systems, and, on the other, to deal appropriately with relatively small sets of data that are difficult or expensive to obtain. The importance of supporting the GUM and extending its provisions was recognized by the formation in the year 2000 of Working Group 1, Measurement uncertainty, of the Joint Committee for Guides in Metrology. The need to provide guidance on key comparison data evaluation was recognized by the formation in the year 2001 of the BIPM Director's Advisory Group on Uncertainty. A further international initiative was the revision, in the year 2004, of the remit and title of a working group of ISO/TC 69, Application of Statistical Methods, to reflect the need to concentrate more on statistical methods to support measurement uncertainty evaluation. These international activities are supplemented by national programmes such as the Software Support

  3. Nonlinear diffusion methods based on robust statistics for noise removal

    Institute of Scientific and Technical Information of China (English)

    JIA Di-ye; HUANG Feng-gang; SU Han

    2007-01-01

    A novel smoothness term for the Bayesian regularization framework, based on the M-estimation of robust statistics, is proposed, and from this term a class of fourth-order nonlinear diffusion methods is derived. These methods attempt to approximate an observed image with a piecewise linear image, which looks more natural than the piecewise constant image used to approximate an observed image by the P-M [1] model. It is known that M-estimators and W-estimators are essentially equivalent and solve the same minimization problem. We therefore propose a PL bilateral filter derived from the equivalent W-estimator. This new model is designed for piecewise linear image filtering, which is more effective than the normal bilateral filter.

  4. A test statistic for the affected-sib-set method.

    Science.gov (United States)

    Lange, K

    1986-07-01

    This paper discusses generalizations of the affected-sib-pair method. First, the requirement that sib identity-by-descent relations be known unambiguously is relaxed by substituting sib identity-by-state relations. This permits affected sibs to be used even when their parents are unavailable for typing. In the limit of an infinite number of marker alleles each of infinitesimal population frequency, the identity-by-state relations coincide with the usual identity-by-descent relations. Second, a weighted pairs test statistic is proposed that covers affected sib sets of size greater than two. These generalizations make the affected-sib-pair method a more powerful technique for detecting departures from independent segregation of disease and marker phenotypes. A sample calculation suggests such a departure for tuberculoid leprosy and the HLA D locus.

  5. Statistical Inference Methods for Sparse Biological Time Series Data

    Directory of Open Access Journals (Sweden)

    Voit Eberhard O

    2011-04-01

    Full Text Available Abstract Background Comparing metabolic profiles under different biological perturbations has become a powerful approach to investigating the functioning of cells. The profiles can be taken as single snapshots of a system, but more information is gained if they are measured longitudinally over time. The results are short time series consisting of relatively sparse data that cannot be analyzed effectively with standard time series techniques, such as autocorrelation and frequency domain methods. In this work, we study longitudinal time series profiles of glucose consumption in the yeast Saccharomyces cerevisiae under different temperatures and preconditioning regimens, which we obtained with methods of in vivo nuclear magnetic resonance (NMR) spectroscopy. For the statistical analysis we first fit several nonlinear mixed effect regression models to the longitudinal profiles and then used an ANOVA likelihood ratio method in order to test for significant differences between the profiles. Results The proposed methods are capable of distinguishing metabolic time trends resulting from different treatments and associate significance levels to these differences. Among several nonlinear mixed-effects regression models tested, a three-parameter logistic function represents the data with highest accuracy. ANOVA and likelihood ratio tests suggest that there are significant differences between the glucose consumption rate profiles for cells that had been--or had not been--preconditioned by heat during growth. Furthermore, pair-wise t-tests reveal significant differences in the longitudinal profiles for glucose consumption rates between optimal conditions and heat stress, optimal and recovery conditions, and heat stress and recovery conditions (p-values …). Conclusion We have developed a nonlinear mixed effects model that is appropriate for the analysis of sparse metabolic and physiological time profiles. The model permits sound statistical inference procedures
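
    As a fixed-effects illustration of the curve family selected in the paper, the sketch below fits a three-parameter logistic profile to one made-up sparse time series with scipy; the full mixed-effects machinery (random effects across replicates) is beyond this sketch.

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic3(t, a, b, c):
            """Three-parameter logistic profile: asymptote a, slope b, midpoint c."""
            return a / (1.0 + np.exp(-b * (t - c)))

        # Sparse longitudinal profile: few time points (made-up data).
        t = np.array([0, 5, 10, 15, 20, 30], dtype=float)
        y = np.array([0.4, 1.1, 2.9, 4.6, 5.4, 5.8])

        params, cov = curve_fit(logistic3, t, y, p0=[6.0, 0.3, 10.0])
        print(params)                 # fitted asymptote, slope, midpoint
        print(np.sqrt(np.diag(cov)))  # asymptotic standard errors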

  6. 48 CFR 32.503-9 - Liquidation rates-alternate method.

    Science.gov (United States)

    2010-10-01

    48 CFR Section 32.503-9, Federal Acquisition Regulations System: Liquidation rates—alternate method. (a) The liquidation rate determined under 32.503-8 shall apply throughout... the alternate method in this 32.503-9. The objective of the alternate liquidation rate method is...

  7. Are Statistics Labs Worth the Effort?--Comparison of Introductory Statistics Courses Using Different Teaching Methods

    Directory of Open Access Journals (Sweden)

    Jose H. Guardiola

    2010-01-01

    Full Text Available This paper compares the academic performance of students in three similar elementary statistics courses taught by the same instructor, but with the lab component differing among the three. One course is traditionally taught without a lab component; the second with a lab component using scenarios and an extensive use of technology, but without explicit coordination between lab and lecture; and the third using a lab component with an extensive use of technology that carefully coordinates the lab with the lecture. Extensive use of technology means, in this context, using Minitab software in the lab section, doing homework and quizzes using MyMathLab©, and emphasizing interpretation of computer output during lectures. Initially, an online instrument based on Gardner’s multiple intelligences theory is given to students to try to identify students’ learning styles and intelligence types as covariates. An analysis of covariance is performed in order to compare differences in achievement. In this study there is no attempt to measure differences in student performance across the different treatments; the purpose of this study is to find indications of associations among variables that support the claim that statistics labs could be associated with superior academic achievement in one of these three instructional environments. This study also tries to identify individual student characteristics that could be associated with superior academic performance, but did not find evidence of any such characteristics. The response variable was computed as the percentage of correct answers for the three exams during the semester added together. The results of this study indicate a significant difference across these three different instructional methods, showing significantly higher mean scores for the response variable for students taking the lab component that was carefully coordinated with

  8. ALTERNATIVE METHODS FOR CONDUCTING COMPARATIVE ANALYSES OF CADASTRAL SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The conception of an efficient cadastral system is an important element in the development of each country. It is crucial for the efficient operation of the real estate market (the security and liberty of making transactions, registering a property, planning operations), the introduction of an ad valorem tax on property and more rational use of space. In Europe there are different types of cadastral systems, because the countries in Europe have different cultural, economic and social backgrounds. Through the centuries, many types of cadastral systems evolved, and their differences often depend upon local cultural heritage, physical geography, land use, technology, etc. Comparative analyses of cadastral systems have been the subject of many publications and studies in the world literature. It was assessed that useful tools for conducting comparative analyses of various cadastral systems include the procedures of statistical inference. This paper presents the results of a project to compare the performance of ten cadastral systems internationally by creating appropriate integrated indicators of a cadastral system using statistical techniques. Such indicators will make it possible to compare different cadastral systems and present them hierarchically in relation to their quality, structure, as well as legal, organizational and technological solutions. From a good number of methods available, techniques originating from two spheres of statistical inference were selected: distribution-free methods and multivariate analysis methods. For analyses with the distribution-free methods, FRIEDMAN's test (FRIEDMAN's non-parametric variance analysis) as well as KENDALL's test (KENDALL's compatibility ratio) were selected. For analyses with the multivariate analysis methods, factor analysis was selected.

  9. Complementary and Alternative Medicine Methods in Chronic Renal Failure

    Directory of Open Access Journals (Sweden)

    Zeynep Erdogan

    2014-08-01

    Full Text Available Despite its long history, use of complementary and alternative medicine (CAM) methods has increased dramatically only after the 1990s. Up to 57% of patients with chronic renal failure use CAM methods. These patients use CAM methods to overcome hypertension, fatigue, constipation, leg edema, pain, cramps, anxiety, depression and sleep disorders, to cope with symptoms such as itching, to stop the progression of kidney disease and to improve their quality of life. The methods used are herbal products and food supplements, acupressure, acupuncture, homeopathy, exercise, aromatherapy, yoga and reflexology. The nephrotoxic effect of several CAM therapies used in patients with renal impairment could disturb hemodynamics by reducing the glomerular filtration rate. For this reason, health care providers should question patients about their use of CAM methods. Communication with patients should be clear and should not be judgmental. Health care personnel should learn more about CAM methods in order to avoid unwanted situations that could develop after the application of CAM methods. Patients should be informed correctly and scientifically about these methods to avoid harmful and unnecessary uses. [Archives Medical Review Journal 2014; 23(4): 770-786]

  10. Alternative method for direct measurement of tibial slope

    Directory of Open Access Journals (Sweden)

    Stijak Lazar

    2014-01-01

    Full Text Available Background/Aim. The tibial slope is one of the most frequently cited anatomical causes of anterior cruciate ligament trauma. The aim of this study was to determine the possibility of directly measuring the tibial slope of the knee without prior soft tissue dissection in cadavers. Methods. Measurement was performed on two groups of samples: osteological and cadaveric. The osteological group consisted of 102 mature tibiae, and measurement was performed indirectly, by sagittal photographing of the tibia, and directly, by a set of parallel bars. The cadaveric group consisted of 50 cadaveric knees, and measurement was performed directly by a set of parallel bars. The difference and correlation between the indirect and direct measurements were observed, as were the difference and correlation of the tibial slope on the medial and lateral condyles. Results. No statistically significant difference was found between the direct and indirect methods of measurement (p > 0.05). However, the slope on the medial condyle, in direct as well as indirect measurement, showed a statistically significant difference (p < 0.01). Conclusion. By the use of a set of parallel bars it is possible to measure the tibial slope directly without removal of the soft tissue. The results of indirect, photographic measurement did not statistically differ from the results of direct measurement of the tibial slope.

  11. Fast alternating projection methods for constrained tomographic reconstruction.

    Science.gov (United States)

    Liu, Li; Han, Yongxin; Jin, Mingwu

    2017-01-01

    The alternating projection algorithms are easy to implement and effective for large-scale complex optimization problems, such as constrained reconstruction in X-ray computed tomography (CT). A typical method is to use projection onto convex sets (POCS) for data fidelity, with nonnegativity constraints combined with total variation (TV) minimization (so-called TV-POCS), for sparse-view CT reconstruction. However, this type of method relies on empirically selected parameters for satisfactory reconstruction, and is generally slow and lacking in convergence analysis. In this work, we use a convex feasibility set approach to address the problems associated with TV-POCS and propose a framework using full sequential alternating projections, or POCS (FS-POCS), to find the solution in the intersection of the convex constraints of bounded TV function, bounded data fidelity error and non-negativity. The rationale behind FS-POCS is that the mathematically optimal solution of the constrained objective function may not be the physically optimal solution. The breakdown of constrained reconstruction into an intersection of several feasible sets can lead to faster convergence and better quantification of reconstruction parameters in a physically meaningful way, rather than by empirical trial-and-error. In addition, for large-scale optimization problems, first-order methods are usually used. Not only is the condition for convergence of gradient-based methods derived, but a primal-dual hybrid gradient (PDHG) method is also used for fast convergence of bounded TV. The newly proposed FS-POCS is evaluated and compared with TV-POCS and another convex feasibility projection method (CPTV) using both digital phantom and pseudo-real CT data, showing its superior performance in reconstruction speed, image quality and quantification.
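
    A skeleton of the alternating-projection idea, assuming a toy linear system: the iterate is pushed toward the data-fidelity ball by a gradient step (an approximate projection) and projected exactly onto the non-negativity cone; the bounded-TV set used by FS-POCS is omitted for brevity.

        import numpy as np

        def pocs(x0, A, b, eps, iters=500):
            """Alternate between (approximately) projecting toward the
            data-fidelity ball {x : ||Ax - b|| <= eps} and exactly projecting
            onto the non-negativity cone {x : x >= 0}."""
            x = x0.copy()
            L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
            for _ in range(iters):
                r = A @ x - b
                if np.linalg.norm(r) > eps:        # step toward the fidelity ball
                    x = x - (A.T @ r) / L
                x = np.maximum(x, 0.0)             # exact projection onto x >= 0
            return x

        rng = np.random.default_rng(7)
        A = rng.normal(size=(60, 40))              # toy "system matrix"
        x_true = np.maximum(rng.normal(size=40), 0)
        b = A @ x_true + 0.01 * rng.normal(size=60)
        x = pocs(np.zeros(40), A, b, eps=0.1)
        print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))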

  12. Comparison of prediction performance using statistical postprocessing methods

    Science.gov (United States)

    Han, Keunhee; Choi, JunTae; Kim, Chansoo

    2016-11-01

    As the 2018 Winter Olympics are to be held in Pyeongchang, both general weather information on Pyeongchang and specific weather information on this region, which can affect game operation and athletic performance, are required. An ensemble prediction system has been applied to provide more accurate weather information, but it has bias and dispersion errors due to the limitations and uncertainty of its model. In this study, homogeneous and nonhomogeneous regression models as well as Bayesian model averaging (BMA) were used to reduce the bias and dispersion existing in ensemble prediction and to provide probabilistic forecasts. Prior to applying the prediction methods, the reliability of the ensemble forecasts was tested by using a rank histogram and a residual quantile-quantile plot to compare the ensemble forecasts with the corresponding verifications. The ensemble forecasts had a consistent positive bias, indicating over-forecasting, and were under-dispersed. To correct such biases, statistical post-processing methods were applied using fixed and sliding windows. The prediction skills of the methods were compared by using the mean absolute error, root mean square error, continuous ranked probability score, and continuous ranked probability skill score. Under the fixed window, BMA exhibited better prediction skill than the other methods at most observation stations. Under the sliding window, on the other hand, homogeneous and nonhomogeneous regression models with positive regression coefficients exhibited better prediction skill than BMA. In particular, the homogeneous regression model with positive regression coefficients exhibited the best prediction skill.
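
    A minimal sketch of the homogeneous-regression idea on synthetic data: the observation is modelled as Gaussian with mean linear in the ensemble mean, here fitted by ordinary least squares with a constant residual spread. The bias, spread and example forecast value are invented.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(8)
        # Synthetic training set: the ensemble mean is biased warm.
        truth = rng.normal(0, 3, size=300)
        ens_mean = truth + 1.5 + rng.normal(0, 1.0, size=300)  # +1.5 degC bias

        # obs ~ N(a + b * ens_mean, sigma^2), fitted by ordinary least squares.
        res = stats.linregress(ens_mean, truth)
        resid = truth - (res.intercept + res.slope * ens_mean)
        sigma = resid.std(ddof=2)

        # Postprocessed probabilistic forecast for a new ensemble mean of 5.0 degC:
        mu = res.intercept + res.slope * 5.0
        print(mu, sigma)                         # corrected mean and spread
        print(stats.norm(mu, sigma).cdf(0.0))    # e.g. probability of sub-zero temp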

  13. Statistical methods for the detection and analysis of radioactive sources

    Science.gov (United States)

    Klumpp, John

    We consider four topics from the statistical analysis of radiation data in the present study: Bayesian methods for the analysis of count rate data, analysis of energy data, a model for non-constant background count rate distributions, and a zero-inflated model of the sample count rate. The study begins with a review of Bayesian statistics and techniques for analyzing count rate data. Next, we consider a novel system for incorporating energy information into count rate measurements which searches for elevated count rates in multiple energy regions simultaneously. The system analyzes time-interval data in real time to sequentially update a probability distribution for the sample count rate. We then consider a "moving target" model of background radiation in which the instantaneous background count rate is a function of time, rather than being fixed. Unlike the sequential update system, this model assumes a large body of pre-existing data which can be analyzed retrospectively. Finally, we propose a novel Bayesian technique which allows for simultaneous source detection and count rate analysis. This technique is fully compatible with, but independent of, the sequential update system and moving target model.
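
    The sequential count-rate update is conjugate in its simplest form, which the sketch below illustrates: a Gamma prior on the rate updated by Poisson counts, interval by interval. The prior parameters and counts are invented, and the study's multi-energy and time-interval machinery is not reproduced.

        import numpy as np
        from scipy import stats

        # Conjugate Bayesian update for a count rate: Gamma prior, Poisson counts.
        alpha, beta = 1.0, 1.0                # vague Gamma(shape, rate) prior (cps)
        counts = np.array([3, 5, 2, 4, 6])    # counts in five 1-second intervals
        t = np.ones_like(counts, dtype=float)

        alpha_post = alpha + counts.sum()
        beta_post = beta + t.sum()
        posterior = stats.gamma(a=alpha_post, scale=1.0 / beta_post)
        print(posterior.mean())               # posterior mean count rate
        print(posterior.interval(0.95))       # 95% credible interval
        # Sequential use: feed each new interval's count in as it arrives,
        # carrying (alpha_post, beta_post) forward as the next prior.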

  14. A Statistical Method to Distinguish Functional Brain Networks

    Science.gov (United States)

    Fujita, André; Vidal, Maciel C.; Takahashi, Daniel Y.

    2017-01-01

    One major problem in neuroscience is the comparison of functional brain networks of different populations, e.g., distinguishing the networks of controls and patients. Traditional algorithms are based on search for isomorphism between networks, assuming that they are deterministic. However, biological networks present randomness that cannot be well modeled by those algorithms. For instance, functional brain networks of distinct subjects of the same population can be different due to individual characteristics. Moreover, networks of subjects from different populations can be generated through the same stochastic process. Thus, a better hypothesis is that networks are generated by random processes. In this case, subjects from the same group are samples from the same random process, whereas subjects from different groups are generated by distinct processes. Using this idea, we developed a statistical test called ANOGVA to test whether two or more populations of graphs are generated by the same random graph model. Our simulations' results demonstrate that we can precisely control the rate of false positives and that the test is powerful to discriminate random graphs generated by different models and parameters. The method also showed to be robust for unbalanced data. As an example, we applied ANOGVA to an fMRI dataset composed of controls and patients diagnosed with autism or Asperger. ANOGVA identified the cerebellar functional sub-network as statistically different between controls and autism (p < 0.001). PMID:28261045

  15. Jet Noise Diagnostics Supporting Statistical Noise Prediction Methods

    Science.gov (United States)

    Bridges, James E.

    2006-01-01

    …compared against measurements of mean and rms velocity statistics over a range of jet speeds and temperatures. Models for flow parameters used in the acoustic analogy, most notably the space-time correlations of velocity, have been compared against direct measurements, and modified to better fit the observed data. These measurements have been extremely challenging for hot, high speed jets, and represent a sizeable investment in instrumentation development. As an intermediate check that the analysis is predicting the physics intended, phased arrays have been employed to measure source distributions for a wide range of jet cases. And finally, careful far-field spectral directivity measurements have been taken for final validation of the prediction code. Examples of each of these experimental efforts will be presented. The main result of these efforts is a noise prediction code, named JeNo, which is in mid-development. JeNo is able to consistently predict spectral directivity, including aft angle directivity, for subsonic cold jets of most geometries. Current development on JeNo is focused on extending its capability to hot jets, requiring inclusion of a previously neglected second source associated with thermal fluctuations. A secondary result of the intensive experimentation is the archiving of various flow statistics applicable to other acoustic analogies and to development of time-resolved prediction methods. These will be of lasting value as we look ahead at future challenges to the aeroacoustic experimentalist.

  16. Axial electron channeling statistical method of site occupancy determination

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Multibeam dynamical theory of electron diffraction has been used to calculate the fast-electron thickness-integrated probability density on Ti and Al sites in the γ-TiAl phase as a function of the incident electron beam orientation along the [100], [110] and [011] zone axes, with the effect of absorption considered. Both calculations and experiments show that there are large differences in the electron channeling effect for different zone axes, or for the same axis with different orientations, so a proper zone axis and suitable incident-beam tilt angles should be chosen when using the axial electron channeling statistical method to determine the site occupancies of impurities. It is suggested that the channeling-effect map be calculated before the experiments.

  17. NEW METHOD FOR CALCULATION OF STATISTIC MISTAKE IN MARKETING INVESTIGATIONS

    Directory of Open Access Journals (Sweden)

    V. A. Koldachiov

    2008-01-01

    Full Text Available The idea of the new method is that, when the analysis sample is broken down into several sub-samples, the probability that the true value for the general population lies inside the interval between the highest and lowest sub-sample means is much greater than the probability that it lies outside this interval. At the same time, the size of this interval turns out to be smaller than the analogous parameter calculated with the Student formula. Thus, it is possible to achieve higher accuracy in the results of marketing investigations while preserving the analysis sample size, or to reduce the required sample size while preserving the level of statistical error.
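
    The procedure described above lends itself to a short simulation. The sketch below is a toy reconstruction under assumptions the abstract does not state (five sub-samples, normally distributed data): it computes the interval spanned by the sub-sample means and, for comparison, the conventional Student-t interval.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    sample = rng.normal(loc=100.0, scale=15.0, size=200)   # toy survey scores

    k = 5                                                  # number of sub-samples (assumed)
    sub_means = [part.mean() for part in np.array_split(sample, k)]
    lo, hi = min(sub_means), max(sub_means)

    # Conventional 95% Student-t interval for the mean, for comparison.
    t_lo, t_hi = stats.t.interval(0.95, df=len(sample) - 1,
                                  loc=sample.mean(), scale=stats.sem(sample))

    print(f"sub-sample range interval: [{lo:.2f}, {hi:.2f}]")
    print(f"Student-t 95% interval:    [{t_lo:.2f}, {t_hi:.2f}]")
    ```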

  18. Statistical methods for determining the effect of mammography screening

    DEFF Research Database (Denmark)

    Lophaven, Søren

    2016-01-01

    In an overview of five randomised controlled trials from Sweden, a reduction of 29% was found in breast cancer mortality in women aged 50-69 at randomisation after a follow-up of 5-13 years. Organised, population-based mammography service screening was introduced on the basis of these results in the municipality of Copenhagen in 1991, in the county of Fyn in 1993 and in the municipality of Frederiksberg in 1994, although reduced mortality in randomised controlled trials does not necessarily mean that screening also works in routine health care. In the rest of Denmark mammography screening was introduced in 2007-2008. Women aged 50-69 were invited to screening every second year. Taking advantage of the registers of population and health, we present statistical methods for evaluating the effect of mammography screening on breast cancer mortality (Olsen et al. 2005, Njor et al. 2015 and Weedon-Fekjær et al...

  19. Bayesian Analysis of Multiple Populations I: Statistical and Computational Methods

    CERN Document Server

    Stenning, D C; Robinson, E; van Dyk, D A; von Hippel, T; Sarajedini, A; Stein, N

    2016-01-01

    We develop a Bayesian model for globular clusters composed of multiple stellar populations, extending earlier statistical models for open clusters composed of simple (single) stellar populations (van Dyk et al. 2009; Stein et al. 2013). Specifically, we model globular clusters with two populations that differ in helium abundance. Our model assumes a hierarchical structuring of the parameters in which physical properties---age, metallicity, helium abundance, distance, absorption, and initial mass---are common to (i) the cluster as a whole or to (ii) individual populations within a cluster, or are unique to (iii) individual stars. An adaptive Markov chain Monte Carlo (MCMC) algorithm is devised for model fitting that greatly improves convergence relative to its precursor non-adaptive MCMC algorithm. Our model and computational tools are incorporated into an open-source software suite known as BASE-9. We use numerical studies to demonstrate that our method can recover parameters of two-population clusters, and al...

  1. Self-assessment: an alternative method of assessing speaking skills

    Directory of Open Access Journals (Sweden)

    Ekaterini Chalkia

    2012-02-01

    Full Text Available The present study focuses on self-assessment as an alternative method of assessing the speaking skills of a group of sixth graders of a Greek State Primary School. The paper consists of two parts. In the first part, traditional and alternative assessment approaches are compared and a literature review on self-assessment is presented. In the second part the methodology and the findings of the study are presented. The study was carried out by means of a questionnaire and observation notes. This was done in order to draw conclusions on the benefits of self-assessment, the difficulties students faced while carrying out self-assessment as well as to reveal the extent to which students improved their speaking skills after being involved in self-assessment. The findings revealed that the students were positive towards self-assessment. Although self-assessment was of limited duration, it turned out to be a worthwhile activity as it fostered motivation and sensitized the students to take a more active role in the learning process. It also enabled them to notice their strengths and weaknesses and improve their speaking skills. The study also revealed the practical difficulties the students faced in carrying out their self-assessment. Finally, the study concludes with recommendations for further research into this specific assessment method.

  2. Quality in statistics education : Determinants of course outcomes in methods & statistics education at universities and colleges

    NARCIS (Netherlands)

    Verhoeven, P.S.

    2009-01-01

    Although Statistics is not a very popular course according to most students, a majority of students still take it, as it is mandatory at most Social Science departments. Therefore it takes special teacher’s skills to teach statistics. In order to do so it is essential for teachers to know what stude

  3. Assessment Methods in Statistical Education An International Perspective

    CERN Document Server

    Bidgood, Penelope; Jolliffe, Flavia

    2010-01-01

    This book is a collaboration of leading figures in statistical education and is designed primarily for academic audiences involved in teaching statistics and mathematics. The book is divided into four sections: (1) assessment using real-world problems, (2) assessing statistical thinking, (3) individual assessment, and (4) successful assessment strategies.

  4. An alternative method to specify the degree of resonator stability

    Indian Academy of Sciences (India)

    Jogy George; K Ranganathan; T P S Nathan

    2007-04-01

    We present an alternative method to specify the stability of real stable resonators. We introduce the degree of optical stability or the parameter, which specify the stability of resonators in a numerical scale ranging from 0 to 100%. The value of zero corresponds to marginally stable resonator and < 0 corresponds to unstable resonator. Also, three definitions of the S parameter are provided: in terms of &, & R0 and 12. It may be noticed from the present formalism that the maximum degree of stability with = 1 automatically corresponds to 12 = 1/2. We also describe the method to measure the parameter from the output beam characteristics and parameter. A possible correlation between the parameter and the misalignment tolerance is also discussed.

  5. Outcome modelling strategies in epidemiology: traditional methods and basic alternatives.

    Science.gov (United States)

    Greenland, Sander; Daniel, Rhian; Pearce, Neil

    2016-04-01

    Controlling for too many potential confounders can lead to or aggravate problems of data sparsity or multicollinearity, particularly when the number of covariates is large in relation to the study size. As a result, methods to reduce the number of modelled covariates are often deployed. We review several traditional modelling strategies, including stepwise regression and the 'change-in-estimate' (CIE) approach to deciding which potential confounders to include in an outcome-regression model for estimating effects of a targeted exposure. We discuss their shortcomings, and then provide some basic alternatives and refinements that do not require special macros or programming. Throughout, we assume the main goal is to derive the most accurate effect estimates obtainable from the data and commercial software. Allowing that most users must stay within standard software packages, this goal can be roughly approximated using basic methods to assess, and thereby minimize, mean squared error (MSE).
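
    The change-in-estimate screen the authors review can be sketched in a few lines. The simulated data, the 10% cutoff and the variable names below are illustrative assumptions, not the paper's recommendations; the paper in fact discusses the shortcomings of exactly this kind of rule.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 500
    z1 = rng.normal(size=n)                  # true confounder
    z2 = rng.normal(size=n)                  # irrelevant covariate
    x = 0.8 * z1 + rng.normal(size=n)        # exposure, associated with z1
    y = 1.0 * x + 1.5 * z1 + rng.normal(size=n)

    def exposure_coef(covariates):
        """OLS coefficient of x on y, adjusting for the given covariates."""
        X = sm.add_constant(np.column_stack([x] + covariates))
        return sm.OLS(y, X).fit().params[1]      # index 1 = coefficient of x

    full = exposure_coef([z1, z2])
    for name, others in [("z1", [z2]), ("z2", [z1])]:
        change = abs(exposure_coef(others) - full) / abs(full)
        verdict = "keep" if change > 0.10 else "drop"
        print(f"{name}: change-in-estimate = {change:.1%} -> {verdict}")
    ```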

  6. Discussion: "Comparison of Statistical Methods for Assessing Spatial Correlations Between Maps of Different Arterial Properties" (Rowland, E. M., Mohamied, Y., Chooi, K. Y., Bailey, E. L., and Weinberg, P. D., 2015, ASME J. Biomech. Eng., 137(10), p. 101003): An Alternative Approach Using Segmentation Based on Local Hemodynamics.

    Science.gov (United States)

    Himburg, Heather A; Grzybowski, Deborah M; Hazel, Andrew L; LaMack, Jeffrey A; Friedman, Morton H

    2016-09-01

    The biological response of living arteries to mechanical forces is an important component of the atherosclerotic process and is responsible, at least in part, for the well-recognized spatial variation in atherosusceptibility in man. Experiments to elucidate this response often generate maps of force and response variables over the arterial surface, from which the force-response relationship is sought. Rowland et al. discussed several statistical approaches to the spatial autocorrelation that confounds the analysis of such maps and applied them to maps of hemodynamic stress and vascular response obtained by averaging these variables in multiple animals. Here, we point out an alternative approach, in which discrete surface regions are defined by the hemodynamic stress levels they experience, and the stress and response in each animal are treated separately. This approach, applied properly, is insensitive to autocorrelation and less sensitive to the effect of confounding hemodynamic variables. The analysis suggests an inverse relation between permeability and shear that differs from that in Rowland et al. Possible sources of this difference are suggested.

  7. Empirical Laws in Economics Uncovered Using Methods in Statistical Mechanics

    Science.gov (United States)

    Stanley, H. Eugene

    2001-06-01

    In recent years, statistical physicists and computational physicists have determined that physical systems which consist of a large number of interacting particles obey universal "scaling laws" that serve to demonstrate an intrinsic self-similarity operating in such systems. Further, the parameters appearing in these scaling laws appear to be largely independent of the microscopic details. Since economic systems also consist of a large number of interacting units, it is plausible that scaling theory can be usefully applied to economics. To test this possibility using realistic data sets, a number of scientists have begun analyzing economic data using methods of statistical physics [1]. We have found evidence for scaling (and data collapse), as well as universality, in various quantities, and these recent results will be reviewed in this talk--starting with the most recent study [2]. We also propose models that may lead to some insight into these phenomena. These results will be discussed, as well as the overall rationale for why one might expect scaling principles to hold for complex economic systems. The work on which this talk is based was supported by BP, and was carried out in collaboration with L. A. N. Amaral, S. V. Buldyrev, D. Canning, P. Cizeau, X. Gabaix, P. Gopikrishnan, S. Havlin, Y. Lee, Y. Liu, R. N. Mantegna, K. Matia, M. Meyer, C.-K. Peng, V. Plerou, M. A. Salinger, and M. H. R. Stanley. [1.] See, e.g., R. N. Mantegna and H. E. Stanley, Introduction to Econophysics: Correlations & Complexity in Finance (Cambridge University Press, Cambridge, 1999). [2.] P. Gopikrishnan, B. Rosenow, V. Plerou, and H. E. Stanley, "Identifying Business Sectors from Stock Price Fluctuations," e-print cond-mat/0011145; V. Plerou, P. Gopikrishnan, L. A. N. Amaral, X. Gabaix, and H. E. Stanley, "Diffusion and Economic Fluctuations," Phys. Rev. E (Rapid Communications) 62, 3023-3026 (2000); P. Gopikrishnan, V. Plerou, X. Gabaix, and H. E. Stanley, "Statistical Properties of

  8. Statistical methods for detecting periodic fragments in DNA sequence data

    Directory of Open Access Journals (Sweden)

    Ying Hua

    2011-04-01

    Full Text Available Abstract Background Period 10 dinucleotides are structurally and functionally validated factors that influence the ability of DNA to form nucleosomes, histone core octamers. Robust identification of periodic signals in DNA sequences is therefore required to understand nucleosome organisation in genomes. While various techniques for identifying periodic components in genomic sequences have been proposed or adopted, the requirements for such techniques have not been considered in detail and confirmatory testing for a priori specified periods has not been developed. Results We compared the estimation accuracy and suitability for confirmatory testing of autocorrelation, discrete Fourier transform (DFT, integer period discrete Fourier transform (IPDFT and a previously proposed Hybrid measure. A number of different statistical significance procedures were evaluated but a blockwise bootstrap proved superior. When applied to synthetic data whose period-10 signal had been eroded, or for which the signal was approximately period-10, the Hybrid technique exhibited superior properties during exploratory period estimation. In contrast, confirmatory testing using the blockwise bootstrap procedure identified IPDFT as having the greatest statistical power. These properties were validated on yeast sequences defined from a ChIP-chip study where the Hybrid metric confirmed the expected dominance of period-10 in nucleosome associated DNA but IPDFT identified more significant occurrences of period-10. Application to the whole genomes of yeast and mouse identified ~ 21% and ~ 19% respectively of these genomes as spanned by period-10 nucleosome positioning sequences (NPS. Conclusions For estimating the dominant period, we find the Hybrid period estimation method empirically to be the most effective for both eroded and approximate periodicity. The blockwise bootstrap was found to be effective as a significance measure, performing particularly well in the problem of
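
    The confirmatory-testing setup can be illustrated with plain DFT power and a blockwise bootstrap. The sketch below is not the paper's IPDFT or Hybrid measure; the toy indicator track and the block length of 47 (deliberately not a multiple of the test period, so that resampling scrambles phase coherence) are assumptions.

    ```python
    import numpy as np

    def period_power(x, period):
        """DFT power of x at the Fourier bin closest to the given period."""
        x = x - x.mean()
        k = round(len(x) / period)
        return np.abs(np.fft.rfft(x)[k]) ** 2 / len(x)

    def blockwise_bootstrap_p(x, period, block=47, reps=2000, seed=3):
        """One-sided p-value for power at `period` under a blockwise bootstrap.

        The block length should not be a multiple of the test period, so that
        concatenating resampled blocks destroys genome-wide phase coherence
        while preserving short-range structure.
        """
        rng = np.random.default_rng(seed)
        n_blocks = len(x) // block
        blocks = x[:n_blocks * block].reshape(n_blocks, block)
        obs = period_power(blocks.ravel(), period)
        null = np.empty(reps)
        for r in range(reps):
            idx = rng.integers(0, n_blocks, size=n_blocks)  # blocks w/ replacement
            null[r] = period_power(blocks[idx].ravel(), period)
        return (1 + np.sum(null >= obs)) / (1 + reps)

    # Toy "genome track": a noisy period-10 dinucleotide indicator.
    rng = np.random.default_rng(4)
    x = ((np.arange(5000) % 10 == 0) | (rng.random(5000) < 0.05)).astype(float)
    print(blockwise_bootstrap_p(x, period=10))   # small p confirms period 10
    ```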

  9. Fast gain calibration in radio astronomy using alternating direction implicit methods: Analysis and applications

    CERN Document Server

    Salvini, Stefano

    2014-01-01

    Context. Modern radio astronomical arrays have (or will have) more than one order of magnitude more receivers than classical synthesis arrays, such as the VLA and the WSRT. This makes gain calibration a computationally demanding task. Several alternating direction implicit (ADI) approaches have therefore been proposed that reduce numerical complexity for this task from $\mathcal{O}(P^3)$ to $\mathcal{O}(P^2)$, where $P$ is the number of receive paths to be calibrated. Aims. We present an ADI method, show that it converges to the optimal solution, and assess its numerical, computational and statistical performance. We also discuss its suitability for application in self-calibration and report on its successful application in LOFAR standard pipelines. Methods. Convergence is proved by rigorous mathematical analysis using a contraction mapping. Its numerical, algorithmic, and statistical performance, as well as its suitability for application in self-calibration, are assessed using simulations. Results. Our simu...
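
    The flavour of such O(P^2)-per-iteration schemes can be conveyed by a generic alternating solver for R ≈ diag(g) M diag(g)^H, updating one gain at a time with the others held fixed. This is a hedged sketch with simple damping, not the exact published algorithm, and the toy data are invented.

    ```python
    import numpy as np

    def adi_gain_calibration(R, M, n_iter=200, tol=1e-10):
        """Estimate g in R ~ diag(g) M diag(g)^H, one gain at a time."""
        P = R.shape[0]
        g = np.ones(P, dtype=complex)
        for it in range(n_iter):
            g_prev = g.copy()
            for p in range(P):
                z = g_prev * M[:, p]          # model column with current gains
                g[p] = np.vdot(R[:, p], z) / np.vdot(z, z).real
            if it % 2 == 1:
                g = 0.5 * (g + g_prev)        # damping step to aid convergence
            if np.linalg.norm(g - g_prev) <= tol * np.linalg.norm(g):
                break
        return g

    # Toy check: build R from known gains and a Hermitian model covariance.
    rng = np.random.default_rng(2)
    P = 8
    B = rng.normal(size=(P, P)) + 1j * rng.normal(size=(P, P))
    M = B @ B.conj().T
    g_true = rng.uniform(0.5, 1.5, P) * np.exp(1j * rng.uniform(0, 2*np.pi, P))
    R = np.outer(g_true, g_true.conj()) * M

    g_est = adi_gain_calibration(R, M)
    phase = np.exp(-1j * np.angle(g_est[0] / g_true[0]))  # fix the global phase
    print("max gain error:", np.max(np.abs(g_est * phase - g_true)))
    ```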

  10. System Synthesis in Preliminary Aircraft Design Using Statistical Methods

    Science.gov (United States)

    DeLaurentis, Daniel; Mavris, Dimitri N.; Schrage, Daniel P.

    1996-01-01

    This paper documents an approach to conceptual and early preliminary aircraft design in which system synthesis is achieved using statistical methods, specifically Design of Experiments (DOE) and Response Surface Methodology (RSM). These methods are employed in order to more efficiently search the design space for optimum configurations. In particular, a methodology incorporating three uses of these techniques is presented. First, response surface equations are formed which represent aerodynamic analyses, in the form of regression polynomials, which are more sophisticated than generally available in early design stages. Next, a regression equation for an Overall Evaluation Criterion is constructed for the purpose of constrained optimization at the system level. This optimization, though achieved in an innovative way, is still traditional in that it is a point design solution. The methodology put forward here remedies this by introducing uncertainty into the problem, resulting in solutions which are probabilistic in nature. DOE/RSM is used for the third time in this setting. The process is demonstrated through a detailed aero-propulsion optimization of a High Speed Civil Transport. Fundamental goals of the methodology, then, are to introduce higher fidelity disciplinary analyses to the conceptual aircraft synthesis and provide a roadmap for transitioning from point solutions to probabilistic designs (and eventually robust ones).
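
    A minimal response-surface step in this spirit: fit a full second-order polynomial to a three-level factorial design and use it as a cheap surrogate for the disciplinary analysis. The two coded design variables and the stand-in analysis function are assumptions of the sketch, not the paper's aero-propulsion case.

    ```python
    import numpy as np

    def quad_features(X):
        """Full second-order model terms [1, x1, x2, x1^2, x2^2, x1*x2]."""
        x1, x2 = X[:, 0], X[:, 1]
        return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

    # 3-level full-factorial design in coded units (-1, 0, +1): 9 runs.
    levels = (-1.0, 0.0, 1.0)
    X = np.array([(a, b) for a in levels for b in levels])

    def analysis(X):
        """Stand-in for an expensive disciplinary analysis (assumed)."""
        x1, x2 = X[:, 0], X[:, 1]
        return 5.0 + 2.0 * x1 - x2 + 0.5 * x1 * x2 + 0.3 * x2**2

    y = analysis(X)
    beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)
    print("response surface coefficients:", np.round(beta, 3))
    ```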

  11. Evaluation of alternative methods for the disinfection of toothbrushes

    Directory of Open Access Journals (Sweden)

    Edson Yukio Komiyama

    2010-03-01

    Full Text Available The aim of this study was to evaluate alternative methods for the disinfection of toothbrushes, considering that most of the previously proposed methods are expensive and cannot be easily implemented. Two hundred toothbrushes with standardized dimensions and bristles were included in the study. The toothbrushes were divided into 20 experimental groups (n = 10), according to the microorganism considered and the chemical agent used. The toothbrushes were contaminated in vitro by standardized suspensions of Streptococcus mutans, Streptococcus pyogenes, Staphylococcus aureus or Candida albicans. The following disinfectants were tested: 0.12% chlorhexidine digluconate, 50% white vinegar, a triclosan-containing dentifrice solution, and a perborate-based tablet solution. The disinfection method was immersion in the disinfectant for 10 min. After the disinfection procedure, the number of remaining microbial cells was evaluated. The values of cfu/toothbrush of each group of microorganisms after disinfection were compared by Kruskal-Wallis ANOVA and Dunn's test for multiple comparisons (5%). The chlorhexidine digluconate solution was the most effective disinfectant. The triclosan-based dentifrice solution promoted a significant reduction of all microorganisms' counts in relation to the control group. As to the disinfection with 50% vinegar, a significant reduction was observed for all the microorganisms, except for C. albicans. The sodium perborate solution was the least effective against the tested microorganisms. Solutions based on triclosan-containing dentifrice may be considered effective, nontoxic, cost-effective, and an easily applicable alternative for the disinfection of toothbrushes. The vinegar solution reduced the presence of S. aureus, S. mutans and S. pyogenes on toothbrushes.

  12. The European Partnership for Alternative Approaches to Animal Testing (EPAA): Promoting Alternative Methods in Europe and Beyond

    OpenAIRE

    COZIGOU, Gwenole; Crozier, Jonathan; Hendriksen, Coenraad; Manou, Irene; Ramirez-Hernandez, Tzutzuy; Weissenhorn, Renate

    2015-01-01

    Herein we introduce the European Partnership for Alternative Approaches to Animal Testing (EPAA) and its activities, which are focused on international cooperation toward alternative methods. The EPAA is one of the leading organizations in Europe for the promotion of alternative approaches to animal testing. Its innovative public–private partnership structure enables a consensus-driven dialogue across 7 industry sectors to facilitate interaction between regulators and regulated stakeholders....

  13. Hydrologic extremes - an intercomparison of multiple gridded statistical downscaling methods

    Science.gov (United States)

    Werner, Arelia T.; Cannon, Alex J.

    2016-04-01

    Gridded statistical downscaling methods are the main means of preparing climate model data to drive distributed hydrological models. Past work on the validation of climate downscaling methods has focused on temperature and precipitation, with less attention paid to the ultimate outputs from hydrological models. Also, as attention shifts towards projections of extreme events, downscaling comparisons now commonly assess methods in terms of climate extremes, but hydrologic extremes are less well explored. Here, we test the ability of gridded downscaling models to replicate historical properties of climate and hydrologic extremes, as measured in terms of temporal sequencing (i.e. correlation tests) and distributional properties (i.e. tests for equality of probability distributions). Outputs from seven downscaling methods - bias correction constructed analogues (BCCA), double BCCA (DBCCA), BCCA with quantile mapping reordering (BCCAQ), bias correction spatial disaggregation (BCSD), BCSD using minimum/maximum temperature (BCSDX), the climate imprint delta method (CI), and bias corrected CI (BCCI) - are used to drive the Variable Infiltration Capacity (VIC) model over the snow-dominated Peace River basin, British Columbia. Outputs are tested using split-sample validation on 26 climate extremes indices (ClimDEX) and two hydrologic extremes indices (3-day peak flow and 7-day peak flow). To characterize observational uncertainty, four atmospheric reanalyses are used as climate model surrogates and two gridded observational data sets are used as downscaling target data. The skill of the downscaling methods generally depended on reanalysis and gridded observational data set. However, CI failed to reproduce the distribution and BCSD and BCSDX the timing of winter 7-day low-flow events, regardless of reanalysis or observational data set. Overall, DBCCA passed the greatest number of tests for the ClimDEX indices, while BCCAQ, which is designed to more accurately resolve event

  14. An alternative method for order tracking using autopower spectrum

    Directory of Open Access Journals (Sweden)

    Guido R Guercioni

    2015-11-01

    Full Text Available Order tracking is a method of analysis used by engineers in the diagnosis of rotating machinery. In many applications, order analysis of non-stationary signals is required. The direct extraction of the amplitude information from the short-time Fourier transform may lead to inaccurate vibration-level estimation in the case of fast changes in the signal frequency content. This article discusses spectral smearing, which is the main cause of the problem, and its sensitivity to the characteristics of the signal (frequency and amplitude variations) and to the input parameters of discrete Fourier transform analysis (window size and type). Through the years, many different approaches to perform order analysis have been developed; this article introduces a novel method for order tracking based on the short-time Fourier transform, which compensates for the smearing effect based on invariant information contained in the autopower spectrum. The limitations and capabilities of the proposed method with respect to other existing techniques are discussed: considering the accuracy of the results, low requirements of computational resources, and ease of implementation, this method proves to be a valid alternative to currently used techniques.

  15. Improved statistical method for temperature and salinity quality control

    Science.gov (United States)

    Gourrion, Jérôme; Szekely, Tanguy

    2017-04-01

    Climate research and ocean monitoring benefit from the continuous development of global in-situ hydrographic networks over the last decades. Apart from the increasing volume of observations available on a large range of temporal and spatial scales, a critical aspect concerns the ability to constantly improve the quality of the datasets. In the context of the Coriolis Dataset for ReAnalysis (CORA) version 4.2, a new quality control method based on a local comparison to the historical extreme values ever observed has been developed, implemented and validated. Temperature, salinity and potential density validity intervals are estimated directly from the minimum and maximum values of a historical reference dataset, rather than from traditional mean and standard deviation estimates. Such an approach avoids strong statistical assumptions on the data distributions, such as unimodality, absence of skewness and spatially homogeneous kurtosis. As a new feature, it also allows the two main objectives of an automatic quality control strategy to be addressed simultaneously, i.e. maximizing the number of good detections while minimizing the number of false alarms. The reference dataset is presently built from the fusion of 1) all Argo profiles up to late 2015, 2) 3 historical CTD datasets and 3) the sea-mammal CTD profiles from the MEOP database. All datasets are extensively and manually quality controlled. In this communication, the latest method validation results are also presented. The method has already been implemented in the latest version of the delayed-time CMEMS in-situ dataset and will soon be deployed in the equivalent near-real-time products.
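
    The core of the validity-interval idea can be sketched directly: flag a new value if it falls outside the extremes ever observed in a local historical reference set, optionally widened by a tolerance. The bin granularity and the tolerance below are assumptions for illustration, not the CORA 4.2 settings.

    ```python
    import numpy as np

    def build_validity_interval(reference_values, tolerance=0.1):
        """Validity interval from the extreme values ever observed locally."""
        return (np.min(reference_values) - tolerance,
                np.max(reference_values) + tolerance)

    def qc_flag(value, interval):
        lo, hi = interval
        return "good" if lo <= value <= hi else "suspect"

    # Historical temperatures (deg C) for one depth bin in one region (toy data).
    history = np.array([12.1, 12.8, 13.4, 11.9, 13.0, 12.5])
    interval = build_validity_interval(history)
    for t in (12.7, 15.3):
        print(t, "->", qc_flag(t, interval))
    ```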

  16. On the statistical equivalence of restrained-ensemble simulations with the maximum entropy method.

    Science.gov (United States)

    Roux, Benoît; Weare, Jonathan

    2013-02-28

    An issue of general interest in computer simulations is to incorporate information from experiments into a structural model. An important caveat in pursuing this goal is to avoid corrupting the resulting model with spurious and arbitrary biases. While the problem of biasing thermodynamic ensembles can be formulated rigorously using the maximum entropy method introduced by Jaynes, the approach can be cumbersome in practical applications with the need to determine multiple unknown coefficients iteratively. A popular alternative strategy to incorporate the information from experiments is to rely on restrained-ensemble molecular dynamics simulations. However, the fundamental validity of this computational strategy remains in question. Here, it is demonstrated that the statistical distribution produced by restrained-ensemble simulations is formally consistent with the maximum entropy method of Jaynes. This clarifies the underlying conditions under which restrained-ensemble simulations will yield results that are consistent with the maximum entropy method.
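
    A toy version of the maximum entropy reweighting at issue: given samples from an unbiased ensemble, find the exponential tilt that matches an experimental average while staying as close as possible, in the Kullback-Leibler sense, to the original ensemble. The observable and the target value are invented for the example.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    rng = np.random.default_rng(6)
    f = rng.normal(size=10000)        # observable evaluated on ensemble samples
    target = 0.4                      # "experimental" average to be matched

    def tilted_mean(lam):
        logw = lam * f
        w = np.exp(logw - logw.max())         # numerically stable weights
        w /= w.sum()
        return np.sum(w * f)

    # Solve for the Lagrange multiplier that reproduces the target average.
    lam = brentq(lambda l: tilted_mean(l) - target, -10.0, 10.0)
    print(f"lambda = {lam:.3f}, reweighted mean = {tilted_mean(lam):.3f}")
    ```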

  17. Determination of Reference Catalogs for Meridian Observations Using Statistical Method

    Science.gov (United States)

    Li, Z. Y.

    2014-09-01

    The meridian observational data are useful for developing high-precision planetary ephemerides of the solar system. These historical data are provided by the Jet Propulsion Laboratory (JPL) or the Institut de Mécanique Céleste et de Calcul des Éphémérides (IMCCE). However, we find that the reference systems (realized by the fundamental catalogs FK3 (Third Fundamental Catalogue), FK4 (Fourth Fundamental Catalogue), and FK5 (Fifth Fundamental Catalogue), or Hipparcos), to which the observations are referred, are not given explicitly for some sets of data. This incompleteness of information prevents us from eliminating the systematic effects due to the different fundamental catalogs. The purpose of this paper is to clearly specify the reference catalogs of those observations whose records are problematic, by using the JPL DE421 ephemeris. The data for the corresponding planets in the geocentric celestial reference system (GCRS) obtained from DE421 are transformed to apparent places under different hypotheses regarding the reference catalogs. The validity of each hypothesis is then tested by two kinds of statistical quantities, which indicate the significance of the difference between the original and transformed data series. As a result, this method proves effective for specifying the reference catalogs, and the missing information is determined unambiguously. Finally, these meridian data are transformed to the GCRS for further applications in the development of planetary ephemerides.

  18. Methods in probability and statistical inference. Final report, June 15, 1975-June 30, 1979. [Dept. of Statistics, Univ. of Chicago

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, D L; Perlman, M D

    1980-06-01

    This report describes the research activities of the Department of Statistics, University of Chicago, during the period June 15, 1975 to July 30, 1979. Nine research projects are briefly described on the following subjects: statistical computing and approximation techniques in statistics; numerical computation of first passage distributions; probabilities of large deviations; combining independent tests of significance; small-sample efficiencies of tests and estimates; improved procedures for simultaneous estimation and testing of many correlations; statistical computing and improved regression methods; comparison of several populations; and unbiasedness in multivariate statistics. A description of the statistical consultation activities of the Department that are of interest to DOE, in particular, the scientific interactions between the Department and the scientists at Argonne National Laboratories, is given. A list of publications issued during the term of the contract is included.

  19. ALTERNATIVE FIELD METHODS TO TREAT MERCURY IN SOIL

    Energy Technology Data Exchange (ETDEWEB)

    Ernest F. Stine Jr; Steven T. Downey

    2002-08-14

    The U.S. Department of Energy (DOE) used large quantities of mercury in the uranium separation process from the 1950s until the late 1980s in support of national defense. Some of this mercury, as well as other hazardous metals and radionuclides, found its way into, and under, several buildings, surface and subsurface soils, and into some of the surface waters. Several of these areas may pose potential health or environmental risks and must be dealt with under current environmental regulations. DOE's National Energy Technology Laboratory (NETL) awarded the contract ''Alternative Field Methods to Treat Mercury in Soil'' to IT Group, Knoxville, TN (IT) and its subcontractor NFS, Erwin, TN, to identify remedial methods to clean up mercury-contaminated high-clay-content soils using proven treatment chemistries. The sites of interest were the Y-12 National Security Complex located in Oak Ridge, Tennessee, the David Witherspoon properties located in Knoxville, Tennessee, and other similarly contaminated sites. The primary laboratory-scale contract objectives were (1) to safely retrieve and test samples of contaminated soil in an approved laboratory and (2) to determine an acceptable treatment method to ensure that the mercury does not leach from the soil above regulatory levels. The leaching requirements were to meet the TC (0.2 mg/l) and UTS (0.025 mg/l) TCLP criteria. In-situ treatments were preferred, to control the potential mercury vapor emissions and liquid mercury spills associated with ex-situ treatments. All laboratory work was conducted in IT's and NFS's laboratories. Mercury-contaminated nonradioactive soil from under the Alpha 2 building in the Y-12 complex was used. This soil contained insufficient levels of leachable mercury and resulted in TCLP mercury concentrations that were similar to the applicable LDR limits. The soil was spiked at multiple levels with metallic (up to 6000 mg/l) and soluble mercury compounds (up to 500 mg/kg) to

  20. Statistical methods for decision making in mine action

    DEFF Research Database (Denmark)

    Larsen, Jan

    The lecture discusses the basics of statistical decision making in connection with humanitarian mine action. There is special focus on: 1) requirements for mine detection; 2) design and evaluation of mine equipment; 3) performance improvement by statistical learning and information fusion; 4...

  1. An alternative method for centrifugal compressor loading factor modelling

    Science.gov (United States)

    Galerkin, Y.; Drozdov, A.; Rekstin, A.; Soldatova, K.

    2017-08-01

    The loading factor at the design point is calculated by one or another empirical formula in classical design methods, and performance modelling as a whole is out of consideration. Test data from compressor stages demonstrate that the loading factor versus the flow coefficient at the impeller exit has a linear character, independent of compressibility. The known Universal Modelling Method exploits this fact. Two points define the function – the loading factor at the design point and at zero flow rate. The corresponding formulae include empirical coefficients. A good modelling result is possible if the choice of coefficients is based on experience and close analogs. Earlier, Y. Galerkin and K. Soldatova proposed to define the loading factor performance by the angle of its inclination to the ordinate axis and by the loading factor at zero flow rate. Simple and definite equations with four geometry parameters were proposed for the loading factor performance calculated for inviscid flow. The authors of this publication have studied the test performance of thirteen stages of different types. The equations are proposed with universal empirical coefficients. The calculation error lies in the range of plus or minus 1.5%. The alternative model of loading factor performance is included in new versions of the Universal Modelling Method.

  2. Statistics a guide to the use of statistical methods in the physical sciences

    CERN Document Server

    Barlow, Roger J

    1989-01-01

    The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A.C. Phillips Computing for Scienti

  3. ALTERNATIVE FIELD METHODS TO TREAT MERCURY IN SOIL

    Energy Technology Data Exchange (ETDEWEB)

    Ernie F. Stine

    2002-08-14

    The Department of Energy (DOE) currently has mercury (Hg) contaminated materials and soils at various sites. Figure 1-1 (from http://www.ct.ornl.gov/stcg.hg/) shows the estimated distribution of mercury-contaminated waste at the various DOE sites. The Oak Ridge and Idaho sites have the largest deposits of contaminated materials. The majority of these contaminated materials are soils, sludges, debris, and waste waters. This project concerns treatment of mercury-contaminated soils. The technology is applicable to many DOE sites, in particular the Y-12 National Security Complex in Oak Ridge, Tennessee, and the Idaho National Engineering and Environmental Laboratory (INEEL). These sites have the majority of the soils and sediments contaminated with mercury. The soils may also be contaminated with other hazardous metals and radionuclides. At the Y-12 plant, the baseline treatment method for mercury-contaminated soil is low temperature thermal desorption (LTTD), followed by on-site landfill disposal. LTTD is relatively expensive (the estimated cost of treatment, excluding disposal costs for the collected mercury, is greater than $740 per cubic yard [cy] at Y-12) and does not treat any of the metals or radionuclides. DOE is seeking a less costly alternative to the baseline technology. As described in the solicitation (DE-RA-01NT41030), this project initially focused on evaluating cost-effective in-situ alternatives to stabilize or remove the mercury (Hg) contamination from high-clay-content soil. It was believed that ex-situ treatment of soil contaminated with significant quantities of free-liquid mercury might pose challenges during excavation and handling. Such challenges may include controlling potential mercury vapors and containing liquid mercury beads. As described below, the focus of this project was expanded to include consideration of ex-situ treatment after award of the contract to International Technology Corporation (IT). After award of the contract, IT became part of Shaw

  4. Alternative methods for determining shrinkage in restorative resin composites.

    Science.gov (United States)

    de Melo Monteiro, Gabriela Queiroz; Montes, Marcos Antonio Japiassú Resende; Rolim, Tiago Vieira; de Oliveira Mota, Cláudia Cristina Brainer; de Barros Correia Kyotoku, Bernardo; Gomes, Anderson Stevens Leônidas; de Freitas, Anderson Zanardi

    2011-08-01

    The purpose of this study was to evaluate polymerization shrinkage of resin composites using a coordinate measuring machine, optical coherence tomography and a more widely known method, the Archimedes principle. Two null hypotheses were tested: (1) there are no differences between the materials tested; (2) there are no differences between the methods used for polymerization shrinkage measurements. Polymerization shrinkage of seven resin-based dental composites (Filtek Z250™, Filtek Z350™, Filtek P90™/3M ESPE; Esthet-X™, TPH Spectrum™/Dentsply; 4 Seasons™, Tetric Ceram™/Ivoclar-Vivadent) was measured. For the coordinate measuring machine measurements, composites were applied to a cylindrical Teflon mold (7 mm × 2 mm), polymerized and removed from the mold. The difference between the volume of the mold and the volume of the specimen was calculated as a percentage. Optical coherence tomography was also used for linear shrinkage evaluations. The thickness of the specimens was measured before and after photoactivation. Polymerization shrinkage was also measured using the Archimedes principle of buoyancy (n=5). Statistical analysis of the data was performed with ANOVA and the Games-Howell test. The results show that polymerization shrinkage values vary with the method used. Despite numerical differences, the ranking of the resins was very similar, with Filtek P90 presenting the lowest shrinkage values. Because of the variations in the results, reported values can only be used to compare materials within the same method. However, it is possible to rank composites for polymerization shrinkage and to relate these data across different test methods. Independently of the method used, reduced polymerization shrinkage was found for the silorane resin-based composite. Copyright © 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  5. Robust Control Methods for On-Line Statistical Learning

    Directory of Open Access Journals (Sweden)

    Capobianco Enrico

    2001-01-01

    Full Text Available Ensuring that the data processing in an experiment is not affected by the presence of outliers is a relevant issue in statistical control and learning studies. Learning schemes should therefore be tested for their capacity to handle outliers in the observed training set, so as to achieve reliable estimates with respect to the crucial bias and variance aspects. We describe possible ways of endowing neural networks with statistically robust properties by defining feasible error criteria. It is convenient to cast neural nets in state-space representations and to apply both Kalman filter and stochastic approximation procedures in order to suggest statistically robustified solutions for on-line learning.
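
    A minimal sketch of a statistically robust on-line update in this spirit: stochastic approximation for a linear model with a Huber-type bounded influence function, which caps the pull of outlying observations. The learning-rate schedule, the clipping threshold and the heavy-tailed noise model are assumptions of the sketch.

    ```python
    import numpy as np

    def huber_psi(r, delta=1.0):
        """Bounded influence function: linear near zero, clipped for outliers."""
        return np.clip(r, -delta, delta)

    rng = np.random.default_rng(7)
    w_true = np.array([2.0, -1.0])
    w = np.zeros(2)
    for t in range(1, 5001):
        x = rng.normal(size=2)
        y = w_true @ x + rng.standard_t(df=1.5)   # heavy-tailed, outlier-prone noise
        r = y - w @ x                             # residual of the current estimate
        w += (1.0 / t) * huber_psi(r) * x         # robustified stochastic step
    print("estimate:", np.round(w, 2))
    ```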

  6. Statistical methods in joint modeling of longitudinal and survival data

    Science.gov (United States)

    Dempsey, Walter

    Survival studies often generate not only a survival time for each patient but also a sequence of health measurements at annual or semi-annual check-ups while the patient remains alive. Such a sequence of random length accompanied by a survival time is called a survival process. Ordinarily robust health is associated with longer survival, so the two parts of a survival process cannot be assumed independent. The first part of the thesis is concerned with a general technique---reverse alignment---for constructing statistical models for survival processes. A revival model is a regression model in the sense that it incorporates covariate and treatment effects into both the distribution of survival times and the joint distribution of health outcomes. The revival model also determines a conditional survival distribution given the observed history, which describes how the subsequent survival distribution is determined by the observed progression of health outcomes. The second part of the thesis explores the concept of a consistent exchangeable survival process---a joint distribution of survival times in which the risk set evolves as a continuous-time Markov process with homogeneous transition rates. A correspondence with the de Finetti approach of constructing an exchangeable survival process by generating iid survival times conditional on a completely independent hazard measure is shown. Several specific processes are detailed, showing how the number of blocks of tied failure times grows asymptotically with the number of individuals in each case. In particular, we show that the set of Markov survival processes with weakly continuous predictive distributions can be characterized by a two-dimensional family called the harmonic process. The outlined methods are then applied to data, showing how they can be easily extended to handle censoring and inhomogeneity among patients.

  7. A comparative assessment of statistical methods for extreme weather analysis

    Science.gov (United States)

    Schlögl, Matthias; Laaha, Gregor

    2017-04-01

    Extreme weather exposure assessment is of major importance for scientists and practitioners alike. We compare different extreme value approaches and fitting methods with respect to their value for assessing extreme precipitation and temperature impacts. Based on an Austrian data set from 25 meteorological stations representing diverse meteorological conditions, we assess the added value of partial duration series over the standardly used annual maxima series in order to give recommendations for performing extreme value statistics of meteorological hazards. Results show the merits of the robust L-moment estimation, which yielded better results than maximum likelihood estimation in 62 % of all cases. At the same time, results question the general assumption of the threshold excess approach (employing partial duration series, PDS) being superior to the block maxima approach (employing annual maxima series, AMS) due to information gain. For low return periods (non-extreme events) the PDS approach tends to overestimate return levels as compared to the AMS approach, whereas an opposite behavior was found for high return levels (extreme events). In extreme cases, an inappropriate threshold was shown to lead to considerable biases that may outperform the possible gain of information from including additional extreme events by far. This effect was neither visible from the square-root criterion, nor from standardly used graphical diagnosis (mean residual life plot), but from a direct comparison of AMS and PDS in synoptic quantile plots. We therefore recommend performing AMS and PDS approaches simultaneously in order to select the best suited approach. This will make the analyses more robust, in cases where threshold selection and dependency introduces biases to the PDS approach, but also in cases where the AMS contains non-extreme events that may introduce similar biases. For assessing the performance of extreme events we recommend conditional performance measures that focus
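
    The AMS branch of such a comparison can be sketched with a GEV fit to annual maxima. Note that scipy provides maximum likelihood fitting only, whereas the study favours L-moment estimation in most cases, so the sketch below shows the ML variant on invented data; an L-moment fit would require a dedicated package.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    annual_max = stats.genextreme.rvs(c=-0.1, loc=40.0, scale=8.0,
                                      size=60, random_state=rng)   # toy AMS

    c, loc, scale = stats.genextreme.fit(annual_max)   # ML estimates
    for T in (10, 50, 100):                            # return periods in years
        level = stats.genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)
        print(f"{T:>3}-year return level: {level:.1f}")
    ```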

  8. Statistical Models and Methods for Network Meta-Analysis.

    Science.gov (United States)

    Madden, L V; Piepho, H-P; Paul, P A

    2016-08-01

    Meta-analysis, the methodology for analyzing the results from multiple independent studies, has grown tremendously in popularity over the last four decades. Although most meta-analyses involve a single effect size (summary result, such as a treatment difference) from each study, there are often multiple treatments of interest across the network of studies in the analysis. Multi-treatment (or network) meta-analysis can be used for simultaneously analyzing the results from all the treatments. However, the methodology is considerably more complicated than for the analysis of a single effect size, and there have not been adequate explanations of the approach for agricultural investigations. We review the methods and models for conducting a network meta-analysis based on frequentist statistical principles, and demonstrate the procedures using a published multi-treatment plant pathology data set. A major advantage of network meta-analysis is that correlations of estimated treatment effects are automatically taken into account when an appropriate model is used. Moreover, treatment comparisons may be possible in a network meta-analysis that are not possible in a single study because all treatments of interest may not be included in any given study. We review several models that consider the study effect as either fixed or random, and show how to interpret model-fitting output. We further show how to model the effect of moderator variables (study-level characteristics) on treatment effects, and present one approach to test for the consistency of treatment effects across the network. Online supplemental files give explanations on fitting the network meta-analytical models using SAS.

  9. Understanding data better with Bayesian and global statistical methods

    CERN Document Server

    Press, W H

    1996-01-01

    To understand their data better, astronomers need to use statistical tools that are more advanced than traditional ``freshman lab'' statistics. As an illustration, the problem of combining apparently incompatible measurements of a quantity is presented from both the traditional, and a more sophisticated Bayesian, perspective. Explicit formulas are given for both treatments. Results are shown for the value of the Hubble Constant, and a 95% confidence interval of 66 < H0 < 82 (km/s/Mpc) is obtained.
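
    The kind of Bayesian combination discussed here can be sketched as a mixture likelihood in which each measurement is either "good" (Gaussian with its quoted error) or "bad" (a much broader Gaussian), marginalised on a grid. The toy values, the prior probability and the width of the "bad" component are assumptions, not the paper's Hubble-constant analysis.

    ```python
    import numpy as np

    x = np.array([68.0, 72.0, 55.0, 75.0, 90.0])   # toy measurements
    s = np.array([3.0, 4.0, 5.0, 6.0, 10.0])       # quoted 1-sigma errors
    p_good = 0.5                                   # prior prob. a result is "good"
    bad_scale = 10.0                               # "bad" errors are 10x the quote

    grid = np.linspace(40.0, 110.0, 1401)          # flat prior over this range
    post = np.ones_like(grid)
    for xi, si in zip(x, s):
        good = np.exp(-0.5 * ((grid - xi) / si) ** 2) / si
        bad = np.exp(-0.5 * ((grid - xi) / (bad_scale * si)) ** 2) / (bad_scale * si)
        post *= p_good * good + (1.0 - p_good) * bad   # mixture likelihood

    dx = grid[1] - grid[0]
    post /= post.sum() * dx                        # normalise the posterior
    mean = (grid * post).sum() * dx
    sd = np.sqrt((((grid - mean) ** 2) * post).sum() * dx)
    print(f"combined value: {mean:.1f} +/- {sd:.1f}")
    ```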

  10. Teaching biology through statistics: application of statistical methods in genetics and zoology courses.

    Science.gov (United States)

    Colon-Berlingeri, Migdalisel; Burrowes, Patricia A

    2011-01-01

    Incorporation of mathematics into biology curricula is critical to underscore for undergraduate students the relevance of mathematics to most fields of biology and the usefulness of developing quantitative process skills demanded in modern biology. At our institution, we have made significant changes to better integrate mathematics into the undergraduate biology curriculum. The curricular revision included changes in the suggested course sequence, addition of statistics and precalculus as prerequisites to core science courses, and incorporating interdisciplinary (math-biology) learning activities in genetics and zoology courses. In this article, we describe the activities developed for these two courses and the assessment tools used to measure the learning that took place with respect to biology and statistics. We distinguished the effectiveness of these learning opportunities in helping students improve their understanding of the math and statistical concepts addressed and, more importantly, their ability to apply them to solve a biological problem. We also identified areas that need emphasis in both biology and mathematics courses. In light of our observations, we recommend best practices that biology and mathematics academic departments can implement to train undergraduates for the demands of modern biology.

  11. Failure of the Volume Function in Granular Statistical Mechanics and an Alternative Formulation.

    Science.gov (United States)

    Blumenfeld, Raphael; Amitai, Shahar; Jordan, Joe F; Hihinashvili, Rebecca

    2016-04-08

    We first show that the currently accepted statistical mechanics for granular matter is flawed. The reason is that it is based on the volume function, which depends only on a minute fraction of all the structural degrees of freedom and is unaffected by most of the configurational microstates. Consequently, the commonly used partition function underestimates the entropy severely. We then propose a new formulation, replacing the volume function with a connectivity function that depends on all the structural degrees of freedom and accounts correctly for the entire entropy. We discuss the advantages of the new formalism and derive explicit results for two- and three-dimensional systems. We test the formalism by calculating the entropy of an experimental two-dimensional system, as a function of system size, and showing that it is an extensive variable.

  12. Understanding Alternative Education: A Mixed Methods Examination of Student Experiences

    Science.gov (United States)

    Farrelly, Susan Glassett; Daniels, Erika

    2014-01-01

    Alternative education plays a critical role in the opportunity gap that persists in the US public education system. However, there has been little research on alternative schools. Scaffolded by a theoretical framework constructed from critical theory, self-determination theory (SDT) and student voice, this research examined how well students in…

  13. 77 FR 8865 - Recent Postings of Broadly Applicable Alternative Test Methods

    Science.gov (United States)

    2012-02-15

    ... AGENCY Recent Postings of Broadly Applicable Alternative Test Methods AGENCY: Environmental Protection.... Background Broadly applicable alternative test method approval decisions made by the EPA in 2011 under the...). Source owners and operators may voluntarily use these broadly applicable alternative test methods subject...

  14. Cluster Size Statistic and Cluster Mass Statistic: Two Novel Methods for Identifying Changes in Functional Connectivity Between Groups or Conditions

    Science.gov (United States)

    Ing, Alex; Schwarzbauer, Christian

    2014-01-01

    Functional connectivity has become an increasingly important area of research in recent years. At a typical spatial resolution, approximately 300 million connections link each voxel in the brain with every other. This pattern of connectivity is known as the functional connectome. Connectivity is often compared between experimental groups and conditions. Standard methods used to control the type 1 error rate are likely to be insensitive when comparisons are carried out across the whole connectome, due to the huge number of statistical tests involved. To address this problem, two new cluster based methods – the cluster size statistic (CSS) and cluster mass statistic (CMS) – are introduced to control the family wise error rate across all connectivity values. These methods operate within a statistical framework similar to the cluster based methods used in conventional task based fMRI. Both methods are data driven, permutation based and require minimal statistical assumptions. Here, the performance of each procedure is evaluated in a receiver operator characteristic (ROC) analysis, utilising a simulated dataset. The relative sensitivity of each method is also tested on real data: BOLD (blood oxygen level dependent) fMRI scans were carried out on twelve subjects under normal conditions and during the hypercapnic state (induced through the inhalation of 6% CO2 in 21% O2 and 73% N2). Both CSS and CMS detected significant changes in connectivity between normal and hypercapnic states. A family wise error correction carried out at the individual connection level exhibited no significant changes in connectivity. PMID:24906136

  16. An alternative calibration method for counting P-32 reactor monitors

    Energy Technology Data Exchange (ETDEWEB)

    Quirk, T.J. [Applied Nuclear Technologies, Sandia National Laboratories, MS 1143, PO Box 5800, Albuquerque, NM 87185-1143 (United States); Vehar, D.W. [Sandia National Laboratories, Albuquerque, NM 87185-1143 (United States)

    2011-07-01

    Radioactivation of sulfur is a common technique used to measure fast neutron fluences in test and research reactors. Elemental sulfur can be pressed into pellets and used as monitors. The {sup 32}S(n, p) {sup 32}P reaction has a practical threshold of about 3 MeV and its cross section and associated uncertainties are well characterized [1]. The product {sup 32}P emits a beta particle with a maximum energy of 1710 keV [2]. This energetic beta particle allows pellets to be counted intact. ASTM Standard Test Method for Measuring Reaction Rates and Fast-Neutron Fluences by Radioactivation of Sulfur-32 (E265) [3] details a method of calibration for counting systems and subsequent analysis of results. This method requires irradiation of sulfur monitors in a fast-neutron field whose spectrum and intensity are well known. The resultant decay-corrected count rate is then correlated to the known fast neutron fluence. The Radiation Metrology Laboratory (RML) at Sandia has traditionally performed calibration irradiations of sulfur pellets using the {sup 252}Cf spontaneous fission neutron source at the National Institute of Standards and Technology (NIST) [4] as a transfer standard. However, decay has reduced the intensity of NIST's source, thus lowering the practical upper limits of available fluence. As of May 2010, neutron emission rates had decayed to approximately 3e8 n/s. In practice, this degradation of capabilities precludes calibrations at the highest fluence levels produced at test reactors and limits the useful range of count rates that can be measured. Furthermore, the reduced availability of replacement {sup 252}Cf threatens the long-term viability of the NIST {sup 252}Cf facility for sulfur pellet calibrations. In lieu of correlating count rate to neutron fluence in a reference field, the total quantity of {sup 32}P produced in a pellet can be determined by absolute counting methods. This offers an attractive alternative to extended {sup 252}Cf exposures because

  17. Alternative production methods to face global molybdenum-99 supply shortage.

    Science.gov (United States)

    Lyra, Maria; Charalambatou, Paraskevi; Roussou, Eirini; Fytros, Stavros; Baka, Irini

    2011-01-01

    The sleeping giant of molybdenum-99 ((99)Mo) production is grinding to a halt and the world is wondering how this happened. Fewer than 10 reactors in the world are capable of producing radionuclides for medicine; approximately 50% of the world's supply of raw material comes from the National Research Universal (NRU) reactor in Canada. Many of these reactors, like the NRU, are aging. None of these reactors, and probably not even all of them in combination, can replace the production of the NRU. As the healthcare industry faces an aging population and the demand for diagnostic services using (99m)Tc continues to rise, the need for a consistent, reliable supply of (99)Mo has become increasingly important, so alternative methods to produce (99)Mo, or even (99m)Tc directly, had to be considered to avoid a supply shortage in the coming years. This need has guided the production of (99)Mo toward replacing the Highly Enriched Uranium (HEU) target in a nuclear reactor with Low Enriched Uranium (LEU) and, furthermore, toward the use of accelerators for manufacturing (99)Mo or for directly producing (99m)Tc.

  18. Assessing Partnership Alternatives in an IT Network Employing Analytical Methods

    Directory of Open Access Journals (Sweden)

    Vahid Reza Salamat

    2016-01-01

    Full Text Available One of the main critical success factors for companies is their ability to build and maintain an effective collaborative network. This is especially critical in the IT industry, where the development of sustainable competitive advantage requires an integration of various resources, platforms, and capabilities provided by various actors. Employing such a collaborative network will dramatically change operations management and promote flexibility and agility. Despite its importance, there is a lack of analytical tools for the collaborative network building process. In this paper, we propose an optimization model employing AHP and multiobjective programming for the collaborative network building process, based on two theories of interorganizational relationships, namely, (i) transaction cost theory and (ii) the resource-based view, which are representative of short-term and long-term considerations. Five different methods were employed to solve the formulation and their performances were compared. The model was implemented in an IT company that was in the process of developing a large-scale enterprise resource planning (ERP) system. The results show that the collaborative network formed through this selection process was more efficient in terms of cost, time, and development speed. The framework offers novel theoretical underpinning and analytical solutions and can be used as an effective tool in selecting network alternatives.
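
    The AHP step in such a model derives criterion weights from a pairwise-comparison matrix via its principal eigenvector and checks judgment consistency. A minimal sketch; the comparison matrix and criteria are illustrative assumptions, and Saaty's random indices are quoted from the standard AHP literature.

```python
import numpy as np

RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}  # Saaty's random indices

def ahp_weights(A):
    """Priority weights from a pairwise-comparison matrix via the
    principal eigenvector, plus Saaty's consistency ratio."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)                  # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                              # normalized weights
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)         # consistency index
    return w, ci / RI[n]

# hypothetical 3-criterion comparison (e.g. cost vs. time vs. capability)
A = np.array([[1,   3,   5],
              [1/3, 1,   2],
              [1/5, 1/2, 1]])
w, cr = ahp_weights(A)
print(w, "CR =", round(cr, 3))    # CR < 0.1 is conventionally acceptable
```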

  19. A six-beam method to measure turbulence statistics using ground-based wind lidars

    Directory of Open Access Journals (Sweden)

    A. Sathe

    2014-10-01

    Full Text Available A so-called six-beam method is proposed to measure atmospheric turbulence using a ground-based wind lidar. This method requires measurement of the radial velocity variances at five equally spaced azimuth angles on the base of a scanning cone and one measurement at the center of the scanning circle, i.e., using a vertical beam at the same height. The scanning configuration is optimized to minimize the sum of the random errors in the measurement of the second-order moments of the components (u, v, w) of the wind field. We present this method as an alternative to the so-called velocity azimuth display (VAD) method that is routinely used in commercial wind lidars, and which usually results in significant averaging effects on measured turbulence. In the VAD method, the high-frequency radial velocity measurements are used instead of their variances. The measurements are performed using a pulsed lidar (WindScanner), and the derived turbulence statistics (using both methods), such as the u and v variances, are compared with those obtained from a reference cup anemometer and a wind vane at 89 m height under different atmospheric stabilities. The measurements show that, in comparison to the reference cup anemometer, depending on the atmospheric stability and the wind field component, the six-beam method measures between 85–101% of the reference turbulence, whereas the VAD method measures between 66–87% of the reference turbulence.
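
    The core of the six-beam method is linear algebra: each beam's radial-velocity variance is a known quadratic combination of the six second-order moments of (u, v, w), so six well-chosen beams yield a solvable linear system. A minimal sketch, assuming five beams on a 45-degree elevation cone plus a vertical beam and made-up input variances; the paper's error-minimizing configuration and high-frequency processing are not reproduced.

```python
import numpy as np

def beam_unit_vector(azimuth_deg, elevation_deg):
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    return np.array([np.cos(el) * np.sin(az),
                     np.cos(el) * np.cos(az),
                     np.sin(el)])

def six_beam_moments(radial_variances, azimuths_deg, elevation_deg=45.0):
    """Solve for (var_u, var_v, var_w, cov_uv, cov_uw, cov_vw) from six
    radial-velocity variances: five cone beams plus a vertical beam."""
    beams = [beam_unit_vector(a, elevation_deg) for a in azimuths_deg]
    beams.append(np.array([0.0, 0.0, 1.0]))          # vertical beam
    rows = [[n[0]**2, n[1]**2, n[2]**2,
             2*n[0]*n[1], 2*n[0]*n[2], 2*n[1]*n[2]] for n in beams]
    m, *_ = np.linalg.lstsq(np.array(rows), radial_variances, rcond=None)
    return m

# toy input: six measured radial-velocity variances (m^2/s^2)
sig_r2 = np.array([1.10, 1.00, 0.90, 1.00, 1.05, 0.40])
print(six_beam_moments(sig_r2, azimuths_deg=[0, 72, 144, 216, 288]))
```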

  20. A six-beam method to measure turbulence statistics using ground-based wind lidars

    Science.gov (United States)

    Sathe, A.; Mann, J.; Vasiljevic, N.; Lea, G.

    2015-02-01

    A so-called six-beam method is proposed to measure atmospheric turbulence using a ground-based wind lidar. This method requires measurement of the radial velocity variances at five equally spaced azimuth angles on the base of a scanning cone and one measurement at the centre of the scanning circle, i.e., using a vertical beam at the same height. The scanning configuration is optimized to minimize the sum of the random errors in the measurement of the second-order moments of the components (u, v, w) of the wind field. We present this method as an alternative to the so-called velocity azimuth display (VAD) method that is routinely used in commercial wind lidars, and which usually results in significant averaging effects on measured turbulence. In the VAD method, the high-frequency radial velocity measurements are used instead of their variances. The measurements are performed using a pulsed lidar (WindScanner), and the derived turbulence statistics (using both methods), such as the u and v variances, are compared with those obtained from a reference cup anemometer and a wind vane at 89 m height under different atmospheric stabilities. The measurements show that, in comparison to the reference cup anemometer, depending on the atmospheric stability and the wind field component, the six-beam method measures between 85 and 101% of the reference turbulence, whereas the VAD method measures between 66 and 87% of the reference turbulence.

  1. The European Partnership for Alternative Approaches to Animal Testing (EPAA): promoting alternative methods in Europe and beyond.

    Science.gov (United States)

    Cozigou, Gwenole; Crozier, Jonathan; Hendriksen, Coenraad; Manou, Irene; Ramirez-Hernandez, Tzutzuy; Weissenhorn, Renate

    2015-03-01

    Herein we introduce the European Partnership for Alternative Approaches to Animal Testing (EPAA) and its activities, which are focused on international cooperation toward alternative methods. The EPAA is one of the leading organizations in Europe for the promotion of alternative approaches to animal testing. Its innovative public-private partnership structure enables a consensus-driven dialogue across 7 industry sectors to facilitate interaction between regulators and regulated stakeholders. Through a brief description of EPAA's activities and organizational structure, we first articulate the value of this collaboration; we then focus on 2 key projects driven by EPAA. The first project aims to address research gaps on stem cells for safety testing, whereas the second project strives for an approach toward demonstration of consistency in vaccine batch release testing. We highlight the growing need for harmonization of international acceptance and implementation of alternative approaches and for increased international collaboration to foster progress on nonanimal alternatives.

  2. Statistical methods and applications from a historical perspective selected issues

    CERN Document Server

    Mignani, Stefania

    2014-01-01

    The book showcases a selection of peer-reviewed papers, the preliminary versions of which were presented at a conference held 11-13 June 2011 in Bologna and organized jointly by the Italian Statistical Society (SIS), the National Institute of Statistics (ISTAT) and the Bank of Italy. The theme of the conference was "Statistics in the 150 years of the Unification of Italy." The celebration of the anniversary of Italian unification provided the opportunity to examine and discuss methodological aspects and applications from a historical perspective, both from a national and an international point of view. The critical discussion of the issues of the past has made it possible to focus on recent advances, considering the studies of socio-economic and demographic changes in European countries.

  3. Statistic Non-Parametric Methods of Measurement and Interpretation of Existing Statistic Connections within Seaside Hydro Tourism

    OpenAIRE

    MIRELA SECARĂ

    2008-01-01

    Tourism represents an important field of economic and social life in our country, and the main sector of the economy of Constanta County is the balneary touristic capitalization of the Romanian seaside. In order to statistically analyze hydro tourism on the Romanian seaside, we have applied non-parametric methods for measuring and interpreting the statistical connections within seaside hydro tourism. The major objective of this research is the re-establishment of hydro tourism on the Romanian ...
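
    Rank-based measures such as Spearman's coefficient are the workhorse of this kind of non-parametric association analysis, since they assume neither normality nor linearity. A minimal sketch with made-up monthly figures standing in for the tourism indicators:

```python
import numpy as np
from scipy.stats import spearmanr

# hypothetical monthly figures: tourist arrivals vs. hydro-treatment sessions
tourists = np.array([12.1, 15.4, 30.2, 55.8, 80.3, 95.0, 70.2, 40.1])
sessions = np.array([3.2, 4.1, 9.8, 16.5, 25.0, 28.3, 21.7, 11.9])

rho, p = spearmanr(tourists, sessions)   # rank-based, distribution-free
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```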

  4. TREATMENT BY ALTERNATIVE METHODS OF REGRESSION GAS CHROMATOGRAPHIC RETENTION INDICES OF 35 PYRAZINES

    Directory of Open Access Journals (Sweden)

    Fatiha Mebarki

    2016-02-01

    Full Text Available The study compares two alternative regression methods: a non-parametric method, least absolute deviation (LAD), and the traditional diagnostic method, ordinary least squares (OLS). Both were applied to model, separately, the retention indices of the same set of 35 pyrazines (27 pyrazines together with 8 other pyrazines in the same group) eluted on the OV-101 and Carbowax-20M columns, using theoretical molecular descriptors calculated with the DRAGON software. The detection of influential observations for the non-parametric LAD method is a problem that has been studied extensively and offers alternative approaches whose main feature is robustness; it is presented here and compared with standard least squares regression. The comparison between the LAD and OLS methods is based on the equation of the fitted hyperplane, in order to confirm robustness, to detect outliers and leverage points, and to validate the results using statistical tests (Anderson-Darling, Shapiro-Wilk, D'Agostino, Jarque-Bera), a graphical test (frequency histogram) and confidence intervals, so as to check whether the distribution of the errors is approximately normal.
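
    LAD regression minimizes the sum of absolute residuals rather than squared ones, which is what makes it robust to outliers that would dominate an OLS fit. A minimal sketch using iteratively reweighted least squares, one standard way to compute the L1 fit; the synthetic descriptor data and injected outliers are assumptions, not the pyrazine dataset.

```python
import numpy as np

def lad_fit(X, y, n_iter=50, eps=1e-8):
    """Least-absolute-deviation regression via iteratively reweighted
    least squares: weight w_i = 1/max(|r_i|, eps) approximates the L1 fit."""
    Xd = np.column_stack([np.ones(len(y)), X])        # add intercept
    beta = np.linalg.lstsq(Xd, y, rcond=None)[0]      # OLS starting point
    for _ in range(n_iter):
        r = y - Xd @ beta
        w = 1.0 / np.maximum(np.abs(r), eps)
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(Xd * sw[:, None], y * sw, rcond=None)[0]
    return beta

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, (40, 1))                       # one synthetic descriptor
y = 2.0 + 3.0 * X[:, 0] + rng.normal(0, 1, 40)
y[::10] += 25                                         # inject gross outliers
print("LAD:", lad_fit(X, y))                          # robust to the outliers
print("OLS:", np.linalg.lstsq(np.column_stack([np.ones(40), X]), y,
                              rcond=None)[0])         # pulled by the outliers
```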

  5. Debating Curricular Strategies for Teaching Statistics and Research Methods: What Does the Current Evidence Suggest?

    Science.gov (United States)

    Barron, Kenneth E.; Apple, Kevin J.

    2014-01-01

    Coursework in statistics and research methods is a core requirement in most undergraduate psychology programs. However, is there an optimal way to structure and sequence methodology courses to facilitate student learning? For example, should statistics be required before research methods, should research methods be required before statistics, or…

  7. Statistical Methods and Software for the Analysis of Occupational Exposure Data with Non-detectable Values

    Energy Technology Data Exchange (ETDEWEB)

    Frome, EL

    2005-09-20

    Environmental exposure measurements are, in general, positive and may be subject to left censoring; i.e., the measured value is less than a "detection limit". In occupational monitoring, strategies for assessing workplace exposures typically focus on the mean exposure level or the probability that any measurement exceeds a limit. Parametric methods used to determine acceptable levels of exposure are often based on a two-parameter lognormal distribution. The mean exposure level, an upper percentile, and the exceedance fraction are used to characterize exposure levels, and confidence limits are used to describe the uncertainty in these estimates. Statistical methods for random samples (without non-detects) from the lognormal distribution are well known for each of these situations. In this report, methods for estimating these quantities based on the maximum likelihood method for randomly left-censored lognormal data are described, and graphical methods are used to evaluate the lognormal assumption. If the lognormal model is in doubt and an alternative distribution for the exposure profile of a similar exposure group is not available, then nonparametric methods for left-censored data are used. The mean exposure level, along with the upper confidence limit, is obtained using the product limit estimate, and the upper confidence limit on an upper percentile (i.e., the upper tolerance limit) is obtained using a nonparametric approach. All of these methods are well known but computational complexity has limited their use in routine data analysis with left-censored data. The recent development of the R environment for statistical data analysis and graphics has greatly enhanced the availability of high-quality nonproprietary (open source) software that serves as the basis for implementing the methods in this paper.
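
    The maximum likelihood step treats each non-detect as contributing the probability of falling below its detection limit. A minimal sketch in Python (the report itself works in R), with made-up detects and detection limits:

```python
import numpy as np
from scipy import optimize, stats

def censored_lognormal_mle(detects, detection_limits):
    """ML estimates of (mu, sigma) on the log scale when non-detects are
    only known to fall below their detection limits (left censoring)."""
    logx, logdl = np.log(detects), np.log(detection_limits)

    def negloglik(theta):
        mu, log_sigma = theta
        sigma = np.exp(log_sigma)                      # keep sigma > 0
        ll = stats.norm.logpdf(logx, mu, sigma).sum()  # detected values
        ll += stats.norm.logcdf(logdl, mu, sigma).sum()  # non-detects
        return -ll

    res = optimize.minimize(negloglik,
                            x0=[logx.mean(), np.log(logx.std())],
                            method="Nelder-Mead")
    return res.x[0], np.exp(res.x[1])

detects = np.array([0.4, 0.7, 1.1, 2.3, 3.0, 5.6])   # measured exposures
dls = np.array([0.2, 0.2, 0.5])                      # limits of non-detects
mu, sigma = censored_lognormal_mle(detects, dls)
print(mu, sigma, "mean exposure:", np.exp(mu + sigma**2 / 2))
```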

  8. Critical Realism and Statistical Methods--A Response to Nash

    Science.gov (United States)

    Scott, David

    2007-01-01

    This article offers a defence of critical realism in the face of objections Nash (2005) makes to it in a recent edition of this journal. It is argued that critical and scientific realisms are closely related and that both are opposed to statistical positivism. However, the suggestion is made that scientific realism retains (from statistical…

  9. Statistical methods for decision making in mine action

    DEFF Research Database (Denmark)

    Larsen, Jan

    The design and evaluation of mine clearance equipment – the problem of reliability * Detection probability – tossing a coin * Requirements in mine action * Detection probability and confidence in MA * Using statistics in area reduction * Improving performance by information fusion and combination...
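
    A standard calculation behind "detection probability and confidence" is the binomial zero-failure bound: how many independent, successful detections are needed before one may claim a detection probability of at least p with confidence C. A minimal sketch; the 99.5%/95% figures are illustrative, not requirements quoted from mine-action standards.

```python
import math

def trials_needed(p_detect, confidence):
    """Number of consecutive successful detections needed to claim a
    detection probability of at least p_detect with the given confidence,
    assuming independent trials and zero misses (binomial zero-failure)."""
    return math.ceil(math.log(1 - confidence) / math.log(p_detect))

# e.g. to demonstrate p >= 0.995 at 95% confidence with no misses:
print(trials_needed(0.995, 0.95))   # -> 598 targets found, none missed
```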

  10. A Statistical Analysis of the Robustness of Alternate Genetic Coding Tables

    Directory of Open Access Journals (Sweden)

    Isil Aksan Kurnaz

    2008-05-01

    Full Text Available The rules that specify how the information contained in DNA is translated into amino acid “language” during protein synthesis are called “the genetic code”, commonly called the “Standard” or “Universal” Genetic Code Table. As a matter of fact, this coding table is not at all “universal”: in addition to different genetic code tables used by different organisms, even within the same organism the nuclear and mitochondrial genes may be subject to two different coding tables. Results: In an attempt to understand the advantages and disadvantages these coding tables may bring to an organism, we have decided to analyze various coding tables on genes subject to mutations, and have estimated how these genes “survive” over generations. We have used this as indicative of the “evolutionary” success of that particular coding table. We find that the “standard” genetic code is not actually the most robust of all coding tables, and interestingly, the Flatworm Mitochondrial Code (FMC) appears to be the highest ranking coding table given our assumptions. Conclusions: It is commonly hypothesized that the more robust a genetic code, the better suited it is for maintenance of the genome. Our study shows that, given the assumptions in our model, the Standard Genetic Code is quite poor when compared to other alternate code tables in terms of robustness. This brings about the question of why the Standard Code has been so widely accepted by a wider variety of organisms instead of FMC, which needs to be addressed for a thorough understanding of genetic code evolution.
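
    One simple robustness measure of this kind is the fraction of single-nucleotide codon mutations that leave the encoded residue unchanged. A minimal sketch using Biopython's NCBI codon tables; treating stop as a 21st symbol, using table 1 for the standard code, and using table 9 (echinoderm/flatworm mitochondrial) as a stand-in for the flatworm code are all assumptions of this sketch, and the paper's generational survival model is not reproduced.

```python
from Bio.Data import CodonTable   # pip install biopython

BASES = "TCAG"

def robustness(table_id):
    """Fraction of all single-nucleotide codon mutations that leave the
    encoded amino acid (or stop signal) unchanged for an NCBI table."""
    t = CodonTable.unambiguous_dna_by_id[table_id]
    aa = dict(t.forward_table)                 # sense codons -> amino acid
    aa.update({c: "*" for c in t.stop_codons})  # stop as a 21st symbol
    same = total = 0
    for codon, res in aa.items():
        for pos in range(3):
            for b in BASES:
                if b == codon[pos]:
                    continue
                mutant = codon[:pos] + b + codon[pos + 1:]
                total += 1
                same += (aa[mutant] == res)
    return same / total

print("standard code (table 1):", robustness(1))
print("flatworm mito (table 9):", robustness(9))
```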

  11. Some thoughts on alternative methods and their scientific implications

    NARCIS (Netherlands)

    Mansvelt, van J.D.

    1983-01-01

    Reflections upon our agricultural problems cannot be limited to those problems themselves, but should incorporate a reflection upon our social and scientific traditions. An alternative agriculture calls for participatory nature research.

  12. The Budget Scoring Alternatives Financing Methods for Defense Requirements

    Science.gov (United States)

    2007-04-30

    programs, the Department of Defense (DoD) must consider alternative forms of financing, including leases and public-private partnerships (PPPs), to... leases, share-in-savings contracts, and public-private partnerships (PPPs), have... to meet the requirements. Alternative Financing Agreements: Public-Private Partnerships. In August 2003, the Government Accountability Office (GAO...

  13. Alternative methods for the treatment of post-menopausal troubles [Alternative Methoden zur Behandlung postmenopausaler Beschwerden

    Directory of Open Access Journals (Sweden)

    Wasem, Jürgen

    2012-05-01

    Full Text Available [english] Menopause is described as the transition from the reproductive phase of a woman to the non-reproductive phase. Changes in hormone levels might lead to complaints and health consequences, especially during peri- and postmenopause. Hormone therapy has a potentially damaging health risk profile and is recommended only as a temporally limited therapy for acute vasomotor symptoms. The present HTA report aims to assess the effectiveness and the cost-effectiveness of alternative treatment methods for women with postmenopausal symptoms in Germany regarding patient-relevant endpoints (reduction of symptoms, frequency of adverse events, and improvement of quality of life). A systematic literature search was carried out in 33 relevant databases in September 2010. Citations were selected according to pre-defined criteria and were extracted and evaluated. In the systematic research 22 studies were identified for the effectiveness evaluation, 22 primary studies and one review. High doses of isolated genistein reduce the frequency/intensity of hot flashes, while low doses of genistein show no significant effect. Intake of isoflavone extracts such as genistein, daidzein and glycitein in various combinations does not have an effect on the improvement of cognitive function or vaginal dryness. The effect of black cohosh and hop extract on menopausal complaints cannot be determined since the results are heterogeneous. The combination of isoflavone, black cohosh, monk’s pepper, valerian and vitamin E has a positive effect on menopause symptoms. Ginkgo biloba shows no significant effect on menopause symptoms or on cognitive improvement besides mental flexibility. Acupuncture has a significant influence on hot flashes, especially in severe cases. No final statement can be drawn regarding the effectiveness of alternative treatment methods due to qualitative shortcomings of the included studies and a generally limited availability of studies in this field. Furthermore, the generalization of the

  14. InSAR Tropospheric Correction Methods: A Statistical Comparison over Different Regions

    Science.gov (United States)

    Bekaert, D. P.; Walters, R. J.; Wright, T. J.; Hooper, A. J.; Parker, D. J.

    2015-12-01

    Observing small-magnitude surface displacements through InSAR is highly challenging, and requires advanced correction techniques to reduce noise. In fact, one of the largest obstacles facing the InSAR community is tropospheric noise correction. Spatial and temporal variations in temperature, pressure, and relative humidity result in a spatially variable InSAR tropospheric signal, which masks smaller surface displacements due to tectonic or volcanic deformation. Correction methods applied today include those relying on weather model data, GNSS and/or spectrometer data. Unfortunately, these methods are often limited by the spatial and temporal resolution of the auxiliary data. Alternatively, a correction can be estimated from the high-resolution interferometric phase itself by assuming a linear or a power-law relationship between the phase and topography. For these methods, the challenge lies in separating deformation from tropospheric signals. We will present results of a statistical comparison of the state-of-the-art tropospheric corrections estimated from spectrometer products (MERIS and MODIS), a low and a high spatial-resolution weather model (ERA-I and WRF), and both the conventional linear and power-law empirical methods. We evaluate the correction capability over Southern Mexico, Italy, and El Hierro, and investigate the impact of increasing cloud cover on the accuracy of the tropospheric delay estimation. We find that each method has its strengths and weaknesses, and suggest that further developments should aim to combine different correction methods. All the presented methods are included in our new open-source software package TRAIN - Toolbox for Reducing Atmospheric InSAR Noise, which is available to the community (Bekaert, D., R. Walters, T. Wright, A. Hooper, and D. Parker, in review, Statistical comparison of InSAR tropospheric correction techniques, Remote Sensing of Environment).
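
    The simplest of the empirical corrections fits a single linear phase-topography coefficient and subtracts it. A minimal sketch on synthetic data; in practice the fit must exclude or jointly model deforming areas, and the power-law variant lets the coefficient vary spatially, neither of which is shown here.

```python
import numpy as np

def linear_tropo_correction(phase, height):
    """Estimate the linear phase/topography coefficient K on (assumed)
    non-deforming pixels and remove K*h + c from the unwrapped phase."""
    A = np.column_stack([height.ravel(), np.ones(height.size)])
    (K, c), *_ = np.linalg.lstsq(A, phase.ravel(), rcond=None)
    return phase - (K * height + c), K

rng = np.random.default_rng(2)
h = rng.uniform(0, 2000, (100, 100))              # DEM heights (m)
phi = 0.003 * h + rng.normal(0, 0.3, h.shape)     # stratified delay + noise
corrected, K = linear_tropo_correction(phi, h)
print("estimated K:", K)                          # ~0.003 by construction
```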

  15. Which alternative methods to the iridium gamma-graphy?; Quelles methodes alternatives a la gammagraphie a l'iridium?

    Energy Technology Data Exchange (ETDEWEB)

    Hatsch, J.; Chauveau, D.; Blettner, A. [Institut de Soudure, 93 - Villepinte (France)

    2009-05-15

    Gamma-graphy is a very widely used process for testing welded steel constructions, pipings, pressure vessels and frameworks, and particularly the welds in them. The major disadvantage of this NDT method lies in the risks due to ionizing radiation, which require setting up a safety perimeter; this constrains the owner or forces a shift system on the personnel, and thus entails heavy indirect expenditure. In addition, the recent French regulatory pressure concerning the transport, storage and management of radioactive sources makes their use still more complicated under industrial conditions and their deployment more and more expensive. It seems unlikely that a single NDT technique could substitute for gamma-graphy. Various alternative solutions are possible. Their deployment depends on the type of component to be inspected, the nature of the material, the type of weld (butt weld, nozzle), and the orientation and position of the defects to be detected, as well as their environment. This conference surveys the techniques liable to substitute for gamma-graphy, as well as their scope of application and the hindrances limiting their development. (authors)

  16. An Alternative Method for Identifying Interplanetary Magnetic Cloud Regions

    Science.gov (United States)

    Ojeda-Gonzalez, A.; Mendes, O.; Calzadilla, A.; Domingues, M. O.; Prestes, A.; Klausner, V.

    2017-03-01

    Spatio-temporal entropy (STE) analysis is used as an alternative mathematical tool to identify possible magnetic cloud (MC) candidates. We analyze Interplanetary Magnetic Field (IMF) data using a time interval of only 10 days. We select a convenient data interval of 2500 records, moving forward in steps of 200 records until the end of the time series. For every data segment, the STE is calculated at each step. During an MC event, the STE reaches values close to zero; this extremely low value of STE is due to features of the MC structure. However, not all of the magnetic components in MCs have STE values close to zero at the same time. For this reason, we create a standardization index, the so-called Interplanetary Entropy (IE) index. This index is a worthwhile effort to develop new tools to help diagnose ICME structures. The IE was calculated using a time window of one year (1999), and it has a success rate of 70% relative to other identifiers of MCs. The unsuccessful cases (30%) are caused by small and weak MCs. The results show that the IE methodology identified 9 of 13 MCs, and produced nine false alarms. In 1999, a total of 788 windows of 2500 values existed, meaning that the percentage of false alarms was 1.14%, which can be considered a good result. In addition, four time windows, each of 10 days, are studied, where the IE method was effective in finding MC candidates. As a novel result, two new MCs are identified in these time windows.
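
    The scanning logic is a sliding window that flags segments where an entropy measure collapses, as it does inside the smooth, ordered field rotation of a magnetic cloud. A minimal sketch; it uses a simple binned Shannon entropy as a stand-in for the STE proper (which is defined on spatio-temporal patterns), and the synthetic series and window sizes only mirror the 2500/200 scheme described above.

```python
import numpy as np

def shannon_entropy(x, edges):
    p, _ = np.histogram(x, bins=edges)
    p = p[p > 0] / p.sum()
    return -(p * np.log2(p)).sum()

def entropy_scan(series, window=2500, step=200, n_bins=16):
    """Slide a window through the series and compute a binned Shannon
    entropy per segment; fixed global bin edges make smooth, ordered
    (MC-like) intervals stand out as low-entropy."""
    edges = np.linspace(series.min(), series.max(), n_bins + 1)
    return np.array([(s, shannon_entropy(series[s:s + window], edges))
                     for s in range(0, len(series) - window + 1, step)])

rng = np.random.default_rng(3)
b = rng.normal(0, 5, 20000)                          # turbulent IMF proxy
b[8000:11000] = 12 * np.sin(np.linspace(0, np.pi, 3000))  # smooth rotation
scan = entropy_scan(b)
print(scan[scan[:, 1].argmin()])    # lowest-entropy window start index
```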

  17. Statistical methods for data analysis in particle physics

    CERN Document Server

    Lista, Luca

    2015-01-01

    This concise set of course-based notes provides the reader with the main concepts and tools to perform statistical analysis of experimental data, in particular in the field of high-energy physics (HEP). First, an introduction to probability theory and basic statistics is given, mainly as a reminder from advanced undergraduate studies, but also with a view to clearly distinguishing the Frequentist and Bayesian approaches and interpretations in subsequent applications. More advanced concepts and applications are gradually introduced, culminating in the chapter on upper limits, as many applications in HEP concern hypothesis testing, where often the main goal is to provide better and better limits so as to be able eventually to distinguish between competing hypotheses or to rule out some of them altogether. Many worked examples will help newcomers to the field and graduate students to understand the pitfalls in applying theoretical concepts to actual data.

  18. Statistical analysis of global surface air temperature and sea level using cointegration methods

    DEFF Research Database (Denmark)

    Schmith, Torben; Johansen, Søren; Thejll, Peter

    Global sea levels are rising, which is widely understood as a consequence of thermal expansion and melting of glaciers and land-based ice caps. Because physically based models are unable to simulate observed sea level trends, semi-empirical models have been applied as an alternative for projecting future sea levels. There are, however, potential pitfalls in this due to the trending nature of the time series. We apply a statistical method called cointegration analysis to observed global sea level and surface air temperature, capable of handling such peculiarities. We find a relationship between sea level and temperature and find that temperature causally depends on the sea level, which can be understood as a consequence of the large heat capacity of the ocean. We further find that the warming episode in the 1940s is exceptional in the sense that sea level and warming deviate from the expected...

  19. Statistical analysis of global surface temperature and sea level using cointegration methods

    DEFF Research Database (Denmark)

    Schmidt, Torben; Johansen, Søren; Thejll, Peter

    2012-01-01

    Global sea levels are rising, which is widely understood as a consequence of thermal expansion and melting of glaciers and land-based ice caps. Because present-day physically based climate models lack a representation of ice-sheet dynamics and are unable to simulate observed sea level trends, semi-empirical models have been applied as an alternative for projecting future sea levels. There are, however, potential pitfalls in this due to the trending nature of the time series. We apply a statistical method called cointegration analysis to observed global sea level and land-ocean surface air temperature, capable of handling such peculiarities. We find a relationship between sea level and temperature and find that temperature causally depends on the sea level, which can be understood as a consequence of the large heat capacity of the ocean. We further find that the warming episode in the 1940s...
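
    Cointegration tests ask whether two trending series share a common stochastic trend, so that a linear combination of them is stationary. A minimal sketch using the Engle-Granger two-step test from statsmodels on synthetic stand-in series; the studies above draw on likelihood-based (Johansen-type) cointegration methods, which are more involved than this illustration.

```python
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(4)
n = 150
common = np.cumsum(rng.normal(size=n))               # shared stochastic trend
temp = common + rng.normal(scale=0.5, size=n)        # stand-in temperature
sea = 0.8 * common + rng.normal(scale=0.5, size=n)   # stand-in sea level

t_stat, p_value, crit = coint(temp, sea)             # Engle-Granger test
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")        # small p => cointegrated
```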

  20. METHODOLOGICAL PRINCIPLES AND METHODS OF TERMS OF TRADE STATISTICAL EVALUATION

    Directory of Open Access Journals (Sweden)

    N. Kovtun

    2014-09-01

    Full Text Available The paper studies the methodological principles and guidelines for the statistical evaluation of terms of trade under the United Nations classification model – the Harmonized Commodity Description and Coding System (HS). The proposed three-stage model of index analysis and estimation of terms of trade is implemented in practice for Ukraine's commodity trade for the period 2011-2012.

  1. NordVal: A Nordic system for validation of alternative microbiological methods

    DEFF Research Database (Denmark)

    Qvist, Sven

    2007-01-01

    NordVal has validated and certified methods for meat intended for Finland and Sweden. NordVal has at present 25 validated and certified alternative microbiological methods registered. The methods comprise ELISA, PCR and DNA-based methods for Salmonella, Listeria and Campylobacter. Alternative culture-based methods...

  2. Effectiveness of Alternative Methods for Toothbrush Disinfection: An In Vitro Study

    Directory of Open Access Journals (Sweden)

    Ilkay Peker

    2014-01-01

    Full Text Available Objective. This study aimed to evaluate the effectiveness of alternative methods for toothbrush disinfection. Methods. Two hundred eighty toothbrushes were included in the study. The toothbrushes were divided into 7 groups and were contaminated by standardized suspensions of Lactobacillus rhamnosus (L. rhamnosus), Streptococcus mutans (S. mutans), Staphylococcus aureus (S. aureus), and Escherichia coli (E. coli). The following disinfectants were tested: 1% sodium hypochlorite (NaOCl), 100% and 50% white vinegar, a microwave (MW) oven, an ultraviolet (UV) sanitizer, and a mouth rinse containing propolis (MCP). Data were analyzed with Kruskal-Wallis and Dunn's tests. Results. Statistically significant differences were found between the different methods and the control group for all tested bacteria. There were statistically significant differences between all test groups for all microorganisms. MW was the most effective for L. rhamnosus, and 100% white vinegar was the most effective method for S. mutans and S. aureus. NaOCl was the most effective for E. coli. Conclusion. This study showed that 100% white vinegar was effective for the tested microorganisms. Similarly, 1% NaOCl is cost-effective, easily accessible, and comparatively effective for toothbrush disinfection. Because these agents are nontoxic, cost-effective and easily accessible, they may be appropriate for household use.
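
    The Kruskal-Wallis test used here is a rank-based one-way comparison across several independent groups, with Dunn's test as the pairwise follow-up. A minimal sketch on made-up residual colony counts (not the study's data):

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(5)
# hypothetical residual CFU counts after each disinfection method
naocl = rng.poisson(5, 20)
vinegar = rng.poisson(8, 20)
uv = rng.poisson(30, 20)
control = rng.poisson(200, 20)

h, p = kruskal(naocl, vinegar, uv, control)   # non-parametric one-way test
print(f"H = {h:.1f}, p = {p:.2e}")
# pairwise follow-up (Dunn's test) is available in the scikit-posthocs package
```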

  3. Statistical Diagnosis of the Best Weibull Methods for Wind Power Assessment for Agricultural Applications

    OpenAIRE

    Abul Kalam Azad; Mohammad Golam Rasul; Talal Yusaf

    2014-01-01

    The best Weibull distribution methods for the assessment of wind energy potential at different altitudes in desired locations are statistically diagnosed in this study. Seven different methods, namely graphical method (GM), method of moments (MOM), standard deviation method (STDM), maximum likelihood method (MLM), power density method (PDM), modified maximum likelihood method (MMLM) and equivalent energy method (EEM) were used to estimate the Weibull parameters and six statistical tools, name...
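
    Of the listed estimators, the method of moments and the maximum likelihood method are the easiest to sketch: MOM uses the sample mean and standard deviation with a common empirical approximation for the shape parameter, while MLM can be delegated to a library fit. A minimal sketch on synthetic wind speeds; the Justus approximation for k and the chosen true parameters are assumptions of this illustration.

```python
import numpy as np
from math import gamma
from scipy import stats

def weibull_mom(v):
    """Method-of-moments Weibull fit using the common empirical
    approximation k = (sigma/mean)^-1.086 (Justus et al.)."""
    k = (v.std(ddof=1) / v.mean()) ** (-1.086)
    c = v.mean() / gamma(1 + 1 / k)
    return k, c

rng = np.random.default_rng(6)
v = stats.weibull_min.rvs(2.0, scale=8.0, size=2000,
                          random_state=rng)          # synthetic wind speeds

print("MOM:", weibull_mom(v))
k_mle, _, c_mle = stats.weibull_min.fit(v, floc=0)   # maximum likelihood
print("MLM:", (k_mle, c_mle))
```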

  4. Statistical methods for segmentation and classification of images

    DEFF Research Database (Denmark)

    Rosholm, Anders

    1997-01-01

    The central matter of the present thesis is Bayesian statistical inference applied to classification of images. An initial review of Markov Random Fields relates to the modeling aspect of the indicated main subject. In that connection, emphasis is put on the relatively unknown sub-class of Pickard... with a Pickard Random Field modeling of a considered (categorical) image phenomenon. An extension of the fast PRF-based classification technique is presented. The modification introduces auto-correlation into the model of an involved noise process, which previously has been assumed independent. The suitability... of the extended model is documented by tests on controlled image data containing auto-correlated noise....

  5. Spatial Analysis Along Networks Statistical and Computational Methods

    CERN Document Server

    Okabe, Atsuyuki

    2012-01-01

    In the real world, there are numerous and various events that occur on and alongside networks, including the occurrence of traffic accidents on highways, the location of stores alongside roads, the incidence of crime on streets and the contamination along rivers. In order to carry out analyses of those events, the researcher needs to be familiar with a range of specific techniques. Spatial Analysis Along Networks provides a practical guide to the necessary statistical techniques and their computational implementation. Each chapter illustrates a specific technique, from Stochastic Point Process

  6. Convex Optimization Methods for Graphs and Statistical Modeling

    Science.gov (United States)

    2011-06-01

    requirements that the graph be triangle-free and square-free. Of course such graph reconstruction problems may be infeasible in general, as there may be... over C1, C2 is motivated by a similar procedure in statistics and signal processing, which goes by the name of "matched filtering." Of course other... h is the height of the cap over the equator. Via elementary trigonometry, the solid angle that K subtends is given by π/2 − sin⁻¹(h). Hence, if h(β

  7. Mathematical and statistical methods for actuarial sciences and finance

    CERN Document Server

    Pizzi, Claudio

    2014-01-01

    The interaction between mathematicians and statisticians has been shown to be an effective approach for dealing with actuarial, insurance and financial problems, both from an academic perspective and from an operative one. The collection of original papers presented in this volume pursues precisely this purpose. It covers a wide variety of subjects in actuarial, insurance and finance fields, all treated in the light of the successful cooperation between the above two quantitative approaches. The papers published in this volume present theoretical and methodological contributions and their applications to real contexts. With respect to the theoretical and methodological contributions, some of the considered areas of investigation are: actuarial models; alternative testing approaches; behavioral finance; clustering techniques; coherent and non-coherent risk measures; credit scoring approaches; data envelopment analysis; dynamic stochastic programming; financial contagion models; financial ratios; intelli...

  8. Alternative modeling methods for plasma-based Rf ion sources

    Energy Technology Data Exchange (ETDEWEB)

    Veitzer, Seth A., E-mail: veitzer@txcorp.com; Kundrapu, Madhusudhan, E-mail: madhusnk@txcorp.com; Stoltz, Peter H., E-mail: phstoltz@txcorp.com; Beckwith, Kristian R. C., E-mail: beckwith@txcorp.com [Tech-X Corporation, Boulder, Colorado 80303 (United States)

    2016-02-15

    Rf-driven ion sources for accelerators and many industrial applications benefit from detailed numerical modeling and simulation of plasma characteristics. For instance, modeling of the Spallation Neutron Source (SNS) internal antenna H{sup −} source has indicated that a large plasma velocity is induced near bends in the antenna where structural failures are often observed. This could lead to improved designs and ion source performance based on simulation and modeling. However, there are significant separations of time and spatial scales inherent to Rf-driven plasma ion sources, which makes it difficult to model ion sources with explicit, kinetic Particle-In-Cell (PIC) simulation codes. In particular, if both electron and ion motions are to be explicitly modeled, then the simulation time step must be very small, and total simulation times must be large enough to capture the evolution of the plasma ions, as well as extending over many Rf periods. Additional physics processes such as plasma chemistry and surface effects such as secondary electron emission increase the computational requirements in such a way that even fully parallel explicit PIC models cannot be used. One alternative method is to develop fluid-based codes coupled with electromagnetics in order to model ion sources. Time-domain fluid models can simulate plasma evolution, plasma chemistry, and surface physics models with reasonable computational resources by not explicitly resolving electron motions, which thereby leads to an increase in the time step. This is achieved by solving fluid motions coupled with electromagnetics using reduced-physics models, such as single-temperature magnetohydrodynamics (MHD); extended, gas-dynamic, and Hall MHD; and two-fluid MHD models. We show recent results on modeling the internal antenna H{sup −} ion source for the SNS at Oak Ridge National Laboratory using the fluid plasma modeling code USim. We demonstrate plasma temperature equilibration in two-temperature MHD models.

  9. Alternative modeling methods for plasma-based Rf ion sources

    Science.gov (United States)

    Veitzer, Seth A.; Kundrapu, Madhusudhan; Stoltz, Peter H.; Beckwith, Kristian R. C.

    2016-02-01

    Rf-driven ion sources for accelerators and many industrial applications benefit from detailed numerical modeling and simulation of plasma characteristics. For instance, modeling of the Spallation Neutron Source (SNS) internal antenna H- source has indicated that a large plasma velocity is induced near bends in the antenna where structural failures are often observed. This could lead to improved designs and ion source performance based on simulation and modeling. However, there are significant separations of time and spatial scales inherent to Rf-driven plasma ion sources, which makes it difficult to model ion sources with explicit, kinetic Particle-In-Cell (PIC) simulation codes. In particular, if both electron and ion motions are to be explicitly modeled, then the simulation time step must be very small, and total simulation times must be large enough to capture the evolution of the plasma ions, as well as extending over many Rf periods. Additional physics processes such as plasma chemistry and surface effects such as secondary electron emission increase the computational requirements in such a way that even fully parallel explicit PIC models cannot be used. One alternative method is to develop fluid-based codes coupled with electromagnetics in order to model ion sources. Time-domain fluid models can simulate plasma evolution, plasma chemistry, and surface physics models with reasonable computational resources by not explicitly resolving electron motions, which thereby leads to an increase in the time step. This is achieved by solving fluid motions coupled with electromagnetics using reduced-physics models, such as single-temperature magnetohydrodynamics (MHD); extended, gas-dynamic, and Hall MHD; and two-fluid MHD models. We show recent results on modeling the internal antenna H- ion source for the SNS at Oak Ridge National Laboratory using the fluid plasma modeling code USim. We demonstrate plasma temperature equilibration in two-temperature MHD models.

  10. Monitoring Method of Cow Anthrax Based on Gis and Spatial Statistical Analysis

    Science.gov (United States)

    Li, Lin; Yang, Yong; Wang, Hongbin; Dong, Jing; Zhao, Yujun; He, Jianbin; Fan, Honggang

    Geographic information system (GIS) is a computer application system which possesses the ability to manipulate spatial information and has been used in many fields related to spatial information management. Many methods and models have been established for analyzing animal disease distributions and temporal-spatial transmission models, and great benefits have been gained from the application of GIS in animal disease epidemiology; GIS is now a very important tool in animal disease epidemiological research. The spatial analysis function of GIS can be widened and strengthened by using spatial statistical analysis, allowing for the deeper exploration, analysis, manipulation and interpretation of the spatial pattern and spatial correlation of animal disease. In this paper, we analyzed the spatial distribution characteristics of cow anthrax in the target district A (due to the confidentiality of the epidemic data we call it district A), based on the established GIS of cow anthrax in this district, in combination with spatial statistical analysis. Cow anthrax is a biogeochemical disease whose geographical distribution is related closely to the environmental factors of habitats and has certain spatial characteristics; correct analysis of the spatial distribution of cow anthrax therefore plays a very important role in monitoring, prevention and control. However, the application of classic statistical methods in some areas is very difficult because of the pastoral nomadic context: the high mobility of livestock and the lack of suitable sampling frames are among the difficulties that currently make it nearly impossible to apply rigorous random sampling methods. It is thus necessary to develop an alternative sampling method which can overcome the lack of sampling and meet the requirements for randomness. The GIS computer application software ArcGIS9.1 was used to overcome the lack of data on sampling sites. Using ArcGIS 9.1 and GEODA
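
    A typical first step in such a spatial statistical analysis is a global spatial autocorrelation statistic such as Moran's I, of the kind reported by tools like GeoDa. A minimal sketch of the computation itself, on a made-up five-site neighbourhood structure rather than the confidential anthrax data:

```python
import numpy as np

def morans_i(values, W):
    """Global Moran's I for spatial autocorrelation: values at n sites,
    W an n x n spatial-weights matrix with zero diagonal."""
    z = values - values.mean()
    return (len(values) / W.sum()) * (z @ W @ z) / (z @ z)

# toy example: 5 sites on a line, adjacent sites are neighbours
cases = np.array([12.0, 10.0, 9.0, 2.0, 1.0])   # hypothetical case counts
W = np.array([[0, 1, 0, 0, 0],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
print("Moran's I =", round(morans_i(cases, W), 3))   # > 0 => clustering
```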

  11. Methods in probability and statistical inference. Progress report, June 1975--June 14, 1976. [Dept. of Statistics, Univ. of Chicago

    Energy Technology Data Exchange (ETDEWEB)

    Perlman, M D

    1976-03-01

    Efficient methods for approximating percentage points of the largest characteristic root of a Wishart matrix, and other statistical quantities of interest, were developed. Fitting of non-additive models to two-way and higher-way tables and the further development of the SNAP statistical computing system were reported. Numerical procedures for computing boundary-crossing probabilities for Brownian motion and other stochastic processes, such as Bessel diffusions, were implemented. Mathematical techniques from statistical mechanics were applied to obtain a unified treatment of probabilities of large deviations of the sample, in the setting of general topological vector spaces. The application of the Martin boundary to questions about infinite particle systems was studied. A comparative study of classical "omnibus" and Bayes procedures for combining several independent noncentral chi-square test statistics was completed. Work proceeds on the related problem of combining noncentral F-tests. A numerical study of the small-sample powers of the Pearson chi-square and likelihood ratio tests for multinomial goodness-of-fit was made. The relationship between asymptotic (large sample) efficiency of test statistics, as measured by Bahadur's concept of exact slope, and actual small-sample efficiency was studied. A promising new technique for the simultaneous estimation of all correlation coefficients in a multivariate population was developed. The method adapts the James-Stein "shrinking" estimator (for location parameters) to the estimation of correlations.
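
    The James-Stein estimator mentioned at the end shrinks a vector of estimates toward a common point and, for three or more means, beats the raw estimates in total squared error. A minimal sketch of the positive-part version for location parameters with known unit variance; the report's adaptation to correlation coefficients is more involved and not reproduced here.

```python
import numpy as np

def james_stein(x, sigma2=1.0):
    """Positive-part James-Stein shrinkage of a p-dimensional observation
    toward zero (dominates the MLE for p >= 3 under squared-error loss)."""
    p = len(x)
    shrink = max(0.0, 1.0 - (p - 2) * sigma2 / np.dot(x, x))
    return shrink * x

rng = np.random.default_rng(7)
theta = rng.normal(0, 1, 10)            # true means
x = theta + rng.normal(0, 1, 10)        # one noisy observation per mean
print("MLE error:", np.sum((x - theta) ** 2))
print("JS  error:", np.sum((james_stein(x) - theta) ** 2))  # usually smaller
```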

  12. M&M's "The Method," and Other Ideas about Teaching Elementary Statistics.

    Science.gov (United States)

    May, E. Lee Jr.

    2000-01-01

    Consists of a collection of observations about the teaching of the first course in elementary probability and statistics offered by many colleges and universities. Highlights the Goldberg Method for solving problems in probability and statistics. (Author/ASK)

  13. Modification of codes NUALGAM and BREMRAD. Volume 3: Statistical considerations of the Monte Carlo method

    Science.gov (United States)

    Firstenberg, H.

    1971-01-01

    The statistics of the Monte Carlo method are considered relative to the interpretation of the NUGAM2 and NUGAM3 computer code results. A numerical experiment using the NUGAM2 code is presented and the results are statistically interpreted.
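
    The statistical interpretation of any Monte Carlo result comes down to attaching a standard error that shrinks like 1/sqrt(N). A minimal, generic sketch estimating pi by rejection sampling, unrelated to the NUGAM codes themselves:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 100_000
hits = (rng.random(n) ** 2 + rng.random(n) ** 2 <= 1.0)  # quarter-circle test

est = 4 * hits.mean()                            # Monte Carlo estimate of pi
se = 4 * hits.std(ddof=1) / np.sqrt(n)           # Monte Carlo standard error
print(f"pi ~ {est:.4f} +/- {1.96 * se:.4f} (95% CI)")
```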

  14. Introducing Students to the Application of Statistics and Investigative Methods in Political Science

    Science.gov (United States)

    Wells, Dominic D.; Nemire, Nathan A.

    2017-01-01

    This exercise introduces students to the application of statistics and its investigative methods in political science. It helps students gain a better understanding and a greater appreciation of statistics through a real world application.

  15. Comparison of alternative improved perturbative methods for nonlinear oscillations

    Energy Technology Data Exchange (ETDEWEB)

    Amore, Paolo [Facultad de Ciencias, Universidad de Colima, Bernal Diaz del Castillo 340, Colima (Mexico)]. E-mail: paolo@ucol.mx; Raya, Alfredo [Facultad de Ciencias, Universidad de Colima, Bernal Diaz del Castillo 340, Colima (Mexico); Fernandez, Francisco M. [INIFTA (Conicet, UNLP), Diag. 113 y 64 S/N, Sucursal 4, Casilla de Correo 16, 1900 La Plata (Argentina)

    2005-06-06

    We discuss and compare two alternative perturbation approaches for the calculation of the period of nonlinear systems based on the Lindstedt-Poincare technique. As illustrative examples we choose one-dimensional anharmonic oscillators and the Van der Pol equation. Our results show that each approach is better for just one type of model considered here.
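
    As a concrete instance of what such Lindstedt-Poincare calculations produce, the first-order result for the Duffing oscillator (a standard textbook case, not a result quoted from this paper) reads:

```latex
% Duffing oscillator \ddot{x} + \omega_0^2 x + \epsilon x^3 = 0, amplitude a:
\omega = \omega_0 \left( 1 + \frac{3\,\epsilon\, a^2}{8\,\omega_0^2} \right)
         + O(\epsilon^2),
\qquad T = \frac{2\pi}{\omega}.
```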

  16. Alternatives in Medical Education: Non-Animal Methods.

    Science.gov (United States)

    Carlson, Peggy, Ed.

    The technology explosion in medical education has led to the use of computer models, videotapes, interactive videos, and state-of-the-art simulators in medical training. This booklet describes alternatives to using animals in medical education. Although it is mainly intended to describe products applicable to medical school courses, high-quality,…

  17. Exploration of Alternative Approaches for Estimation of Single Sensor Error Statistics Using the MODIS Aqua Matchup Database

    Science.gov (United States)

    Kilpatrick, K. A.; Podesta, G. P.; Kumar, C.; Minnett, P. J.; Williams, E.; Walsh, S.

    2016-12-01

    Sea surface temperature (SST) is a fundamental quantity for understanding weather and climate dynamics. Modern ocean observing systems monitor SST using multiple platforms and instruments, including satellite-borne sensors. The integration of observations from multiple sources, however, requires that SSTs from each instrument or measurement system have associated estimates of systematic errors (bias) and variability (dispersion). Guidance on how to derive meaningful error properties, however, is still being developed. A single pair of bias and dispersion values for the entire globe is not adequate, as satellite SSTs clearly show error patterns that vary in space and time, and that may even partially cancel each other when an overall statistic is calculated. Alternatively, a "hypercube" approach was proposed for SST retrievals from the MODIS infrared radiometers on the Terra and Aqua EOS satellites. Retrieval uncertainty was estimated separately for "hypercube bins" defined by the combination of season, latitude, viewing geometry, surface temperature, and "wet" or "dry" atmospheres. A disadvantage was the appearance of obvious spatial discontinuities in mapped uncertainty fields. Recently, Petrenko et al. (J. of Atmospheric & Oceanic Technology 33: 345-358, 2016) proposed an alternative approach, in which SST retrieval errors are instead segmented based on the values of regressors, i.e., the terms (excluding offsets) in the statistical algorithm used to estimate SST. This approach sought to characterize SST errors with a limited number of arguments, regardless of how many physical variables influence the values of algorithm terms. Using co-located MODIS-Aqua observations and in situ SST measurements in the MODIS Matchup Database (2002 to mid-2016) we explore both segmentation approaches. An initial finding is that, in both approaches, only a small portion of the multivariate space is occupied by MODIS matchups determined to be cloud-free. Another finding was that it
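
    Either segmentation scheme reduces, computationally, to grouping matchups into bins and summarizing satellite-minus-in-situ differences per bin. A minimal sketch on synthetic matchups with two illustrative binning variables; real hypercubes use season, latitude, viewing geometry and more.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(9)
n = 10_000
df = pd.DataFrame({
    "sst_sat": rng.normal(20, 5, n),       # retrieved SST (deg C)
    "sat_zen": rng.uniform(0, 60, n),      # satellite zenith angle (deg)
})
df["sst_insitu"] = df["sst_sat"] + rng.normal(-0.1, 0.4, n)  # synthetic truth
df["diff"] = df["sst_sat"] - df["sst_insitu"]

# segment matchups into bins (here: viewing geometry x surface temperature)
df["zen_bin"] = pd.cut(df["sat_zen"], [0, 30, 60])
df["sst_bin"] = pd.cut(df["sst_sat"], [-2, 10, 20, 35])

summary = df.groupby(["zen_bin", "sst_bin"], observed=True)["diff"].agg(
    bias="mean", dispersion="std", n="count")
print(summary)
```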

  18. An Optimization Method for Simulator Using Probability Statistic Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    An optimization method is presented that can be easily applied in a retargetable simulator. The essence of the method is to reduce the redundancy in operation-code encoding that arises from the varying execution frequencies of instructions. By recoding the operation codes in the loading part of the simulator, the number of bit comparisons needed to identify an instruction is reduced, and the performance of the simulator is thereby improved. Theoretical analysis and experimental results both confirm the validity of the method.
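
    The paper's exact recoding scheme is only sketched in the abstract; a natural instance of frequency-based recoding is Huffman coding, where frequent opcodes receive short codes so that, on average, fewer bit comparisons are needed to identify an instruction. A minimal sketch with a made-up instruction-frequency profile:

```python
import heapq
from collections import Counter

def huffman_codes(freqs):
    """Assign shorter bit codes to more frequent opcodes (Huffman coding)."""
    heap = [[w, [sym, ""]] for sym, w in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)          # two least frequent subtrees
        hi = heapq.heappop(heap)
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]       # prepend bit on each side
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return dict(heapq.heappop(heap)[1:])

# hypothetical dynamic instruction frequencies from a profiling run
trace = ["load"] * 40 + ["add"] * 25 + ["store"] * 20 + \
        ["branch"] * 10 + ["mul"] * 5
print(huffman_codes(Counter(trace)))      # "load" receives the shortest code
```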

  19. Statistical Methods and Tools for Hanford Staged Feed Tank Sampling

    Energy Technology Data Exchange (ETDEWEB)

    Fountain, Matthew S.; Brigantic, Robert T.; Peterson, Reid A.

    2013-10-01

    This report summarizes work conducted by Pacific Northwest National Laboratory to technically evaluate the current approach to staged feed sampling of high-level waste (HLW) sludge to meet waste acceptance criteria (WAC) for transfer from tank farms to the Hanford Waste Treatment and Immobilization Plant (WTP). The current sampling and analysis approach is detailed in the document titled Initial Data Quality Objectives for WTP Feed Acceptance Criteria, 24590-WTP-RPT-MGT-11-014, Revision 0 (Arakali et al. 2011). The goal of this current work is to evaluate and provide recommendations to support a defensible, technical and statistical basis for the staged feed sampling approach that meets WAC data quality objectives (DQOs).

  20. Dragon-kings: mechanisms, statistical methods and empirical evidence

    CERN Document Server

    Sornette, D; 10.1140/epjst/e2012-01559-5

    2012-01-01

    This introductory article presents the special Discussion and Debate volume "From black swans to dragon-kings, is there life beyond power laws?" published in Eur. Phys. J. Special Topics in May 2012. We summarize and put in perspective the contributions under three main themes: (i) mechanisms for dragon-kings, (ii) detection of dragon-kings and statistical tests and (iii) empirical evidence in a large variety of natural and social systems. Overall, we are pleased to witness significant advances both in the introduction and clarification of underlying mechanisms and in the development of novel efficient tests that demonstrate clear evidence for the presence of dragon-kings in many systems. However, this positive view should be balanced by the fact that this remains a very delicate and difficult field, if only due to the scarcity of data as well as the extraordinarily important implications with respect to hazard assessment, risk control and predictability.

  1. Analogue Correction Method of Errors by Combining Statistical and Dynamical Methods

    Institute of Scientific and Technical Information of China (English)

    REN Hongli; CHOU Jifan

    2006-01-01

    Based on the atmospheric analogy principle, the inverse problem of using the information in historical analogue data to estimate model errors is put forward, and a method of analogue correction of errors (ACE) is developed in this paper. The ACE effectively combines statistical and dynamical methods and does not require changing the current numerical prediction models. The new method not only makes full use of dynamical achievements but can also reasonably absorb the information of a great many analogues in historical data, in order to reduce model errors and improve forecast skill. Furthermore, the ACE may identify specific historical data for the solution of the inverse problem in terms of the particularity of the current forecast. Qualitative analyses show that the ACE is theoretically equivalent to the principle of the previous analogue-dynamical model, but does not require rebuilding the complicated analogue-deviation model, and so is more feasible and has better operational prospects. Moreover, under ideal situations, when the numerical models or the historical analogues are perfect, the ACE forecast would reduce to a purely dynamical or purely statistical forecast, respectively.

  2. Alternative remedies for insomnia: a proposed method for personalized therapeutic trials

    Directory of Open Access Journals (Sweden)

    Romero K

    2017-03-01

    Full Text Available Insomnia is a common symptom, with chronic insomnia being diagnosed in 5-10% of adults. Although many insomnia patients use prescription therapy for insomnia, the health benefits remain uncertain and adverse risks remain a concern. While similar effectiveness and risk concerns exist for herbal remedies, many individuals turn to such alternatives to prescriptions for insomnia. Like prescription hypnotics, herbal remedies that have undergone clinical testing often show subjective sleep improvements that exceed objective measures, which may relate to interindividual heterogeneity and/or placebo effects. Response heterogeneity can undermine traditional randomized trial approaches, which in some fields has prompted a shift toward stratified trials based on genotype or phenotype, or the so-called n-of-1 method of testing placebo versus active drug in within-person alternating blocks. We reviewed six independent compendiums of herbal agents to assemble a group of over 70 reported to benefit sleep. To bridge the gap between the unfeasible expectation of formal evidence in this space and the reality of common self-medication by those with insomnia, we propose a method for guided self-testing that overcomes certain operational barriers related to inter- and intraindividual sources of phenotypic variability. Patient-chosen outcomes drive a general statistical model that allows personalized self-assessment that can augment the open-label nature of routine practice. The potential advantages of this method include the flexibility to implement it for other (nonherbal) insomnia interventions. Keywords: insomnia, over the counter, alternative remedy, herbal, supplement
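
    The statistics of an n-of-1 design can be kept simple: with alternating active/placebo blocks, a within-person sign-flip permutation test on the paired block differences gives an exact p-value. A minimal sketch with made-up sleep-quality scores; the authors' actual model is more general (patient-chosen outcomes), and this is only one way to analyze such blocks.

```python
import numpy as np
from itertools import product

def n_of_1_test(active, placebo):
    """Exact within-person sign-flip permutation test on paired
    block differences from an alternating AB design."""
    d = np.asarray(active) - np.asarray(placebo)
    obs = d.mean()
    signs = np.array(list(product([1, -1], repeat=len(d))))  # all sign flips
    null = (signs * d).mean(axis=1)
    return (np.abs(null) >= abs(obs)).mean()   # two-sided exact p-value

# hypothetical self-rated sleep quality (0-10) over 6 paired AB blocks
active = [7.1, 6.8, 7.4, 6.9, 7.8, 7.0]
placebo = [6.2, 6.9, 6.5, 6.0, 6.8, 6.4]
print("p =", n_of_1_test(active, placebo))
```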

  3. Statistical methods for damage detection applied to civil structures

    DEFF Research Database (Denmark)

    Gres, Szymon; Ulriksen, Martin Dalgaard; Döhler, Michael

    2017-01-01

    ... the performance of the two damage detection methods is similar, thereby implying merit of the new Mahalanobis distance-based approach, as it is less computationally complex. The fusion of the damage indicators in the control chart provides the most accurate view of the progressively damaged systems. ... and compared to the well-known subspace-based damage detection algorithm in the context of two large case studies. Both methods are implemented in the modal analysis and structural health monitoring software ARTeMIS, in which the joint features of the methods are combined in a control chart in an attempt...

  4. Statistics in science the foundations of statistical methods in biology, physics and economics

    CERN Document Server

    Costantini, Domenico

    1990-01-01

    An inference may be defined as a passage of thought according to some method. In the theory of knowledge it is customary to distinguish deductive and non-deductive inferences. Deductive inferences are truth preserving, that is, the truth of the premises is preserved in the conclusion. As a result, the conclusion of a deductive inference is already 'contained' in the premises, although we may not know this fact until the inference is performed. Standard examples of deductive inferences are taken from logic and mathematics. Non-deductive inferences need not preserve truth, that is, 'thought may pass' from true premises to false conclusions. Such inferences can be expansive, or ampliative, in the sense that the performance of such inferences actually increases our putative knowledge. Standard non-deductive inferences do not really exist, but one may think of elementary inductive inferences in which conclusions regarding the future are drawn from knowledge of the past. Since the body of scientific knowledge i...

  5. GROUNDWATER MONITORING: Statistical Methods for Testing Special Background Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Chou, Charissa J.

    2004-04-28

    This chapter illustrates application of a powerful intra-well testing method referred to as the combined Shewhart-CUSUM control chart approach, which can detect abrupt and gradual changes in groundwater parameter concentrations. This method is broadly applicable to groundwater monitoring situations where there is no clearly defined upgradient well or wells, where spatial variability exists in parameter concentrations, or when the groundwater flow rate is extremely slow. Procedures for determining the minimum time needed to acquire independent groundwater samples and useful transformations for obtaining normally distributed data are also provided. The control chart method will be insensitive to real changes if a preexisting trend is observed in the background data set. A method and a case study describing how a trend observed in a background data set can be removed using a transformation suggested by Gibbons (1994) are presented to illustrate treatment of a preexisting trend.
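
    A minimal sketch of a combined Shewhart-CUSUM intra-well test of the kind described above; the control limits (SCL = 4.5, k = 1, h = 5) are values commonly quoted for groundwater monitoring (e.g., in Gibbons, 1994), and the data are invented, so treat all numbers as assumptions.

        import numpy as np

        def shewhart_cusum(new_obs, bg_mean, bg_sd, scl=4.5, k=1.0, h=5.0):
            """Flag abrupt (Shewhart) and gradual (CUSUM) increases over background."""
            s = 0.0
            flags = []
            for x in new_obs:
                z = (x - bg_mean) / bg_sd       # standardize against background
                s = max(0.0, z - k + s)         # one-sided upper CUSUM
                flags.append((z > scl, s > h))  # (Shewhart exceedance, CUSUM exceedance)
            return flags

        # Background statistics from eight independent samples (invented data).
        background = np.array([3.1, 2.8, 3.4, 3.0, 2.9, 3.2, 3.3, 2.7])
        print(shewhart_cusum([3.5, 4.1, 4.8, 5.6],
                             background.mean(), background.std(ddof=1)))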

  6. Climate time series analysis: classical statistical and bootstrap methods

    CERN Document Server

    Mudelsee, Manfred

    2014-01-01

    Written for climatologists and applied statisticians, this book explains the bootstrap algorithms (including novel adaptations) and methods for confidence interval construction. The accuracy of the algorithms is tested by means of Monte Carlo experiments.
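
    A toy moving-block bootstrap confidence interval for the mean of an autocorrelated series, illustrating the class of algorithms the book covers; the series, block length, and interval type are arbitrary choices for this sketch.

        import numpy as np

        rng = np.random.default_rng(0)

        # Fake AR(1) "climate" series with lag-1 autocorrelation 0.6 around 15.0.
        n = 200
        series = np.empty(n)
        series[0] = rng.normal()
        for t in range(1, n):
            series[t] = 0.6 * series[t - 1] + rng.normal()
        series += 15.0

        def moving_block_bootstrap_mean(x, block_len=10, n_boot=2000):
            m = len(x)
            starts = np.arange(m - block_len + 1)
            n_blocks = int(np.ceil(m / block_len))
            means = np.empty(n_boot)
            for b in range(n_boot):
                idx = rng.choice(starts, size=n_blocks)  # sample block start points
                resample = np.concatenate([x[s:s + block_len] for s in idx])[:m]
                means[b] = resample.mean()
            return means

        lo, hi = np.percentile(moving_block_bootstrap_mean(series), [2.5, 97.5])
        print(f"95% percentile bootstrap CI for the mean: [{lo:.2f}, {hi:.2f}]")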

  7. Statistical evaluation of texture analysis from the biocrystallization method

    OpenAIRE

    Meelursarn, Aumaporn

    2007-01-01

    The consumers are becoming more concerned about food quality, especially regarding how, when and where the foods are produced (Haglund et al., 1999; Kahl et al., 2004; Alföldi, et al., 2006). Therefore, during recent years there has been a growing interest in the methods for food quality assessment, especially in the picture-development methods as a complement to traditional chemical analysis of single compounds (Kahl et al., 2006). The biocrystallization as one of the picture-developin...

  8. Statistical Methods for Predicting Malaria Incidences Using Data from Sudan

    Science.gov (United States)

    Awadalla, Khidir E.

    2017-01-01

    Malaria is the leading cause of illness and death in Sudan. The entire population is at risk of malaria epidemics, with a very high burden on the government and population. Reliable forecasting methods for predicting the number of future incidences are needed to motivate the development of a system that can predict future incidences. The objective of this paper is to develop applicable and understandable time series models and to find out which method provides better performance in predicting future incidence levels. We used monthly incidence data collected from five states in Sudan with unstable malaria transmission. We tested four forecasting methods: (1) autoregressive integrated moving average (ARIMA); (2) exponential smoothing; (3) transformation model; and (4) moving average. The results showed that the transformation method performed significantly better than the other methods for Gadaref, Gazira, North Kordofan, and Northern, while the moving average model performed significantly better for Khartoum. Future research should combine a number of different and dissimilar time series methods to improve forecast accuracy, with the ultimate aim of developing a simple and useful model for producing reasonably reliable forecasts of malaria incidence in the study area.
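
    The sketch below contrasts two of the forecast families named in the abstract, ARIMA and a simple moving average, on a synthetic monthly incidence series; the data, ARIMA order, and error metric are assumptions for illustration, not the paper's fitted models.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(1)
        months = np.arange(60)
        incidence = 100 + 20 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 5, 60)
        train, test = incidence[:48], incidence[48:]

        # ARIMA forecast (the order is an arbitrary choice for this sketch).
        arima_fc = ARIMA(train, order=(1, 0, 1)).fit().forecast(steps=len(test))

        # Naive moving-average forecast: mean of the last 12 months, held flat.
        ma_fc = np.full(len(test), train[-12:].mean())

        for name, fc in (("ARIMA", arima_fc), ("moving average", ma_fc)):
            print(f"{name}: MAE = {np.mean(np.abs(fc - test)):.1f}")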

  9. Alternative and Efficient Extraction Methods for Marine-Derived Compounds

    OpenAIRE

    Clara Grosso; Patrícia Valentão; Federico Ferreres; Paula B. Andrade

    2015-01-01

    Marine ecosystems cover more than 70% of the globe’s surface. These habitats are occupied by a great diversity of marine organisms that produce highly structural diverse metabolites as a defense mechanism. In the last decades, these metabolites have been extracted and isolated in order to test them in different bioassays and assess their potential to fight human diseases. Since traditional extraction techniques are both solvent- and time-consuming, this review emphasizes alternative extracti...

  10. Alternative Methods for Field Corrections in Helical Solenoids

    Energy Technology Data Exchange (ETDEWEB)

    Lopes, M. L. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Krave, S. T. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Tompkins, J. C. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Yonehara, K. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Flanagan, G. [Muons Inc., Batavia, IL (United States); Kahn, S. A. [Muons Inc., Batavia, IL (United States); Melconian, K. [Texas A & M Univ., College Station, TX (United States)

    2015-05-01

    Helical cooling channels have been proposed for highly efficient 6D muon cooling. Helical solenoids produce solenoidal, helical dipole, and helical gradient field components. Previous studies explored the geometric tunability limits on these main field components. In this paper we present two alternative correction schemes, tilting the solenoids and the addition of helical lines, to reduce the required strength of the anti-solenoid and add an additional tuning knob.

  11. Alternative Methods of Collective Disputes Resolution in the Czech Republic

    Directory of Open Access Journals (Sweden)

    Hamuľáková Klára

    2016-10-01

    Full Text Available On 11 June 2013, the Commission issued the Recommendation on common principles for injunctive and compensatory collective redress mechanisms in the Member States concerning the violations of rights granted under Union law. The main areas where private enforcement of rights granted under Union law in the form of collective redress is of value are consumer protection, competition, environment protection, protection of personal data, financial services legislation and protection of investments. Point 13 of the Recommendation concurrently emphasises that the principles it puts forward relate both to judicial and out-of-court collective redress. The Member States should ensure that judicial collective redress mechanisms are accompanied by appropriate means of collective alternative dispute resolution available to the parties before and throughout the litigation. Point 25 et seq. of the Recommendation then contains special regulations concerning collective alternative dispute resolution and settlements. The purpose of this article is to evaluate if the current legislation on alternative dispute resolution in the Czech Republic meets the principles encompassed in the Recommendation or if radical legal changes need to be adopted.

  12. Data Analysis & Statistical Methods for Command File Errors

    Science.gov (United States)

    Meshkat, Leila; Waggoner, Bruce; Bryant, Larry

    2014-01-01

    This paper explains current work on modeling for managing the risk of command file errors. It is focused on analyzing actual data from a JPL spaceflight mission to build models for evaluating and predicting error rates as a function of several key variables. We constructed a rich dataset by considering the number of errors and the number of files radiated, including the number of commands and blocks in each file, as well as subjective estimates of workload and operational novelty. We have assessed these data using different curve-fitting and distribution-fitting techniques, such as multiple regression analysis and maximum likelihood estimation, to see how much of the variability in the error rates can be explained by these. We have also used goodness-of-fit testing strategies and principal component analysis to further assess our data. Finally, we constructed a model of expected error rates based on what these statistics bore out as critical drivers of the error rate. This model allows project management to evaluate the error rate against a theoretically expected rate as well as anticipate future error rates.
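
    One plausible reading of the modeling step described above is a count regression of errors on workload and novelty, with files radiated as an exposure term; the variables and data below are hypothetical stand-ins for the (non-public) JPL dataset.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 40
        workload = rng.uniform(1, 10, n)   # subjective workload estimate
        novelty = rng.uniform(0, 1, n)     # operational novelty score
        files = rng.integers(20, 200, n)   # files radiated (exposure)
        errors = rng.poisson(0.01 * files * np.exp(0.1 * workload + 0.5 * novelty))

        X = sm.add_constant(np.column_stack([workload, novelty]))
        fit = sm.GLM(errors, X, family=sm.families.Poisson(),
                     exposure=files).fit()  # models the error *rate* per file
        print(fit.summary())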

  13. Non-Statistical Methods of Analysing of Bankruptcy Risk

    Directory of Open Access Journals (Sweden)

    Pisula Tomasz

    2015-06-01

    Full Text Available The article focuses on assessing the effectiveness of a non-statistical approach to bankruptcy modelling in enterprises operating in the logistics sector. In order to describe the issue more comprehensively, the aforementioned prediction of the possible negative results of business operations was carried out for companies functioning in the Polish region of Podkarpacie, and in Slovakia. The bankruptcy predictors selected for the assessment of companies operating in the logistics sector included 28 financial indicators characterizing these enterprises in terms of their financial standing and management effectiveness. The purpose of the study was to identify factors (models) describing the bankruptcy risk in enterprises in the context of their forecasting effectiveness in a one-year and two-year time horizon. In order to assess their practical applicability, the models were carefully analysed and validated. The usefulness of the models was assessed in terms of their classification properties, the capacity to accurately identify enterprises at risk of bankruptcy and healthy companies, and proper calibration of the models to the data from training sample sets.

  14. Comparison of Statistical Methods for Detector Testing Programs

    Energy Technology Data Exchange (ETDEWEB)

    Rennie, John Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Abhold, Mark [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-10-14

    A typical goal for any detector testing program is to ascertain not only the performance of the detector systems under test, but also the confidence that systems accepted using that testing program’s acceptance criteria will exceed a minimum acceptable performance (which is usually expressed as the minimum acceptable success probability, p). A similar problem often arises in statistics, where we would like to ascertain the fraction, p, of a population of items that possess a property that may take one of two possible values. Typically, the problem is approached by drawing a fixed sample of size n, with the number of items out of n that possess the desired property, x, being termed successes. The sample mean gives an estimate of the population mean p ≈ x/n, although usually it is desirable to accompany such an estimate with a statement concerning the range within which p may fall and the confidence associated with that range. Procedures for establishing such ranges and confidence limits are described in detail by Clopper, Brown, and Agresti for two-sided symmetric confidence intervals.
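
    The Clopper-Pearson ("exact") interval referred to above can be computed directly from its beta-distribution form; a minimal sketch, with the sample size and success count chosen arbitrarily:

        from scipy.stats import beta

        def clopper_pearson(x, n, alpha=0.05):
            """Two-sided (1 - alpha) exact confidence interval for p = x/n."""
            lower = beta.ppf(alpha / 2, x, n - x + 1) if x > 0 else 0.0
            upper = beta.ppf(1 - alpha / 2, x + 1, n - x) if x < n else 1.0
            return lower, upper

        # E.g., 92 detectors passing out of 100 tested:
        lo, hi = clopper_pearson(92, 100)
        print(f"95% CI for success probability: ({lo:.3f}, {hi:.3f})")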

  15. An improved Bayesian matting method based on image statistic characteristics

    Science.gov (United States)

    Sun, Wei; Luo, Siwei; Wu, Lina

    2015-03-01

    Image matting is an important task in image and video editing and has been studied for more than 30 years. In this paper we propose an improved interactive matting method. Starting from a coarse user-guided trimap, we first perform a color estimation based on texture and color information and use the result to refine the original trimap. Then, with the new trimap, we apply a soft matting process, an improved Bayesian matting with smoothness constraints. Experimental results on natural images show that this method is useful, especially for images that have similar texture features in the background or for which it is hard to give a precise trimap.

  16. Statistical methods in interphase cytogenetics: an experimental approach.

    Science.gov (United States)

    Kibbelaar, R E; Kok, F; Dreef, E J; Kleiverda, J K; Cornelisse, C J; Raap, A K; Kluin, P M

    1993-10-01

    In situ hybridization (ISH) techniques on interphase cells, or interphase cytogenetics, have powerful potential clinical and biological applications, such as detection of minimal residual disease, early relapse, and the study of clonal evolution and expansion in neoplasia. Much attention has been paid to issues related to ISH data acquisition, i.e., the numbers, colors, intensities, and spatial relationships of hybridization signals. The methodology concerning data analysis, which is of prime importance for clinical applications, however, is less well investigated. We have studied the latter for the detection of small monosomic and trisomic cell populations using various mixtures of human female and male cells. With a chromosome X specific probe, the male cells simulated monosomic subpopulations of 0, 1, 5, 10, 50, 90, 95, 99, and 100%. Analogously, when a (7 + Y) specific probe combination was used, containing a mixture of chromosome No. 7 and Y-specific DNA, the male cells simulated trisomic cell populations. Probes specific for chromosomes Nos. 1, 7, 8, and 9 were used for estimation of ISH artifacts. Three statistical tests, the Kolmogorov-Smirnov test, the multiple-proportion test, and the z'-max test, were applied to the empirical data using the control data as a reference for ISH artifacts. The Kolmogorov-Smirnov test was found to be inferior for discrimination of small monosomic or trisomic cell populations. The other two tests showed that when 400 cells were evaluated, and using selected control probes, monosomy X could be detected at a frequency of 5% aberrant cells, and trisomy 7 + Y at a frequency of 1%.(ABSTRACT TRUNCATED AT 250 WORDS)
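
    A simplified analogue of the proportion testing described above: a one-sided binomial test of whether the observed fraction of single-signal cells exceeds an assumed ISH artifact rate. The counts are hypothetical, and this single test only stands in for the paper's multiple-proportion and z'-max machinery.

        from scipy.stats import binomtest

        artifact_rate = 0.02   # assumed ISH artifact frequency from control probes
        n_cells = 400          # cells evaluated, as in the study design
        n_monosomic = 20       # cells showing a single signal (5%)

        result = binomtest(n_monosomic, n_cells, p=artifact_rate,
                           alternative="greater")
        print(f"p-value = {result.pvalue:.4f}")  # small p -> genuine monosomic clone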

  17. CAPABILITY ASSESSMENT OF MEASURING EQUIPMENT USING STATISTIC METHOD

    Directory of Open Access Journals (Sweden)

    Pavel POLÁK

    2014-10-01

    Full Text Available Capability assessment of the measurement device is one of the methods of process quality control. Only if the measurement device is capable can the capability of the measurement process, and consequently of the production process, be assessed. This paper deals with assessment of the capability of the measuring device using the indices Cg and Cgk.
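
    A sketch of the indices named above under one common convention, in which Cg compares 20% of the tolerance T to the gauge spread and Cgk additionally penalizes bias against a reference value; conventions differ between guidelines, so the constants below are assumptions.

        import numpy as np

        def gauge_capability(measurements, tolerance, reference):
            s = np.std(measurements, ddof=1)          # gauge repeatability spread
            bias = abs(np.mean(measurements) - reference)
            cg = (0.2 * tolerance) / (6 * s)          # spread-only index
            cgk = (0.1 * tolerance - bias) / (3 * s)  # spread-plus-bias index
            return cg, cgk

        # Repeated measurements of one reference part (invented data).
        repeats = np.array([10.02, 10.01, 10.03, 9.99, 10.02, 10.00, 10.01, 10.02])
        cg, cgk = gauge_capability(repeats, tolerance=0.5, reference=10.00)
        print(f"Cg = {cg:.2f}, Cgk = {cgk:.2f}  (>= 1.33 is a typical acceptance rule)")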

  18. Statistical tests for equal predictive ability across multiple forecasting methods

    DEFF Research Database (Denmark)

    Borup, Daniel; Thyrsgaard, Martin

    …as well as non-stationarity of the data. We introduce two finite-sample corrections, leading to good size and power properties. We also provide a two-step Model Confidence Set-type decision rule for ranking the forecasting methods into sets of indistinguishable conditional predictive ability…

  19. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A.; van t Veld, Aart A.

    2012-01-01

    PURPOSE: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), …

  20. An Alternate Method for Computation of Transfer Function Matrix

    Directory of Open Access Journals (Sweden)

    Appukuttan K. K.

    2010-01-01

    Full Text Available A direct and simple numerical method is presented for calculating the transfer function matrix of a linear time-invariant multivariable system (A, B, C). The method is based on the matrix-determinant identity, and it involves operations with an auxiliary vector on the matrices. The method is computationally faster than the Leverrier and Danilevsky methods.
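
    The record does not spell out its algorithm, but a standard way to compute the transfer function matrix G(s) = C(sI - A)^(-1)B is the Leverrier-Faddeev recursion it benchmarks against; a minimal sketch:

        import numpy as np

        def transfer_function_matrix(A, B, C):
            # Numerator matrix coefficients (highest power of s first) and monic
            # denominator coefficients of G(s) = C (sI - A)^(-1) B.
            n = A.shape[0]
            N = np.eye(n)    # coefficient of s^(n-1) in adj(sI - A)
            den = [1.0]      # s^n + d1*s^(n-1) + ... + dn
            num = [C @ N @ B]
            for k in range(1, n + 1):
                d = -np.trace(A @ N) / k
                den.append(d)
                if k < n:
                    N = A @ N + d * np.eye(n)
                    num.append(C @ N @ B)
            return num, den

        # Example: a system with G(s) = 1 / (s^2 + 3s + 2).
        A = np.array([[0., 1.], [-2., -3.]])
        B = np.array([[0.], [1.]])
        C = np.array([[1., 0.]])
        num, den = transfer_function_matrix(A, B, C)
        print("numerator matrices:", [m.tolist() for m in num])  # [[[0.]], [[1.]]]
        print("denominator:", den)                               # [1.0, 3.0, 2.0]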

  1. A Review of the Statistical and Quantitative Methods Used to Study Alcohol-Attributable Crime

    Science.gov (United States)

    Fitterer, Jessica L.; Nelson, Trisalyn A.

    2015-01-01

    Modelling the relationship between alcohol consumption and crime generates new knowledge for crime prevention strategies. Advances in data, particularly data with spatial and temporal attributes, have led to a growing suite of applied methods for modelling. In support of alcohol and crime researchers we synthesized and critiqued existing methods of spatially and quantitatively modelling the effects of alcohol exposure on crime to aid method selection, and identify new opportunities for analysis strategies. We searched the alcohol-crime literature from 1950 to January 2014. Analyses that statistically evaluated or mapped the association between alcohol and crime were included. For modelling purposes, crime data were most often derived from generalized police reports, aggregated to large spatial units such as census tracts or postal codes, and standardized by residential population data. Sixty-eight of the 90 selected studies included geospatial data, of which 48 used cross-sectional datasets. Regression was the prominent modelling choice (n = 78), though depending on the data many variations existed. There are opportunities to improve information for alcohol-attributable crime prevention by using alternative population data to standardize crime rates, sourcing crime information from non-traditional platforms (social media), increasing the number of panel studies, and conducting analysis at the local level (neighbourhood, block, or point). Due to the spatio-temporal advances in crime data, we expect a continued uptake of flexible Bayesian hierarchical modelling, a greater inclusion of spatial-temporal point pattern analysis, and a shift toward prospective (forecast) modelling over small areas (e.g., blocks). PMID:26418016


  3. Students' Attitudes toward Statistics across the Disciplines: A Mixed-Methods Approach

    Science.gov (United States)

    Griffith, James D.; Adams, Lea T.; Gu, Lucy L.; Hart, Christian L.; Nichols-Whitehead, Penney

    2012-01-01

    Students' attitudes toward statistics were investigated using a mixed-methods approach including a discovery-oriented qualitative methodology among 684 undergraduate students across business, criminal justice, and psychology majors where at least one course in statistics was required. Students were asked about their attitudes toward statistics and…

  4. Counting Better? An Examination of the Impact of Quantitative Method Teaching on Statistical Anxiety and Confidence

    Science.gov (United States)

    Chamberlain, John Martyn; Hillier, John; Signoretta, Paola

    2015-01-01

    This article reports the results of research concerned with students' statistical anxiety and confidence to both complete and learn to complete statistical tasks. Data were collected at the beginning and end of a quantitative method statistics module. Students recognised the value of numeracy skills but felt they were not necessarily relevant for…


  6. Practical methods for generating alternating magnetic fields for biomedical research

    Science.gov (United States)

    Christiansen, Michael G.; Howe, Christina M.; Bono, David C.; Perreault, David J.; Anikeeva, Polina

    2017-08-01

    Alternating magnetic fields (AMFs) cause magnetic nanoparticles (MNPs) to dissipate heat while leaving surrounding tissue unharmed, a mechanism that serves as the basis for a variety of emerging biomedical technologies. Unfortunately, the challenges and costs of developing experimental setups commonly used to produce AMFs with suitable field amplitudes and frequencies present a barrier to researchers. This paper first presents a simple, cost-effective, and robust alternative for small AMF working volumes that uses soft ferromagnetic cores to focus the flux into a gap. As the experimental length scale increases to accommodate animal models (working volumes of 100s of cm3 or greater), poor thermal conductivity and volumetrically scaled core losses render that strategy ineffective. Comparatively feasible strategies for these larger volumes instead use low loss resonant tank circuits to generate circulating currents of 1 kA or greater in order to produce the comparable field amplitudes. These principles can be extended to the problem of identifying practical routes for scaling AMF setups to humans, an infrequently acknowledged challenge that influences the extent to which many applications of MNPs may ever become clinically relevant.

  7. Developing TOPSIS method using statistical normalization for selecting knowledge management strategies

    Directory of Open Access Journals (Sweden)

    Amin Zadeh Sarraf

    2013-09-01

    Full Text Available Purpose: Numerous companies expect their knowledge management (KM) to perform effectively in order to leverage and transform knowledge into competitive advantages. However, this raises the critical issue of how companies can better evaluate and select a favorable KM strategy prior to a successful KM implementation. Design/methodology/approach: An extension of TOPSIS, a multi-attribute decision making (MADM) technique, to a group decision environment is investigated. TOPSIS is a practical and useful technique for ranking and selecting among a number of externally determined alternatives through distance measures. The entropy method is often used for assessing weights in the TOPSIS method. Entropy in information theory is a criterion used for measuring the amount of disorder represented by a discrete probability distribution. To reduce employees' resistance to implementing a new strategy, it seems necessary to take the opinions of all managers into account. The normal distribution, the most prominent probability distribution in statistics, is used to normalize the gathered data. Findings: The results of this study show that, considering six criteria for evaluating the alternatives, the most appropriate KM strategy to implement in our company was "Personalization". Research limitations/implications: In this research, there are some assumptions that might affect the accuracy of the approach, such as the normal distribution of the sample and community. These assumptions can be changed in future work. Originality/value: This paper proposes an effective solution based on a combined entropy and TOPSIS approach to help companies that need to evaluate and select KM strategies. In the proposed solution, the opinions of all managers are gathered and normalized using the standard normal distribution and the central limit theorem. Keywords: knowledge management; strategy; TOPSIS; normal distribution; entropy
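
    A compact sketch of the entropy-weighted TOPSIS ranking outlined above; the decision matrix is invented, all criteria are treated as benefit criteria, and the paper's normal-distribution aggregation of managers' opinions is omitted for brevity.

        import numpy as np

        X = np.array([[7., 9., 9., 8.],   # alternatives (rows) x criteria (columns)
                      [8., 7., 8., 7.],
                      [9., 6., 8., 9.]])

        # Entropy weights: criteria with more dispersion get more weight.
        P = X / X.sum(axis=0)
        E = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
        w = (1 - E) / (1 - E).sum()

        # Weighted, vector-normalized matrix; every criterion treated as a benefit.
        V = w * X / np.sqrt((X ** 2).sum(axis=0))
        ideal, anti = V.max(axis=0), V.min(axis=0)

        d_plus = np.linalg.norm(V - ideal, axis=1)   # distance to ideal point
        d_minus = np.linalg.norm(V - anti, axis=1)   # distance to anti-ideal point
        closeness = d_minus / (d_plus + d_minus)
        print("ranking, best first:", np.argsort(-closeness))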

  8. The Case Survey and Alternative Methods for Research Aggregation

    Science.gov (United States)

    1974-06-01

    knowledge acquisition. As the knowledge base grows, it will cumulate and patterns will emerge that will provide a broader understanding of social life. … Studies of Follow-Through programs were found to predominate in one set of pooled data with common statistical characteristics, and studies of Montessori … could be made compatible. … and other mundane difficulties have a tremendous capacity to consume time and energy. …

  9. Alternative microbial methods: An overview and selection criteria.

    NARCIS (Netherlands)

    Jasson, V.; Jacxsens, L.; Luning, P.A.; Rajkovic, A.; Uyttendaele, M.

    2010-01-01

    This study provides an overview and criteria for the selection of a method, other than the reference method, for microbial analysis of foods. In a first part an overview of the general characteristics of rapid methods available, both for enumeration and detection, is given with reference to relevant


  11. Experimental Data Mining Techniques (Using Multiple Statistical Methods)

    Directory of Open Access Journals (Sweden)

    Mustafa Zaidi

    2012-05-01

    Full Text Available This paper discusses possible solutions to non-linear multivariable problems using experimental data mining techniques based on an orthogonal array. The Taguchi method is a very useful technique for reducing the time and cost of experiments, but it ignores all interaction effects. Because its results were not encouraging, the laser cutting process, a non-linear multivariable problem, was also modeled by one-way and two-way analysis of variance and by linear and non-linear regression analysis. These techniques are used to explore better analysis approaches and to improve laser cutting quality by reducing process variations caused by controllable process parameters. The size of the data set causes difficulties in modeling and simulation of the problem; for instance, a decision tree is a useful technique but was not able to predict better results here. The results of the analysis of variance are encouraging. Taguchi and regression normally optimize input process parameters for a single characteristic.

  12. Refining developmental coordination disorder subtyping with multivariate statistical methods

    Directory of Open Access Journals (Sweden)

    Lalanne Christophe

    2012-07-01

    Full Text Available Abstract Background With a large number of potentially relevant clinical indicators, penalization and ensemble learning methods are thought to provide better predictive performance than usual linear predictors. However, little is known about how they perform in clinical studies where few cases are available. We used Random Forests and Partial Least Squares Discriminant Analysis to select the most salient impairments in Developmental Coordination Disorder (DCD) and assess patients' similarity. Methods We considered a wide-ranging testing battery for various neuropsychological and visuo-motor impairments, which aimed at characterizing subtypes of DCD in a sample of 63 children. Classifiers were optimized on a training sample, and they were used subsequently to rank the 49 items according to a permuted measure of variable importance. In addition, subtyping consistency was assessed with cluster analysis on the training sample. Clustering fitness and predictive accuracy were evaluated on the validation sample. Results Both classifiers yielded a relevant subset of item impairments that altogether accounted for a sharp discrimination between three DCD subtypes: ideomotor, visual-spatial and constructional, and mixed dyspraxia. The main impairments that were found to characterize the three subtypes were: digital perception, imitation of gestures, digital praxia, lego blocks, visual-spatial structuration, visual-motor integration, and coordination between upper and lower limbs. Classification accuracy was above 90% for all classifiers, and clustering fitness was found to be satisfactory. Conclusions Random Forests and Partial Least Squares Discriminant Analysis are useful tools to extract salient features from a large pool of correlated binary predictors, and they also provide a way to assess individuals' proximities in a reduced factor space. Fewer than 15 neuro-visual, neuro-psychomotor and neuro-psychological tests might be required to provide a sensitive and…
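
    A sketch of the Random Forest half of this workflow using scikit-learn's permutation importance to rank items; the synthetic 63-by-49 binary data merely stand in for the study's (non-public) test battery.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.inspection import permutation_importance

        rng = np.random.default_rng(3)
        X = rng.integers(0, 2, size=(63, 49))  # 63 children x 49 binary items
        y = rng.integers(0, 3, size=63)        # 3 putative DCD subtypes

        rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
        imp = permutation_importance(rf, X, y, n_repeats=20, random_state=0)
        top = np.argsort(-imp.importances_mean)[:15]  # ~15 most salient items
        print("top items:", top)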

  13. Predicting sulphur and nitrogen deposition using a simple statistical method

    Science.gov (United States)

    Oulehle, Filip; Kopáček, Jiří; Chuman, Tomáš; Černohous, Vladimír; Hůnová, Iva; Hruška, Jakub; Krám, Pavel; Lachmanová, Zora; Navrátil, Tomáš; Štěpánek, Petr; Tesař, Miroslav; Evans, Christopher D.

    2016-09-01

    Data from 32 long-term (1994-2012) monitoring sites were used to assess temporal development and spatial variability of sulphur (S) and inorganic nitrogen (N) concentrations in bulk precipitation, and S in throughfall, for the Czech Republic. Despite large variance in absolute S and N concentration/deposition among sites, temporal coherence using standardised data (Z score) was demonstrated. Overall significant declines of SO4 concentration in bulk and throughfall precipitation, as well as NO3 and NH4 concentration in bulk precipitation, were observed. Median Z score values of bulk SO4, NO3 and NH4 and throughfall SO4 derived from observations and the respective emission rates of SO2, NOx and NH3 in the Czech Republic and Slovakia showed highly significant correlations. Z score values were calculated for the whole period 1900-2012 and then back-transformed to give estimates of concentration for the individual sites. Uncertainty associated with the concentration calculations was estimated as 20% for SO4 bulk precipitation, 22% for throughfall SO4, 18% for bulk NO3 and 28% for bulk NH4. The application of the method suggested that it is effective in the long-term reconstruction and prediction of S and N deposition at a variety of sites. Multiple regression modelling was used to extrapolate site characteristics (mean precipitation chemistry and its standard deviation) from monitored to unmonitored sites. Spatially distributed temporal development of S and N deposition was calculated from 1900 onwards. The method allows spatio-temporal estimation of the acid deposition in regions with extensive monitoring of precipitation chemistry.
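
    A sketch of the standardization idea described above: convert each site's series to Z scores, take the median across sites, regress it on emissions, and back-transform to a site's concentration scale. All data below are invented.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        years = np.arange(1994, 2013)
        emissions = np.linspace(1.8, 0.6, years.size)   # invented SO2 emissions
        sites = 2.0 * emissions + rng.normal(0, 0.15, (32, years.size))

        # Standardize each site's series, then take the median Z score per year.
        z = (sites - sites.mean(axis=1, keepdims=True)) / sites.std(axis=1, keepdims=True)
        median_z = np.median(z, axis=0)

        slope, intercept, r, p, se = stats.linregress(emissions, median_z)
        print(f"r = {r:.2f}, p = {p:.1e}")

        # Back-transform emission-predicted Z scores to site 0's concentration scale.
        z_hat = intercept + slope * emissions
        conc_hat = sites[0].mean() + z_hat * sites[0].std()
        print("reconstructed site-0 concentrations:", np.round(conc_hat[:3], 2))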

  14. Alternative and Efficient Extraction Methods for Marine-Derived Compounds

    Directory of Open Access Journals (Sweden)

    Clara Grosso

    2015-05-01

    Full Text Available Marine ecosystems cover more than 70% of the globe’s surface. These habitats are occupied by a great diversity of marine organisms that produce highly structural diverse metabolites as a defense mechanism. In the last decades, these metabolites have been extracted and isolated in order to test them in different bioassays and assess their potential to fight human diseases. Since traditional extraction techniques are both solvent- and time-consuming, this review emphasizes alternative extraction techniques, such as supercritical fluid extraction, pressurized solvent extraction, microwave-assisted extraction, ultrasound-assisted extraction, pulsed electric field-assisted extraction, enzyme-assisted extraction, and extraction with switchable solvents and ionic liquids, applied in the search for marine compounds. Only studies published in the 21st century are considered.


  16. Statistical methods used in the public health literature and implications for training of public health professionals.

    Science.gov (United States)

    Hayat, Matthew J; Powell, Amanda; Johnson, Tessa; Cadwell, Betsy L

    2017-01-01

    Statistical literacy and knowledge are needed to read and understand the public health literature. The purpose of this study was to quantify basic and advanced statistical methods used in public health research. We randomly sampled 216 published articles from seven top-tier general public health journals. Studies were reviewed by two readers and a standardized data collection form was completed for each article. Data were analyzed with descriptive statistics and frequency distributions. Results were summarized for statistical methods used in the literature, including descriptive and inferential statistics, modeling, advanced statistical techniques, and statistical software used. Approximately 81.9% of articles reported an observational study design and 93.1% of articles were substantively focused. Descriptive statistics in table or graphical form were reported in more than 95% of the articles, and statistical inference was reported in more than 76% of the studies reviewed. These results reveal the types of statistical methods currently used in the public health literature. Although this study did not obtain information on what should be taught, information on the statistical methods being used is useful for curriculum development in graduate health sciences education, as well as for making informed decisions about continuing education for public health professionals.

  17. Testing the rate isomorphy hypothesis using five statistical methods

    Institute of Scientific and Technical Information of China (English)

    Xian-Ju Kuang; Megha N. Parajulee2+,; Pei-Jian Shi; Feng Ge; Fang-Sen Xue

    2012-01-01

    Organisms are said to be in developmental rate isomorphy when the proportions of developmental stage durations are unaffected by temperature. Comprehensive stage-specific developmental data were generated for the cabbage beetle, Colaphellus bowringi Baly (Coleoptera: Chrysomelidae), at eight temperatures ranging from 16℃ to 30℃ (in 2℃ increments), and five analytical methods were used to test the rate isomorphy hypothesis: (i) direct comparison of lower developmental thresholds with standard errors, based on the traditional linear equation describing developmental rate as a linear function of temperature; (ii) analysis of covariance to compare the lower developmental thresholds of different stages, based on the Ikemoto-Takai linear equation; (iii) testing the significance of the slope in the regression of arcsin(√p) versus temperature, where p is the ratio of the developmental duration of a particular developmental stage to the entire pre-imaginal developmental duration for one insect or mite species; (iv) analysis of variance to test for significant differences between the ratios of developmental stage durations to that of pre-imaginal development; and (v) checking whether there is an element less than a given level of significance in the p-value matrix of the rotating regression line. The results revealed no significant difference among the lower developmental thresholds or among the aforementioned ratios, and thus convincingly confirmed the rate isomorphy hypothesis.
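
    A sketch of test (iii) from the list above: regress arcsin(√p) on temperature and test whether the slope differs from zero, a non-significant slope being consistent with rate isomorphy; the stage-duration ratios are invented.

        import numpy as np
        from scipy import stats

        temps = np.arange(16, 31, 2)   # 16-30 degC in 2 degC steps
        p_ratio = np.array([0.21, 0.20, 0.22, 0.21, 0.20, 0.21, 0.22, 0.21])
        # ratio of one stage's duration to total pre-imaginal duration per temperature

        y = np.arcsin(np.sqrt(p_ratio))   # variance-stabilizing transform
        slope, intercept, r, p_value, se = stats.linregress(temps, y)
        print(f"slope = {slope:.4f} (p = {p_value:.2f})")  # large p supports isomorphy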

  18. Statistical methods for the forensic analysis of striated tool marks

    Energy Technology Data Exchange (ETDEWEB)

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.

  19. Development of Alternative Methods for Determining Soil Organic Matter

    Directory of Open Access Journals (Sweden)

    Diego Mendes de Souza

    2016-01-01

    Full Text Available ABSTRACT Soil organic matter (SOM) is important to fertility, since it performs several functions such as nutrient cycling, water and nutrient retention and soil aggregation, in addition to being an energy source for biological activity. This study proposes modifications to the Embrapa, Walkley-Black, and Mebius methods that allow the determination of SOM by spectrophotometry, increasing functionality. The sample mass of 500 mg was reduced to 200 mg, generating a mean saving of 60% in reagents and a 91% decrease in the volume of residue generated for the three methods, without compromising accuracy and precision. We were able to optimize conditions for the Mebius method and establish the digestion time of maximum SOM recovery by factorial design and response surface analysis. The methods were validated by estimating figures of merit. Among the methods investigated, the optimized Mebius method was best suited for determining SOM, showing near 100% recovery.

  20. A critical review of alternate methods of Purex solvent treatment

    Energy Technology Data Exchange (ETDEWEB)

    Gordon, N.R.

    1957-09-20

    This document is a HAPO report dated September 20, 1957. At the time of this report, many different methods had been suggested for improving the Purex organic recovery system. Some of these suggestions had been tried in the plant and others had not. Information concerning this system and, particularly, methods for improving it was widespread. At the time of this report there was no immediate source of information available on the proposed and investigated methods for improving the Purex solvent recovery system.

  1. A new method of studying the statistical properties of speckle phase

    Institute of Scientific and Technical Information of China (English)

    Qiankai Wang

    2009-01-01

    A new theoretical method with generality is proposed to study the statistical properties of the speckle phase. The general expression for the standard deviation of the speckle phase at the level of first-order statistics is derived from the relation between the phase and the complex speckle amplitude. The statistical properties of the speckle phase in diffraction fields are then studied with this new theoretical method.

  2. Axial electron channeling statistical method of site occupancy determination

    Institute of Scientific and Technical Information of China (English)

    YE; Jia

    2001-01-01


  3. The Sine Method: An Alternative Height Measurement Technique

    Science.gov (United States)

    Don C. Bragg; Lee E. Frelich; Robert T. Leverett; Will Blozan; Dale J. Luthringer

    2011-01-01

    Height is one of the most important dimensions of trees, but few observers are fully aware of the consequences of the misapplication of conventional height measurement techniques. A new approach, the sine method, can improve height measurement by being less sensitive to the requirements of conventional techniques (similar triangles and the tangent method). We studied...

  4. Evaluating an alternative method for rapid urinary creatinine determination

    Science.gov (United States)

    Creatinine (CR) is an endogenously produced chemical routinely assayed in urine specimens to assess kidney function and sample dilution. The industry-standard method for CR determination, known as the kinetic Jaffe (KJ) method, relies on an exponential rate of a colorimetric change,...

  5. Effectiveness of Alternative Extension Methods through Radio Broadcasting in West Africa

    Science.gov (United States)

    Moussa, Bokar; Otoo, Miriam; Fulton, Joan; Lowenberg-DeBoer, James

    2011-01-01

    There is an urgent need to quantify which extension methods are most effective in Africa. The objective of this study was to determine the impact of alternative extension methods on adoption of the triple bagging cowpea storage technology in Niger and Burkina Faso. This study was designed as a quasi-experiment with two alternative extension…

  6. 76 FR 50221 - International Workshop on Alternative Methods for Human and Veterinary Rabies Vaccine Testing...

    Science.gov (United States)

    2011-08-12

    ... HUMAN SERVICES International Workshop on Alternative Methods for Human and Veterinary Rabies Vaccine... ``International Workshop on Alternative Methods for Human and Veterinary Rabies Vaccine Testing: State of the... approaches that may reduce, refine, or replace animal use in human and veterinary rabies vaccine...

  7. 77 FR 17457 - Work Group on Alternative Test Methods for Commercial Measuring Devices

    Science.gov (United States)

    2012-03-26

    ... National Institute of Standards and Technology Work Group on Alternative Test Methods for Commercial...: The National Institute of Standards and Technology (NIST) is forming a Work Group (WG) to examine alternative methods for testing the accuracy of commercial measuring devices including, but not limited...

  8. 76 FR 65382 - Regulation of Fuel and Fuel Additives: Alternative Test Method for Olefins in Gasoline

    Science.gov (United States)

    2011-10-21

    ... Olefins in Gasoline AGENCY: Environmental Protection Agency (EPA). ACTION: Final rule. SUMMARY: The... alternative test method for olefin content in gasoline. This final rule will provide flexibility to the... to me? II. Rule Change A. Alternative Test Method for Olefins in Gasoline III. Statutory...


  10. Relationship between Students' Scores on Research Methods and Statistics, and Undergraduate Project Scores

    Science.gov (United States)

    Ossai, Peter Agbadobi Uloku

    2016-01-01

    This study examined the relationship between students' scores on Research Methods and Statistics and their undergraduate project scores in the final year. The purpose was to find out whether students matched knowledge of research with project-writing skill. The study adopted an ex post facto correlational design. Scores on Research Methods and Statistics for…

  11. The Playground Game: Inquiry‐Based Learning About Research Methods and Statistics

    NARCIS (Netherlands)

    Westera, Wim; Slootmaker, Aad; Kurvers, Hub

    2014-01-01

    The Playground Game is a web-based game that was developed for teaching research methods and statistics to nursing and social sciences students in higher education and vocational training. The complexity and abstract nature of research methods and statistics poses many challenges for students. The P

  12. APA's Learning Objectives for Research Methods and Statistics in Practice: A Multimethod Analysis

    Science.gov (United States)

    Tomcho, Thomas J.; Rice, Diana; Foels, Rob; Folmsbee, Leah; Vladescu, Jason; Lissman, Rachel; Matulewicz, Ryan; Bopp, Kara

    2009-01-01

    Research methods and statistics courses constitute a core undergraduate psychology requirement. We analyzed course syllabi and faculty self-reported coverage of both research methods and statistics course learning objectives to assess the concordance with APA's learning objectives (American Psychological Association, 2007). We obtained a sample of…

  13. Best Practices in Teaching Statistics and Research Methods in the Behavioral Sciences [with CD-ROM

    Science.gov (United States)

    Dunn, Dana S., Ed.; Smith, Randolph A., Ed.; Beins, Barney, Ed.

    2007-01-01

    This book provides a showcase for "best practices" in teaching statistics and research methods in two- and four-year colleges and universities. A helpful resource for teaching introductory, intermediate, and advanced statistics and/or methods, the book features coverage of: (1) ways to integrate these courses; (2) how to promote ethical conduct;…


  15. Methods Matter: Tracking Health Disparities in Alternative High Schools.

    Science.gov (United States)

    Johnson, Karen E; Goyal, Mohit; Simonton, Amanda J; Richardson, Rebecca; Morris, Marian; Rew, Lynn

    2017-05-01

    Alternative high school (AHS) students are at risk for school dropout and engage in high levels of health-risk behaviors that should be monitored over time. They are excluded from most public health surveillance efforts (e.g., the Youth Risk Behavior Survey; YRBS), hindering our ability to monitor health disparities and allocate scarce resources to the areas of greatest need. Using active parental consent, we recruited 515 students from 14 AHSs in Texas to take a modified YRBS. We calculated three different participation rates and tracked participation by age of legal consent (≥18 and <18 years). … Among students, cooperation rates may be more accurate than participation rates based on enrollment or attendance. Requiring active consent and not having accurate participation rates may result in surveillance data that are of disparate quality. This threatens to mask the needs of AHS students and perpetuate disparities, because we are likely missing the highest-risk students within a high-risk sample and cannot generalize findings. © 2017 Wiley Periodicals, Inc.

  16. Alternative Methods to Treat Nausea and Vomiting from Cancer Chemotherapy

    Directory of Open Access Journals (Sweden)

    Mohammad Ali Sheikhi

    2015-01-01

    Full Text Available Chemotherapy-Induced Nausea and Vomiting (CINV) is among the most distressing side effects and critical concerns for patients with cancer. Most of these patients experience nausea and vomiting after chemotherapy. Sometimes this is so severe that it may prevent them from continuing the therapy. With recent advances, a variety of therapeutic methods have been developed and applied to control CINV. The main methods include medicinal therapy, relaxation, and herbal therapy. Among them, using dexamethasone together with massage therapy and ginger has been identified as the most effective method.

  17. Alternative method for reconstruction of antihydrogen annihilation vertices

    Science.gov (United States)

    Amole, C.; Ashkezari, M. D.; Andresen, G. B.; Baquero-Ruiz, M.; Bertsche, W.; Bowe, P. D.; Butler, E.; Cesar, C. L.; Chapman, S.; Charlton, M.; Deller, A.; Eriksson, S.; Fajans, J.; Friesen, T.; Fujiwara, M. C.; Gill, D. R.; Gutierrez, A.; Hangst, J. S.; Hardy, W. N.; Hayano, R. S.; Hayden, M. E.; Humphries, A. J.; Hydomako, R.; Jonsell, S.; Kurchaninov, L.; Madsen, N.; Menary, S.; Nolan, P.; Olchanski, K.; Olin, A.; Povilus, A.; Pusa, P.; Robicheaux, F.; Sarid, E.; Silveira, D. M.; So, C.; Storey, J. W.; Thompson, R. I.; van der Werf, D. P.; Wurtele, J. S.; Yamazaki, Y.

    The ALPHA experiment, located at CERN, aims to compare the properties of antihydrogen atoms with those of hydrogen atoms. The neutral antihydrogen atoms are trapped using an octupole magnetic trap. The trap region is surrounded by a three layered silicon detector used to reconstruct the antiproton annihilation vertices. This paper describes a method we have devised that can be used for reconstructing annihilation vertices with a good resolution and is more efficient than the standard method currently used for the same purpose.


  19. Feasibility and acceptability of alternate methods of postnatal data collection.

    Science.gov (United States)

    McCormack, Lacey A; Friedrich, Christa; Fahrenwald, Nancy; Specker, Bonny

    2014-05-01

    This study was done in preparation for the launch of the National Children's Study (NCS) main study. The goal of this study was to examine the feasibility (completion rates and completeness of data), acceptability, staff time and cost-effectiveness of three methods of data collection for the postnatal 3- and 9-month questionnaires completed as part of the NCS protocol. Eligible NCS participants who were scheduled to complete a postnatal questionnaire at three and nine months were randomly assigned to receive either: (a) telephone data collection, (b) web-based data collection, or (c) self-administered (mailed) questionnaires. Event completion rates and satisfaction across the three data collection methods were compared, and the influence of socio-demographic factors on completion and satisfaction rates was examined. Cost data were compared to data for completion and satisfaction for each of the delivery methods. Completion rates and satisfaction did not differ significantly by method, but completeness of data did, with the odds of complete data higher for web than for phone collection; no other differences among the data collection methods were seen. Mail and phone data collection were the least complete of the three methods and were the most expensive. Mailed data collection was neither complete nor exceptionally economical. Web-based data collection was the least costly and provided the most complete data. Participants without web access could complete the questionnaire over the phone.

  20. An alternative method for the measurement of the mechanical impulse of a vertically directed blast

    CSIR Research Space (South Africa)

    Turner, GR

    2008-01-01

    Full Text Available An alternative method for the measurement of the total mechanical impulse of a vertically directed blast due to an explosive charge is presented. The method differs from apparatus that employ a vertically displaced mass (similar in principle...

  1. Special section: Statistical methods for next-generation gene sequencing data

    OpenAIRE

    Kafadar, Karen

    2012-01-01

    This issue includes six articles that develop and apply statistical methods for the analysis of gene sequencing data of different types. The methods are tailored to the different data types and, in each case, lead to biological insights not readily identified without the use of statistical methods. A common feature in all articles is the development of methods for analyzing simultaneously data of different types (e.g., genotype, phenotype, pedigree, etc.); that is, using data of one type to i...

  2. A novel method to assist the detection of the Cyclic Alternating Pattern (CAP).

    Science.gov (United States)

    Tenorio, J M; Alba, A; Mendez, M O; Bianchi, A M; Grassi, A; Arce-Santana, E; Chouvarda, I; Mariani, S; Rosso, V; Terzano, M G; Parrino, L

    2012-01-01

    This study proposes a novel method to assist the detection of the components that build up the Cyclic Alternating Pattern (CAP). CAP is a sleep phenomenon formed by consecutive sequences of activations (A1, A2, A3) and non-activations during nonREM sleep. The main importance of CAP evaluation is the possibility of defining the sleep process more accurately. Ten recordings from healthy and good sleepers were included in this study. The method is based on inferential statistics to define the initial and ending points of the CAP components based only on an initialization point given by the expert. The results show concordance up to 95% for A1, 85% for A2 and 60% for A3, together with an overestimation of 1.5 s in A1, 1.3 s in A2 and 0 s in A3. The total CAP rate presents a total underestimation of 7 min. Those results suggest that the method is able to accurately detect the initial and ending points of the activations, and may be helpful for the physicians by reducing the time dedicated to the manual inspection task.

  3. Alternative sample preparation methods for MALDI-MS

    Energy Technology Data Exchange (ETDEWEB)

    Hurst, G.B.; Buchanan, M.V. [Oak Ridge National Lab., TN (United States); Czartoski, T.J. [Kenyon College, Gambier, OH (United States)

    1994-12-31

    Since the introduction of matrix-assisted laser desorption and ionization (MALDI), sample preparation has been a limiting step in the applicability of this important technique for mass spectrometric analysis of biomolecules. A number of variations on the original sample preparation method have been described. The "standard" method of MALDI sample preparation requires mixing a solution containing the analyte and a large excess of matrix, and allowing a small volume of this solution to dry on a probe tip before insertion into the mass spectrometer. The resulting sample can be fairly inhomogeneous. As a result, the process of aiming the desorption laser at a favorable spot on the dried sample can be tedious and time-consuming. The authors are evaluating several approaches to MALDI sample preparation, with the goal of developing a faster and more reproducible method.

  4. Methods for estimating selected low-flow statistics and development of annual flow-duration statistics for Ohio

    Science.gov (United States)

    Koltun, G.F.; Kula, Stephanie P.

    2013-01-01

    This report presents the results of a study to develop methods for estimating selected low-flow statistics and for determining annual flow-duration statistics for Ohio streams. Regression techniques were used to develop equations for estimating 10-year recurrence-interval (10-percent annual-nonexceedance probability) low-flow yields, in cubic feet per second per square mile, with averaging periods of 1, 7, 30, and 90-day(s), and for estimating the yield corresponding to the long-term 80-percent duration flow. These equations, which estimate low-flow yields as a function of a streamflow-variability index, are based on previously published low-flow statistics for 79 long-term continuous-record streamgages with at least 10 years of data collected through water year 1997. When applied to the calibration dataset, average absolute percent errors for the regression equations ranged from 15.8 to 42.0 percent. The regression results have been incorporated into the U.S. Geological Survey (USGS) StreamStats application for Ohio (http://water.usgs.gov/osw/streamstats/ohio.html) in the form of a yield grid to facilitate estimation of the corresponding streamflow statistics in cubic feet per second. Logistic-regression equations also were developed and incorporated into the USGS StreamStats application for Ohio for selected low-flow statistics to help identify occurrences of zero-valued statistics. Quantiles of daily and 7-day mean streamflows were determined for annual and annual-seasonal (September–November) periods for each complete climatic year of streamflow-gaging station record for 110 selected streamflow-gaging stations with 20 or more years of record. The quantiles determined for each climatic year were the 99-, 98-, 95-, 90-, 80-, 75-, 70-, 60-, 50-, 40-, 30-, 25-, 20-, 10-, 5-, 2-, and 1-percent exceedance streamflows. Selected exceedance percentiles of the annual-exceedance percentiles were subsequently computed and tabulated to help facilitate consideration of the

  5. Conceptual Approaches to Alternate Methods in Toxicological Testing

    Directory of Open Access Journals (Sweden)

    Alan M. Goldberg

    1987-04-01

    Full Text Available Due to public pressure, attempts are being made to replace in vivo methods of toxicity testing with in vitro methods, such as cell and organ culture, computer modelling and modified LD50 tests using fewer animals. Specifically, in the case of the Draize eye irritancy test using rabbits, a number of refinements have been incorporated by different workers, mainly the use of a local anaesthetic, which reduces animal distress without vitiating the test results. The author recommends exploration of new avenues for testing based on the advances in cell biology.

  6. A chronicle of permutation statistical methods 1920–2000, and beyond

    CERN Document Server

    Berry, Kenneth J; Mielke Jr, Paul W

    2014-01-01

    The focus of this book is on the birth and historical development of permutation statistical methods from the early 1920s to the near present. Beginning with the seminal contributions of R.A. Fisher, E.J.G. Pitman, and others in the 1920s and 1930s, permutation statistical methods were initially introduced to validate the assumptions of classical statistical methods. Permutation methods have advantages over classical methods in that they are optimal for small data sets and non-random samples, are data-dependent, and are free of distributional assumptions. Permutation probability values may be exact, or estimated via moment- or resampling-approximation procedures. Because permutation methods are inherently computationally-intensive, the evolution of computers and computing technology that made modern permutation methods possible accompanies the historical narrative. Permutation analogs of many well-known statistical tests are presented in a historical context, including multiple correlation and regression, ana...

  7. How to choose the right statistical software?-a method increasing the post-purchase satisfaction.

    Science.gov (United States)

    Cavaliere, Roberto

    2015-12-01

    Nowadays, we live in the "data era", where the use of statistical or data analysis software is inevitable in any research field. This means that the choice of the right software tool or platform is a strategic issue for a research department. Nevertheless, in many cases decision makers do not pay enough attention to a comprehensive and appropriate evaluation of what the market offers. Indeed, the choice still often depends on a few factors, such as the researcher's personal inclination, e.g., which software has been used at the university or is already known. This is not wrong in principle, but in some cases it is not sufficient and might lead to a "dead end" situation, typically after months or years of investment in the wrong software. This article, far from being a full and complete guide to statistical software evaluation, aims to illustrate some key points of the decision process and to introduce an extended range of factors which can help in making the right choice, at least potentially. There is not enough literature on this topic, which is mostly underestimated, both in the traditional literature and in the so-called "gray literature", even if some documents or short pages can be found online. In any case, there seems to be no common, established standpoint on the process of software evaluation from the final user's perspective. We suggest a multi-factor analysis leading to an evaluation matrix tool, intended as a flexible and customizable tool, aimed at providing a clearer picture of the available software alternatives, not in the abstract but related to the researcher's own context and needs. This method is the result of about twenty years of the author's experience in evaluating and using technical-computing software, and partially arises from research made on such topics as part of a project funded by the European Commission under the Lifelong Learning Programme 2011.
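    A weighted evaluation matrix of the kind suggested above is straightforward to prototype. The following Python sketch is illustrative only; the criteria, weights, scores, and tool names are hypothetical placeholders that would come from a department's own requirements analysis.

        # Hypothetical criteria with weights summing to 1, and 1-5 scores per tool.
        criteria = {
            "statistical coverage": 0.30,
            "usability": 0.20,
            "license cost": 0.20,
            "support and training": 0.15,
            "integration": 0.15,
        }
        scores = {
            "Tool A": {"statistical coverage": 5, "usability": 3, "license cost": 2,
                       "support and training": 4, "integration": 4},
            "Tool B": {"statistical coverage": 4, "usability": 4, "license cost": 4,
                       "support and training": 3, "integration": 3},
        }

        # Weighted total per candidate; the highest score is the best fit
        # under these (context-dependent) weights.
        for tool, s in scores.items():
            total = sum(w * s[c] for c, w in criteria.items())
            print(f"{tool}: {total:.2f}")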

  8. Alternative correction equations in the Jacobi-Davidson method

    NARCIS (Netherlands)

    Genseberger, M.; Sleijpen, G.L.G.

    1998-01-01

    The correction equation in the Jacobi-Davidson method is effective in a subspace orthogonal to the current eigenvector approximation, whereas for the continuation of the process only vectors orthogonal to the search subspace are of importance. Such a vector is obtained by orthogonalizing the

  9. Alternative correction equations in the Jacobi-Davidson method

    NARCIS (Netherlands)

    Genseberger, M.; Sleijpen, G.L.G.

    2001-01-01

    The correction equation in the Jacobi-Davidson method is effective in a subspace orthogonal to the current eigenvector approximation, whereas for the continuation of the process only vectors orthogonal to the search subspace are of importance. Such a vector is obtained by orthogonalizing the (approx

  10. An alternative method for neonatal cerebro-myocardial perfusion

    Science.gov (United States)

    Luciani, Giovanni Battista; De Rita, Fabrizio; Faggian, Giuseppe; Mazzucco, Alessandro

    2012-01-01

    Several techniques have already been described for selective cerebral perfusion during repair of aortic arch pathology in children. One method combining cerebral with myocardial perfusion has also been proposed. A novel technique is reported here for selective and independent cerebro-myocardial perfusion for neonatal and infant arch surgery. Technical aspects and potential advantages are discussed. PMID:22307393

  11. STUDY OF SEASONAL TREND-PROCESS WITH THE METHOD OF CLASSICAL STATISTICS

    Directory of Open Access Journals (Sweden)

    Kymratova A. M.

    2014-11-01

    Full Text Available This work is devoted to methods of multicriteria optimization and classical statistics for obtaining preliminary estimates for time series that have long-term memory, whose levels therefore do not satisfy the independence property, so that classical prediction methods may be inadequate. The developed methods for obtaining such information are based on mathematical statistics, multicriteria optimization and extreme value theory. The effectiveness of the proposed approach has been demonstrated on specific time series of the flow volumes of mountain rivers.

  12. An Accelerated Linearized Alternating Direction Method of Multipliers

    Science.gov (United States)

    2014-02-01


  13. Alternative method for quantification of alfa-amylase activity

    Directory of Open Access Journals (Sweden)

    DF. Farias

    Full Text Available A modification of the sensitive agar diffusion method was developed for macro-scale determination of α-amylase. The proposed modifications lower costs through the use of starch as substrate and agar as supporting medium. A standard curve was built using α-amylase solution from Aspergillus oryzae, with concentrations ranging from 2.4 to 7,500 U.mL-1. Clear radial diffusion zones were measured after 4 hours of incubation at 20 °C. A linear relationship between the logarithm of enzyme activities and the area of clear zones was obtained. The method was validated by testing α-amylase from barley at concentrations of 2.4, 60, 300 and 1,500 U.mL-1. The proposed method turned out to be simpler, faster, less expensive and able to determine α-amylase on a macro-scale over a wide range (2.4 to 7,500 U.mL-1) in scientific investigation as well as in teaching laboratory activities.
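    The reported standard curve (log activity linear in clear-zone area) can be fitted and inverted in a few lines. The Python sketch below is a minimal illustration; the clear-zone area readings are invented placeholders, not the paper's data.

        import numpy as np

        # Hypothetical calibration: activities (U/mL) and clear-zone areas (mm^2).
        activities = np.array([2.4, 12.0, 60.0, 300.0, 1500.0, 7500.0])
        areas = np.array([55.0, 78.0, 101.0, 124.0, 148.0, 171.0])  # assumed readings

        # Fit log10(activity) = a * area + b by ordinary least squares.
        a, b = np.polyfit(areas, np.log10(activities), 1)

        def estimate_activity(area_mm2):
            """Invert the standard curve to estimate activity in U/mL."""
            return 10.0 ** (a * area_mm2 + b)

        print(estimate_activity(110.0))  # activity implied by a 110 mm^2 clear zone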

  14. OZONE: ALTERNATIVE METHOD FOR MITE CONTROL ON SPECK

    Directory of Open Access Journals (Sweden)

    C. Cantoni

    2013-02-01

    Full Text Available This study is aimed at the development of a method for integrated mite control in the industrial production of speck. The investigations were carried out on the premises of five factories in the north-east of Italy. Tyrophagus putrescentiae and T. longior were predominant. Gaseous ozone treatment at a low level (0.4 ppm) was able to kill mites within a period of 15 days to 1 month. The characteristic layer of mould on the product surface reappears within 1 month of the end of the ozone treatment.

  15. Application of an Error Statistics Estimation Method to the PSAS Forecast Error Covariance Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    In atmospheric data assimilation systems, the forecast error covariance model is an important component. However, the parameters required by a forecast error covariance model are difficult to obtain due to the absence of the truth. This study applies an error statistics estimation method to the Physical-space Statistical Analysis System (PSAS) height-wind forecast error covariance model. This method consists of two components: the first component computes the error statistics by using the National Meteorological Center (NMC) method, which is a lagged-forecast difference approach, within the framework of the PSAS height-wind forecast error covariance model; the second obtains a calibration formula to rescale the error standard deviations provided by the NMC method. The calibration is against the error statistics estimated by using a maximum-likelihood estimation (MLE) with rawinsonde height observed-minus-forecast residuals. A complete set of formulas for estimating the error statistics and for the calibration is applied to a one-month-long dataset generated by a general circulation model of the Global Modeling and Assimilation Office (GMAO), NASA. There is a clear constant relationship between the error statistics estimates of the NMC method and of the MLE. The final product provides a full set of 6-hour error statistics required by the PSAS height-wind forecast error covariance model over the globe. The features of these error statistics are examined and discussed.

  16. A Review of the Statistical and Quantitative Methods Used to Study Alcohol-Attributable Crime.

    Directory of Open Access Journals (Sweden)

    Jessica L Fitterer

    Full Text Available Modelling the relationship between alcohol consumption and crime generates new knowledge for crime prevention strategies. Advances in data, particularly data with spatial and temporal attributes, have led to a growing suite of applied methods for modelling. In support of alcohol and crime researchers, we synthesized and critiqued existing methods of spatially and quantitatively modelling the effects of alcohol exposure on crime to aid method selection and identify new opportunities for analysis strategies. We searched the alcohol-crime literature from 1950 to January 2014. Analyses that statistically evaluated or mapped the association between alcohol and crime were included. For modelling purposes, crime data were most often derived from generalized police reports, aggregated to large spatial units such as census tracts or postal codes, and standardized by residential population data. Sixty-eight of the 90 selected studies included geospatial data, of which 48 used cross-sectional datasets. Regression was the prominent modelling choice (n = 78), though many variations existed depending on the data. There are opportunities to improve information for alcohol-attributable crime prevention by using alternative population data to standardize crime rates, sourcing crime information from non-traditional platforms (social media), increasing the number of panel studies, and conducting analysis at the local level (neighbourhood, block, or point). Due to the spatio-temporal advances in crime data, we expect a continued uptake of flexible Bayesian hierarchical modelling, a greater inclusion of spatial-temporal point pattern analysis, and a shift toward prospective (forecast) modelling over small areas (e.g., blocks).

  17. Can We Use Polya’s Method to Improve Students’ Performance in the Statistics Classes?

    Directory of Open Access Journals (Sweden)

    Indika Wickramasinghe

    2015-01-01

    Full Text Available In this study, Polya's problem-solving method was introduced in a statistics class in an effort to enhance students' performance. The method was taught in one of two introductory-level statistics classes taught by the same instructor, and a comparison was made between the performances of the two classes. The results indicate a significant improvement in students' performance in the class in which Polya's method was introduced.

  18. Combating anti-statistical thinking using simulation-based methods throughout the undergraduate curriculum

    OpenAIRE

    Tintle, Nathan; Chance, Beth; Cobb, George; Roy, Soma; Swanson, Todd; VanderStoep, Jill

    2015-01-01

    The use of simulation-based methods for introducing inference is growing in popularity for the Stat 101 course, due in part to increasing evidence of the methods' ability to improve students' statistical thinking. This impact comes from simulation-based methods (a) clearly presenting the overarching logic of inference, (b) strengthening ties between statistics and probability or mathematical concepts, (c) encouraging a focus on the entire research process, (d) facilitating student thinking abo...

  19. An alternative method for smartphone input using AR markers

    Directory of Open Access Journals (Sweden)

    Yuna Kang

    2014-07-01

    Full Text Available As smartphones have recently come into wide use, they have become increasingly popular not only among young people but among middle-aged people as well. Most smartphones adopt capacitive full touch screens, so touch commands are made with the fingers, unlike the PDAs of the past that used touch pens. In this case, a significant portion of the smartphone's screen is blocked by the finger, so it is impossible to see the area of the screen around the touching finger; this causes difficulties in making precise inputs. To solve this problem, this research proposes a method of using simple AR markers to improve the interface of smartphones. A marker is placed in front of the smartphone camera. Then, the camera image of the marker is analyzed to determine the position of the marker as the position of the mouse cursor. This method can enable the click, double-click, and drag-and-drop used on PCs as well as the touch, slide, and long-touch input of smartphones. Through this research, smartphone input can be made more precise and simple, showing the possibility of a new concept of smartphone interface.

  20. Using the Bootstrap Method for a Statistical Significance Test of Differences between Summary Histograms

    Science.gov (United States)

    Xu, Kuan-Man

    2006-01-01

    A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
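    The proposed procedure is easy to prototype. The following Python sketch is a simplified, hypothetical rendition, assuming the Euclidean distance and a pooled resampling scheme; the paper's exact bootstrap design and the cloud-object data are not reproduced here.

        import numpy as np

        rng = np.random.default_rng(0)

        def summary_hist(hists):
            """Sum individual histograms and normalize to unit area."""
            h = np.sum(hists, axis=0).astype(float)
            return h / h.sum()

        def euclidean(h1, h2):
            return np.sqrt(np.sum((h1 - h2) ** 2))

        def bootstrap_pvalue(hists_a, hists_b, n_boot=2000):
            """Significance of the distance between two summary histograms."""
            observed = euclidean(summary_hist(hists_a), summary_hist(hists_b))
            pooled = np.vstack([hists_a, hists_b])   # null: one common population
            n_a, n_b = len(hists_a), len(hists_b)
            exceed = 0
            for _ in range(n_boot):
                a = pooled[rng.integers(0, len(pooled), n_a)]  # resample with replacement
                b = pooled[rng.integers(0, len(pooled), n_b)]
                if euclidean(summary_hist(a), summary_hist(b)) >= observed:
                    exceed += 1
            return exceed / n_boot

        # Toy data: two ensembles of per-object histograms over 10 bins.
        hists_a = rng.poisson(5.0, size=(40, 10))
        hists_b = rng.poisson(5.0, size=(60, 10))
        print("p-value:", bootstrap_pvalue(hists_a, hists_b))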

  1. Text mining describes the use of statistical and epidemiological methods in published medical research.

    Science.gov (United States)

    Meaney, Christopher; Moineddin, Rahim; Voruganti, Teja; O'Brien, Mary Ann; Krueger, Paul; Sullivan, Frank

    2016-06-01

    To describe trends in the use of statistical and epidemiological methods in the medical literature over the past 2 decades, we obtained all 1,028,786 articles from the PubMed Central Open-Access archive (retrieved May 9, 2015). We focused on 113,450 medical research articles. A Delphi panel identified 177 statistical/epidemiological methods pertinent to clinical researchers. We used a text-mining approach to determine whether a specific statistical/epidemiological method was encountered in a given article. We report the proportion of articles using a specific method for the entire cross-sectional sample and also stratified into three blocks of time (1995-2005; 2006-2010; 2011-2015). Numeric descriptive statistics were commonplace (96.4% of articles). Other frequently encountered method groups included statistical inferential concepts (52.9% of articles), epidemiological measures of association (53.5% of articles), methods for diagnostic/classification accuracy (40.1% of articles), hypothesis testing (28.8% of articles), ANOVA (23.2% of articles), and regression (22.6% of articles). We observed relative percent increases in the use of regression (103.0%), missing data methods (217.9%), survival analysis (147.6%), and correlated data analysis (192.2%). This study identified commonly encountered and emergent methods used to investigate medical research problems. Clinical researchers must be aware of the methodological landscape in their field, as statistical/epidemiological methods underpin research claims. Copyright © 2015 Elsevier Inc. All rights reserved.
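    The core text-mining step described above amounts to flagging, per article, whether any synonym of a method appears in the text. A minimal Python sketch follows; the corpus and the method lexicon are tiny invented stand-ins for the 113,450 articles and the Delphi panel's 177-method list.

        import re

        # Hypothetical corpus: (year, full text) pairs.
        articles = [
            (1998, "We fitted a Cox regression model to the survival data."),
            (2008, "Missing data were handled by multiple imputation."),
            (2013, "Logistic regression was used; no imputation was required."),
        ]

        # Tiny stand-in for the panel's method lexicon (regex synonyms per method).
        methods = {
            "regression": [r"\bregression\b"],
            "survival analysis": [r"\bsurvival\b", r"\bkaplan-meier\b", r"\bcox\b"],
            "missing data methods": [r"\bimputation\b", r"\bmissing data\b"],
        }

        def uses_method(text, patterns):
            text = text.lower()
            return any(re.search(p, text) for p in patterns)

        for name, patterns in methods.items():
            n = sum(uses_method(text, patterns) for _, text in articles)
            print(f"{name}: {n / len(articles):.1%} of articles")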

  2. Alternative Method for Solving Traveling Salesman Problem by Evolutionary Algorithm

    Directory of Open Access Journals (Sweden)

    Zuzana Čičková

    2008-06-01

    Full Text Available This article describes the application of the Self Organizing Migrating Algorithm (SOMA) to the well-known optimization problem, the Traveling Salesman Problem (TSP). SOMA is a relatively new optimization method based on Evolutionary Algorithms, which were originally aimed at solving non-linear programming problems containing continuous variables. The TSP has model character in many branches of Operations Research because of its computational complexity; therefore the use of an Evolutionary Algorithm requires special approaches to guarantee the feasibility of solutions. In this article two concrete examples of the TSP, an 8-city set and a 25-city set, are given to demonstrate the practical use of SOMA. Firstly, the penalty approach is applied as a simple way to guarantee the feasibility of solutions. Then, a new approach that works only on feasible solutions is presented.
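    One common way to let a continuous-variable evolutionary algorithm such as SOMA operate only on feasible tours is random-keys decoding, where any real vector maps to a permutation via argsort. The sketch below illustrates this idea with one SOMA-style migration step; it is a hedged approximation, not the article's implementation, and the path-length and step parameters are assumed typical values.

        import numpy as np

        rng = np.random.default_rng(1)

        def decode(keys):
            """Random-keys decoding: any real vector yields a feasible tour."""
            return np.argsort(keys)

        def tour_length(tour, dist):
            return sum(dist[tour[i], tour[(i + 1) % len(tour)]]
                       for i in range(len(tour)))

        # Toy 8-city instance with random coordinates.
        pts = rng.random((8, 2))
        dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)

        # One migration: an individual moves toward the population leader;
        # every intermediate point still decodes to a feasible tour.
        pop = rng.random((20, 8))
        lengths = [tour_length(decode(ind), dist) for ind in pop]
        leader = pop[int(np.argmin(lengths))]
        mover, best = pop[0], pop[0].copy()
        for t in np.linspace(0.1, 2.0, 20):   # assumed path length 2.0, step 0.1
            cand = mover + t * (leader - mover)
            if tour_length(decode(cand), dist) < tour_length(decode(best), dist):
                best = cand
        print(decode(best), tour_length(decode(best), dist))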

  3. An alternative method for the measurement of neutron flux

    Indian Academy of Sciences (India)

    Rupa Sarkar; Prasanna Kumar Mondal; Barun Kumar Chatterjee

    2015-10-01

    A simple and easy method for measuring the neutron flux is presented. This paper deals with the experimental verification of the neutron dose rate-flux relationship for a non-dissipative medium. Though the neutron flux cannot be obtained from the dose rate in a dissipative medium, the experimental results show that for a non-dissipative medium one can obtain the neutron flux from the dose rate. We used a 241AmBe neutron source for neutron irradiation, and the neutron dose rate and count rate were measured using a NM2B neutron monitor and an R-12 superheated droplet detector (SDD), respectively. The neutron flux inferred from the neutron count rate obtained with the R-12 SDD shows excellent agreement with the flux inferred from the neutron dose rate in a non-dissipative medium.

  4. Reliability and applications of statistical methods based on oligonucleotide frequencies in bacterial and archaeal genomes

    DEFF Research Database (Denmark)

    Bohlin, J; Skjerve, E; Ussery, David

    2008-01-01

    BACKGROUND: The increasing number of sequenced prokaryotic genomes contains a wealth of genomic data that needs to be effectively analysed. A set of statistical tools exists for such analysis, but their strengths and weaknesses have not been fully explored. The statistical methods we are concerned with …, or be based on specific statistical distributions. Advantages of these statistical methods include measurements of phylogenetic relationship with relatively small pieces of DNA sampled from almost anywhere within genomes, detection of foreign/conserved DNA, and homology searches. Our aim was to explore … measure was a good measure to detect horizontally transferred regions, and when used to compare the phylogenetic relationships between plasmids and hosts, significant correlation (R2 = 0.4) was found with genomic GC content and intra-chromosomal homogeneity. CONCLUSION: The statistical methods examined …

  5. An Exploratory Study to Develop an Alternative Model of Public Library Management Using the Institute of Museum and Library Services' Public Library Statistics

    Science.gov (United States)

    Kim, Giyeong; Yu, So Young

    2011-01-01

    In this explorative study, we first investigate current use of public library statistics in public library management to identify a governing framework and then carefully suggest an alternative framework with income as a goal for sustainability. The meaning of income in terms of management is also discussed. Within this framework, we conduct a…

  6. ALTERNATING METHOD STUDY ON STRESS ANALYSIS OF SURROUNDING ROCK FOR TWO RANDOM GEOMETRY TUNNELS

    Institute of Scientific and Technical Information of China (English)

    吕爱钟; 张路青

    1997-01-01

    The stress analysis of surrounding rock for two tunnels of random geometry is studied in this paper using Schwarz's alternating method. A simple and effective alternating algorithm is found, in which the surplus surface force is approximated by a Fourier series, so that the iteration can be carried out to the precision required; finally, stress results with high precision are obtained.

  7. Alternative sintering methods compared to conventional thermal sintering for inkjet printed silver nanoparticle ink

    NARCIS (Netherlands)

    Niittynen, J.; Abbel, R.; Mäntysalo, M.; Perelaer, J.; Schubert, U.S.; Lupo, D.

    2014-01-01

    In this contribution several alternative sintering methods are compared to traditional thermal sintering, as the high temperature and long process time of thermal sintering increase the costs of inkjet-printing and prevent the use of this technology in large-scale manufacturing. Alternative sint

  8. Innovative Solutions for Words with Emphasis: Alternative Methods of Braille Transcription

    Science.gov (United States)

    Kamei-Hannan, Cheryl

    2009-01-01

    The author of this study proposed two alternative methods for transcribing words with emphasis into braille and compared the use of the symbols for emphasis with the current braille code. The results showed that students were faster at locating words presented in one of the alternate formats, but that there was no difference in students' accuracy…

  9. A von Neumann Alternating Method for Finding Common Solutions to Variational Inequalities

    CERN Document Server

    Censor, Yair; Reich, Simeon

    2012-01-01

    Modifying von Neumann's alternating projections algorithm, we obtain an alternating method for solving the recently introduced Common Solutions to Variational Inequalities Problem (CSVIP). For simplicity, we mainly confine our attention to the two-set CSVIP, which entails finding common solutions to two unrelated variational inequalities in Hilbert space.
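    The classical two-set alternating projections scheme that the paper modifies is simple to state in code. The Python sketch below, with an arbitrarily chosen hyperplane and ball in R^2, finds a point in the intersection of two convex sets; the CSVIP modification itself (resolvents of the variational inequalities) is beyond this illustration.

        import numpy as np

        def project_hyperplane(x, a, b):
            """Project x onto the hyperplane {y : a.y = b}."""
            return x - (a @ x - b) / (a @ a) * a

        def project_ball(x, center, radius):
            """Project x onto a closed Euclidean ball."""
            d = np.linalg.norm(x - center)
            return x if d <= radius else center + radius * (x - center) / d

        x = np.array([5.0, 5.0])
        a, b = np.array([1.0, 1.0]), 1.0        # hyperplane x + y = 1
        center, radius = np.zeros(2), 1.0       # unit ball

        for _ in range(100):                    # von Neumann alternating projections
            x = project_ball(project_hyperplane(x, a, b), center, radius)

        print(x)    # a point in the intersection (here (0.5, 0.5))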

  10. A Comparative Analysis of Multivariate Statistical Detection Methods Applied to Syndromic Surveillance

    Science.gov (United States)

    2007-06-01

    …the observed system. Our research involved a comparative analysis of two multivariate statistical methods, the multivariate CUSUM (MCUSUM) and the multivariate EWMA (MEWMA), for detecting outbreaks. We found that, similar to results for the univariate CUSUM and EWMA, the directionally-sensitive MCUSUM and MEWMA perform very similarly.
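    For readers unfamiliar with the MEWMA chart referenced above, a compact Python sketch follows. It is a generic textbook-style implementation under assumed parameter values (smoothing constant and control limit), not the report's code or its syndromic-surveillance data.

        import numpy as np

        def mewma(X, lam=0.2, h=10.0):
            """Multivariate EWMA chart; returns indices where T^2 exceeds h.

            X: (n, p) array of mean-centered daily syndrome counts.
            lam, h: assumed smoothing constant and control limit.
            """
            sigma = np.cov(X, rowvar=False)
            sigma_z_inv = np.linalg.inv(lam / (2.0 - lam) * sigma)
            z = np.zeros(X.shape[1])
            alarms = []
            for t, x in enumerate(X):
                z = lam * x + (1.0 - lam) * z
                if z @ sigma_z_inv @ z > h:
                    alarms.append(t)
            return alarms

        # Toy usage: 100 days of 3 syndromic count series, mean-centered.
        rng = np.random.default_rng(6)
        X = rng.poisson(20.0, size=(100, 3)).astype(float)
        X -= X.mean(axis=0)
        print(mewma(X))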

  11. Statistical relevance of vorticity conservation with the Hamiltonian particle-mesh method

    NARCIS (Netherlands)

    Dubinkina, S.; Frank, J.E.

    2009-01-01

    We conduct long simulations with a Hamiltonian particle-mesh method for ideal fluid flow, to determine the statistical mean vorticity field. Lagrangian and Eulerian statistical models are proposed for the discrete dynamics, and these are compared against numerical experiments. The observed results a

  12. Statistical relevance of vorticity conservation with the Hamiltonian particle-mesh method

    NARCIS (Netherlands)

    Dubinkina, S.; Frank, J.E.

    2010-01-01

    We conduct long-time simulations with a Hamiltonian particle-mesh method for ideal fluid flow, to determine the statistical mean vorticity field of the discretization. Lagrangian and Eulerian statistical models are proposed for the discrete dynamics, and these are compared against numerical experime

  13. Statistical relevance of vorticity conservation in the Hamiltonian particle-mesh method

    NARCIS (Netherlands)

    S. Dubinkina; J. Frank

    2010-01-01

    We conduct long-time simulations with a Hamiltonian particle-mesh method for ideal fluid flow, to determine the statistical mean vorticity field of the discretization. Lagrangian and Eulerian statistical models are proposed for the discrete dynamics, and these are compared against numerical experime

  14. Hybrid statistics-simulations based method for atom-counting from ADF STEM images.

    Science.gov (United States)

    De Wael, Annelies; De Backer, Annick; Jones, Lewys; Nellist, Peter D; Van Aert, Sandra

    2017-01-25

    A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials.

  15. NEW POSSIBILITIES OF RAISING RABBIT YOUNG BY ALTERNATIVE NURSING METHODS

    Directory of Open Access Journals (Sweden)

    Tünde Gyarmati

    2000-06-01

    Full Text Available In the first experiment, 360 Pannon White kits from 45 litters were divided into 3 groups. The kits in group SS were suckled once a day during the first 35 days of life (the traditional method of nursing). Group DD was raised by 2 does, and the kits were suckled both in the morning and in the evening until 35 days of age. The kits of the 3rd group (D0) were suckled twice a day for 23 days, after which they were weaned. Rabbits suckled twice a day consumed 89% more milk up to 23 days of age than the SS kits. In the second experiment, the authors used different management systems to investigate the possibilities of "double-suckling". Both nulliparous and multiparous does were treated to induce pseudopregnancy (by means of GnRH). They were used as second does and suckled the kits in the afternoon. Pseudopregnant does produced milk (nulliparous does less than does which had previously produced litters), but under controlled suckling they were not willing to suckle the young. In a third experiment, does (n=44) were inseminated 11 days after kindling. Weaning was performed at the age of 21 days, and the does could then systematically serve as second mothers for litters born on the same day. The "additional" does that nursed the young in the afternoon produced 65% of the milk quantity produced by the natural mother between days 0 and 21.

  16. An improved k-NN method based on multiple-point statistics for classification of high-spatial resolution imagery

    Science.gov (United States)

    Tang, Y.; Jing, L.; Li, H.; Liu, Q.; Ding, H.

    2016-04-01

    In this paper, the potential of multiple-point statistics (MPS) for object-based classification is explored using a modified k-nearest neighbour (k-NN) classification method (MPk-NN). The method first utilises a training image derived from a classified map to characterise the spatial correlation between multiple points of land cover classes, overcoming the limitations of two-point geostatistical methods, and then the spatial information in the form of multiple-point probability is incorporated into the k-NN classifier. The remotely sensed image of an IKONOS subscene of the Beijing urban area was selected to evaluate the method. The image was object-based classified using the MPk-NN method and several alternatives, including the traditional k-NN, the geostatistically weighted k-NN, the Bayesian method, the decision tree classifier (DTC), and the support vector machine classifier (SVM). It was demonstrated that the MPk-NN approach can achieve greater classification accuracy relative to the alternatives, which are 82.05% and 89.12% based on pixel and object testing data, respectively. Thus, the proposed method is appropriate for object-based classification.

  17. Toward improved statistical methods for analyzing Cotinine-Biomarker health association data

    Directory of Open Access Journals (Sweden)

    Clark John D

    2011-10-01

    Full Text Available Abstract Background Serum cotinine, a metabolite of nicotine, is frequently used in research as a biomarker of recent tobacco smoke exposure. Historically, secondhand smoke (SHS) research has used suboptimal statistical methods due to censored serum cotinine values, meaning measurements below the limit of detection (LOD). Methods We compared commonly used methods for analyzing censored serum cotinine data using parametric and non-parametric techniques, employing data from the 1999-2004 National Health and Nutrition Examination Surveys (NHANES). To illustrate the differences in associations obtained by various analytic methods, we compared parameter estimates for the association between cotinine and the inflammatory marker homocysteine using complete case analysis, single and multiple imputation, "reverse" Kaplan-Meier, and logistic regression models. Results Parameter estimates and statistical significance varied according to the statistical method used with censored serum cotinine values. Single imputation of censored values with either 0, LOD or LOD/√2 yielded similar estimates and significance; the multiple imputation method yielded smaller estimates than the other methods, without statistical significance. Multiple regression modelling using the "reverse" Kaplan-Meier method yielded statistically significant estimates that were larger than those from parametric methods. Conclusions Analyses of serum cotinine data with values below the LOD require special attention. "Reverse" Kaplan-Meier was the only method inherently able to deal with censored data with multiple LODs, and may be the most accurate since it avoids the data manipulation needed by other commonly used statistical methods. Additional research is needed to identify optimal statistical methods for analysis of SHS biomarkers subject to a LOD.
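    The single-imputation conventions compared above are trivial to reproduce. The Python sketch below generates hypothetical left-censored cotinine-like data and shows how the choice of fill value shifts the estimated mean; the distribution parameters and LOD are invented, and the "reverse" Kaplan-Meier estimator itself is not implemented here.

        import numpy as np

        rng = np.random.default_rng(2)

        # Hypothetical lognormal cotinine values (ng/mL); readings below the LOD
        # are censored, mimicking the detection-limit problem described above.
        lod = 0.05
        raw = np.exp(rng.normal(-2.0, 1.2, size=500))
        censored = raw < lod

        def imputed_mean(values, censored, fill):
            out = values.copy()
            out[censored] = fill
            return out.mean()

        # The three single-imputation fills compared in the paper.
        for name, fill in [("zero", 0.0), ("LOD", lod),
                           ("LOD/sqrt(2)", lod / np.sqrt(2))]:
            print(f"{name:12s} mean = {imputed_mean(raw, censored, fill):.4f}")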

  18. Analyzing Planck and low redshift data sets with advanced statistical methods

    Science.gov (United States)

    Eifler, Tim

    The recent ESA/NASA Planck mission has provided a key data set to constrain cosmology that is most sensitive to physics of the early Universe, such as inflation and primordial non-Gaussianity (Planck 2015 results XIII). In combination with cosmological probes of the Large-Scale Structure (LSS), the Planck data set is a powerful source of information for investigating late-time phenomena (Planck 2015 results XIV), e.g. the accelerated expansion of the Universe, the impact of baryonic physics on the growth of structure, and the alignment of galaxies in their dark matter halos. It is the main objective of this proposal to re-analyze the archival Planck data, 1) with different, more recently developed statistical methods for cosmological parameter inference, and 2) to combine Planck and ground-based observations in an innovative way. We will make the corresponding analysis framework publicly available and believe that it will set a new standard for future CMB-LSS analyses. Advanced statistical methods, such as the Gibbs sampler (Jewell et al 2004, Wandelt et al 2004), have been critical in the analysis of Planck data. More recently, Approximate Bayesian Computation (ABC; see Weyant et al 2012, Akeret et al 2015, Ishida et al 2015, for cosmological applications) has matured into an interesting tool for cosmological likelihood analyses. It circumvents several assumptions that enter the standard Planck (and most LSS) likelihood analyses, most importantly the assumption that the functional form of the likelihood of the CMB observables is a multivariate Gaussian. Beyond applying new statistical methods to Planck data in order to cross-check and validate existing constraints, we plan to combine Planck and DES data in a new and innovative way and run multi-probe likelihood analyses of CMB and LSS observables. The complexity of multi-probe likelihood analyses scales (non-linearly) with the level of correlations amongst the individual probes that are included. For the multi

  19. Recommended methods for statistical analysis of data containing less-than-detectable measurements

    Energy Technology Data Exchange (ETDEWEB)

    Atwood, C.L.; Blackwood, L.G.; Harris, G.A.; Loehr, C.A.

    1990-09-01

    This report is a manual for statistical workers dealing with environmental measurements, when some of the measurements are not given exactly but are only reported as less than detectable. For some statistical settings with such data, many methods have been proposed in the literature, while for others few or none have been proposed. This report gives a recommended method in each of the settings considered. The body of the report gives a brief description of each recommended method. Appendix A gives example programs using the statistical package SAS, for those methods that involve nonstandard methods. Appendix B presents the methods that were compared and the reasons for selecting each recommended method, and explains any fine points that might be of interest. This is an interim version. Future revisions will complete the recommendations. 34 refs., 2 figs., 11 tabs.

  20. Recommended methods for statistical analysis of data containing less-than-detectable measurements

    Energy Technology Data Exchange (ETDEWEB)

    Atwood, C.L.; Blackwood, L.G.; Harris, G.A.; Loehr, C.A.

    1991-09-01

    This report is a manual for statistical workers dealing with environmental measurements, when some of the measurements are not given exactly but are only reported as less than detectable. For some statistical settings with such data, many methods have been proposed in the literature, while for others few or none have been proposed. This report gives a recommended method in each of the settings considered. The body of the report gives a brief description of each recommended method. Appendix A gives example programs using the statistical package SAS, for those methods that involve nonstandard methods. Appendix B presents the methods that were compared and the reasons for selecting each recommended method, and explains any fine points that might be of interest. 7 refs., 4 figs.

  1. An Efficient Graph-based Method for Long-term Land-use Change Statistics

    Directory of Open Access Journals (Sweden)

    Yipeng Zhang

    2015-12-01

    Full Text Available Statistical analysis of land-use change plays an important role in sustainable land management and has received increasing attention from scholars and administrative departments. However, the statistical process involving spatial overlay analysis remains difficult and needs improvement to deal with massive land-use data. In this paper, we introduce a spatio-temporal flow network model to reveal the hidden relational information among spatio-temporal entities. Based on graph theory, the constant condition of saturated multi-commodity flow is derived. A new method based on a network partition technique for the spatio-temporal flow network is proposed to optimize the transition statistical process. The effectiveness and efficiency of the proposed method are verified through experiments using land-use data from Hunan from 2009 to 2014. In a comparison among three different land-use change statistical methods, the proposed method exhibits remarkable superiority in efficiency.

  2. Description of selected structural elements of composite foams using statistical methods

    Directory of Open Access Journals (Sweden)

    K. Gawdzińska

    2011-04-01

    Full Text Available This article makes use of images from a computer tomograph to describe selected structural elements of metal and composite foams by means of statistical methods. In addition, the compression stress of the tested materials has been determined.

  3. REGULATION AND ROLLING QUALITY CONTROL ON THE BASIS OF STATISTICAL METHODS

    Directory of Open Access Journals (Sweden)

    A. N. Polobovets

    2009-01-01

    Full Text Available It is shown that the introduction of statistical methods of control will make it possible to reduce the effort required for the production and delivery of samples to the mechanical testing laboratory, and to reduce expenses as well.

  4. A pilot course for training-in-context in statistics and research methods

    African Journals Online (AJOL)

    A pilot course for training-in-context in statistics and research methods: Radiation oncology. ... roles of emotional engagement and social networking in facilitating effective ... Participants reported an increased understanding of the principles of ...

  5. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    Science.gov (United States)

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
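    As a concrete illustration of the POD idea, the sketch below fits a logistic POD curve, i.e. the probability of a positive result as a function of (log) concentration, to replicate detection data. The data, the parametrization, and the use of scipy's curve_fit are all assumptions for illustration, not the published procedure itself.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical validation data: spike level and detections out of replicates.
        conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0])
        detected = np.array([1, 4, 9, 12, 12])
        replicates = np.full(5, 12)
        pod_obs = detected / replicates

        def pod_curve(c, mu, sigma):
            """Logistic POD model in log-concentration (one common choice)."""
            return 1.0 / (1.0 + np.exp(-(np.log(c) - mu) / sigma))

        (mu, sigma), _ = curve_fit(pod_curve, conc, pod_obs, p0=(0.0, 1.0))
        print("concentration with POD = 0.5:", np.exp(mu))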

  6. Physics-based statistical model and simulation method of RF propagation in urban environments

    Science.gov (United States)

    Pao, Hsueh-Yuan; Dvorak, Steven L.

    2010-09-14

    A physics-based statistical model and simulation/modeling method and system of electromagnetic wave propagation (wireless communication) in urban environments. In particular, the model is a computationally efficient closed-form parametric model of RF propagation in an urban environment which is extracted from a physics-based statistical wireless channel simulation method and system. The simulation divides the complex urban environment into a network of interconnected urban canyon waveguides which can be analyzed individually; calculates spectral coefficients of modal fields in the waveguides excited by the propagation using a database of statistical impedance boundary conditions which incorporates the complexity of building walls in the propagation model; determines statistical parameters of the calculated modal fields; and determines a parametric propagation model based on the statistical parameters of the calculated modal fields, from which predictions of communications capability may be made.

  7. Comparison of different approaches to evaluation of statistical error of the DSMC method

    Science.gov (United States)

    Plotnikov, M. Yu.; Shkarupa, E. V.

    2012-11-01

    Although the direct simulation Monte Carlo (DSMC) method is widely used for solving steady problems of rarefied gas dynamics, the questions of its statistical error evaluation are far from absolutely clear. Typically, the statistical error in the Monte Carlo method is estimated by the standard deviation determined by the variance of the estimate and the number of its realizations, under the assumption that the sampled realizations are independent. In distinction from the classical Monte Carlo method, the DSMC method uses a time-averaged estimate, and the sampled realizations are dependent. Additional difficulties in the evaluation of the statistical error are caused by the complexity of the estimates used in the DSMC method. In the presented work we compare two approaches to evaluating the statistical error. One of them is based on the results of equilibrium statistical mechanics and the "persistent random walk". The other approach is based on the central limit theorem for Markov processes. Each of these approaches has its own benefits and disadvantages. The first approach does not require additional computations to construct estimates of the statistical error; on the other hand, it allows evaluating the statistical error only in the case when all components of velocity and temperature are equivalent. The second approach is applicable to simulating flows with any degree of nonequilibrium by the DSMC method, and it allows evaluating the statistical errors of the estimates of the velocity and temperature components. The comparison of these approaches was carried out on a number of classic problems with different degrees of nonequilibrium.
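    A common practical device in the same spirit, estimating the error of a time average over correlated samples, is the batch-means method: if batches are long compared to the correlation time, their means are nearly independent and the ordinary central-limit error estimate applies to them. The Python sketch below demonstrates this on a synthetic AR(1) sequence standing in for correlated DSMC cell samples; it illustrates the general idea, not either of the paper's two specific approaches.

        import numpy as np

        def batch_means_stderr(samples, n_batches=32):
            """Standard error of the time average of a correlated sequence."""
            m = len(samples) // n_batches
            batches = np.asarray(samples[:m * n_batches]).reshape(n_batches, m)
            means = batches.mean(axis=1)            # nearly independent if m is large
            return means.std(ddof=1) / np.sqrt(n_batches)

        # Synthetic correlated data: an AR(1) process with correlation ~0.95.
        rng = np.random.default_rng(3)
        x = np.empty(50_000)
        x[0] = 0.0
        for t in range(1, len(x)):
            x[t] = 0.95 * x[t - 1] + rng.normal()
        print("mean =", x.mean(), "+/-", batch_means_stderr(x))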

  8. Developmental neurotoxicity testing: recommendations for developing alternative methods for the screening and prioritization of chemicals.

    Science.gov (United States)

    Crofton, Kevin M; Mundy, William R; Lein, Pamela J; Bal-Price, Anna; Coecke, Sandra; Seiler, Andrea E M; Knaut, Holger; Buzanska, Leonora; Goldberg, Alan

    2011-01-01

    Developmental neurotoxicity testing (DNT) is perceived by many stakeholders to be an area in critical need of alternative methods to current animal testing protocols and guidelines. An immediate goal is to develop test methods that are capable of screening large numbers of chemicals. This document provides recommendations for developing alternative DNT approaches that will generate the type of data required for evaluating and comparing predictive capacity and efficiency across test methods and laboratories. These recommendations were originally drafted to stimulate and focus discussions of alternative testing methods and models for DNT at the TestSmart DNT II meeting (http://caat.jhsph.edu/programs/workshops/dnt2.html) and this document reflects critical feedback from all stakeholders that participated in this meeting. The intent of this document is to serve as a catalyst for engaging the research community in the development of DNT alternatives and it is expected that these recommendations will continue to evolve with the science.

  9. 27 CFR 22.22 - Alternate methods or procedures; and emergency variations from requirements.

    Science.gov (United States)

    2010-04-01

    ... OF TAX-FREE ALCOHOL Administrative Provisions Authorities § 22.22 Alternate methods or procedures..., conditions or limitations set forth in the approval, authority for the variation from requirements...

  10. 27 CFR 20.22 - Alternate methods or procedures; and emergency variations from requirements.

    Science.gov (United States)

    2010-04-01

    ... OF DENATURED ALCOHOL AND RUM Administrative Provisions Authorities § 20.22 Alternate methods or... forth in the approval, authority for the variation from requirements is automatically terminated and...

  11. Notification: Notification Memo for Evaluation of Management Controls for Alternative Asbestos Control Method Experiments

    Science.gov (United States)

    Project #OPE-FY12-0011, February 27, 2012. This memorandum is to notify you that the Office of Inspector General (OIG) is initiating an evaluation on the Alternative Asbestos Control Method (AACM) experiments.

  12. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures

    OpenAIRE

    Rock, Adam J.; Coventry, William L.; Morgan, Methuen I.; Loi, Natasha M.

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLe...

  13. Statistical methods of discrimination and classification advances in theory and applications

    CERN Document Server

    Choi, Sung C

    1986-01-01

    Statistical Methods of Discrimination and Classification: Advances in Theory and Applications is a collection of papers that tackles the multivariate problems of discriminating and classifying subjects into exclusive population. The book presents 13 papers that cover that advancement in the statistical procedure of discriminating and classifying. The studies in the text primarily focus on various methods of discriminating and classifying variables, such as multiple discriminant analysis in the presence of mixed continuous and categorical data; choice of the smoothing parameter and efficiency o

  14. Earnings Management of Firms Reporting Long Term Debt: An Alternative Method

    Directory of Open Access Journals (Sweden)

    Yulius Jogi Christiawan

    2014-01-01

    Full Text Available This study aims to apply an alternative detection model to prove that earnings management occurs when a company has long-term debts as well as pressure on operating income. Generally, the literature on earnings management indicates that its detection can be grouped under two objectives, 1] finding variables for detecting earnings management (accruals, real activity and classification shifting) and 2] using advanced statistical or mathematical models to detect earnings management. This study applies a quantitative approach using secondary data from financial statements. The study was conducted on the 50 companies with the largest market capitalization, the 50 most active companies by trading volume, the 50 most active companies by trading value, and the 50 most active companies by trading frequency, for a total of 200 public companies listed on the Indonesia Stock Exchange (IDX), based on the IDX statistical report 2013. The results of this study are expected to provide a new method to detect earnings management and its application in the context of positive accounting theory (PAT). The results of the study prove that the model is able to detect earnings management by utilizing foreign exchange transaction losses, and they support PAT (particularly the debt covenant hypothesis). These results show that earnings management can be carried out using foreign exchange gains/losses. A limitation of this study, however, is that the model cannot capture earnings management if a company reports neither long-term debt nor a foreign exchange gain/loss.

  15. Statistical methods to estimate treatment effects from multichannel electroencephalography (EEG) data in clinical trials.

    Science.gov (United States)

    Ma, Junshui; Wang, Shubing; Raubertas, Richard; Svetnik, Vladimir

    2010-07-15

    With the increasing popularity of using electroencephalography (EEG) to reveal the treatment effect in drug development clinical trials, the vast volume and complex nature of EEG data compose an intriguing, but challenging, topic. In this paper the statistical analysis methods recommended by the EEG community, along with methods frequently used in the published literature, are first reviewed. A straightforward adjustment of the existing methods to handle multichannel EEG data is then introduced. In addition, based on the spatial smoothness property of EEG data, a new category of statistical methods is proposed. The new methods use a linear combination of low-degree spherical harmonic (SPHARM) basis functions to represent a spatially smoothed version of the EEG data on the scalp, which is close to a sphere in shape. In total, seven statistical methods, including both the existing and the newly proposed methods, are applied to two clinical datasets to compare their power to detect a drug effect. Contrary to the EEG community's recommendation, our results suggest that (1) the nonparametric method does not outperform its parametric counterpart; and (2) including baseline data in the analysis does not always improve the statistical power. In addition, our results recommend that (3) simple paired statistical tests should be avoided due to their poor power; and (4) the proposed spatially smoothed methods perform better than their unsmoothed versions.
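    The spherical-harmonic smoothing step proposed above can be sketched as an ordinary least-squares fit of a low-degree real spherical-harmonic basis to per-channel values. In the Python sketch below, the electrode angles and channel values are randomly generated placeholders, and the degree-4 basis is an assumed choice; it shows the mechanics, not the paper's exact pipeline.

        import numpy as np
        from scipy.special import sph_harm

        def real_sh_basis(theta, phi, degree=4):
            """Design matrix of real spherical harmonics up to a given degree.

            theta: azimuth in [0, 2*pi); phi: polar angle in [0, pi].
            """
            cols = []
            for n in range(degree + 1):
                for m in range(-n, n + 1):
                    y = sph_harm(abs(m), n, theta, phi)
                    if m < 0:
                        cols.append(np.sqrt(2) * (-1) ** m * y.imag)
                    elif m == 0:
                        cols.append(y.real)
                    else:
                        cols.append(np.sqrt(2) * (-1) ** m * y.real)
            return np.column_stack(cols)

        # Hypothetical electrode directions (upper scalp) and channel values.
        rng = np.random.default_rng(4)
        theta = rng.uniform(0.0, 2.0 * np.pi, 32)
        phi = rng.uniform(0.0, np.pi / 2.0, 32)
        values = rng.normal(size=32)              # e.g. band power per channel

        B = real_sh_basis(theta, phi, degree=4)   # 25 basis functions for degree 4
        coef, *_ = np.linalg.lstsq(B, values, rcond=None)
        smoothed = B @ coef                       # spatially smoothed channel values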

  16. Mathematical and Statistical Models and Methods for Describing the Thermal Characteristics of Buildings

    DEFF Research Database (Denmark)

    Madsen, Henrik; Bacher, Peder; Andersen, Philip Hvidthøft Delff

    2010-01-01

    This paper describes a number of statistical methods and models for describing the thermal characteristics of buildings using frequent readings of heat consumption, ambient air temperature, and other available climate variables. For some of the methods frequent readings of the indoor air … the existence of prior physical knowledge, the data and the available statistical software tools. The importance of statistical model validation is discussed, and some simple tools for that purpose are demonstrated. This paper also briefly describes some of the most frequently used software tools for modelling …

  17. Developing Econometrics Statistical Theories and Methods with Applications to Economics and Business

    CERN Document Server

    Tong, Hengqing; Huang, Yangxin

    2011-01-01

    Developing Econometrics: Statistical Theories and Methods with Applications to Economics and Business highlights recent advances in statistical theory and methods that benefit econometric practice. It deals with exploratory data analysis, a prerequisite to statistical modelling and part of data mining. It provides recently developed computational tools useful for data mining, analyses the reasons to do data mining, and discusses the best techniques to use in a given situation, including a detailed description of computer algorithms.

  18. Allergic Contact Dermatitis to Ophthalmic Medications: Relevant Allergens and Alternative Testing Methods.

    Science.gov (United States)

    Grey, Katherine R; Warshaw, Erin M

    Allergic contact dermatitis is an important cause of periorbital dermatitis. Topical ophthalmic agents are relevant sensitizers. Contact dermatitis to ophthalmic medications can be challenging to diagnose and manage given the numerous possible offending agents, including both active and inactive ingredients. Furthermore, a substantial body of literature reports false-negative patch test results to ophthalmic agents. Subsequently, numerous alternative testing methods have been described. This review outlines the periorbital manifestations, causative agents, and alternative testing methods of allergic contact dermatitis to ophthalmic medications.

  19. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    Science.gov (United States)

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes development and validation studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single-collaborator and multicollaborator study examples are given.
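
    The basic observed statistic is easy to compute: POI is the proportion of replicates identified at each level, with a binomial confidence interval around it. A minimal sketch, with made-up replicate data and a Wilson interval as one reasonable choice:

```python
from statsmodels.stats.proportion import proportion_confint

# Hypothetical validation data: replicate identification results (1/0)
# at several target-material proportions; the levels and outcomes are
# illustrative only.
results = {
    0.0: [0, 0, 1, 0, 0, 0, 0, 0, 0, 0],   # pure nontarget material
    0.5: [0, 1, 1, 0, 1, 1, 0, 1, 1, 0],   # 50% target material
    1.0: [1, 1, 1, 1, 1, 1, 1, 0, 1, 1],   # pure target material
}

for level, reps in results.items():
    k, n = sum(reps), len(reps)
    poi = k / n  # probability of identification: proportion identified
    lo, hi = proportion_confint(k, n, alpha=0.05, method="wilson")
    print(f"level={level:.1f}  POI={poi:.2f}  95% CI=({lo:.2f}, {hi:.2f})")
```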

  20. A robust statistical method for association-based eQTL analysis.

    Directory of Open Access Journals (Sweden)

    Ning Jiang

    BACKGROUND: It has been well established that the theoretical kernel of the recently surging genome-wide association study (GWAS) is statistical inference of linkage disequilibrium (LD) between a tested genetic marker and a putative locus affecting a disease trait. However, LD analysis is vulnerable to several confounding factors, of which population stratification is the most prominent. While many methods have been proposed to correct for its influence, either by predicting the structure parameters or by correcting the inflation in the test statistic due to stratification, these may not be feasible or may impose further statistical problems in practical implementation. METHODOLOGY: We propose a novel statistical method to control spurious LD in GWAS arising from population structure by incorporating a control marker into the test for significance of genetic association between a polymorphic marker and phenotypic variation of a complex trait. The method avoids the need for structure prediction, which may be infeasible or inadequate in practice, and properly accounts for a varying effect of population stratification on different regions of the genome under study. The utility and statistical properties of the new method were tested through an intensive computer simulation study and an association-based genome-wide mapping of expression quantitative trait loci in genetically divergent human populations. RESULTS/CONCLUSIONS: The analyses show that the new method confers improved statistical power for detecting genuine genetic associations in subpopulations and effective control of spurious associations stemming from population structure, compared with two other popular methods in the GWAS literature.
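
    A minimal sketch of the general idea, assuming a simple additive genotype coding: include the genotype of a control marker as a covariate when regressing the phenotype on the tested marker. The data, variable names, and single-covariate regression form are placeholders for illustration; the authors' actual test statistic is more involved.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
pop = rng.integers(0, 2, n)                       # hidden subpopulation label
# Allele frequencies differ between subpopulations for both markers.
test = rng.binomial(2, np.where(pop == 1, 0.7, 0.3))     # tested marker (0/1/2)
control = rng.binomial(2, np.where(pop == 1, 0.8, 0.2))  # control marker (0/1/2)
y = 1.0 * pop + rng.standard_normal(n)            # phenotype shifted by structure

# Naive test: phenotype ~ tested marker only (prone to spurious LD).
naive = sm.OLS(y, sm.add_constant(test)).fit()
# Control-marker-adjusted test: phenotype ~ tested marker + control marker.
X = sm.add_constant(np.column_stack([test, control]))
adjusted = sm.OLS(y, X).fit()

print("naive p-value:   ", naive.pvalues[1])      # inflated by structure
print("adjusted p-value:", adjusted.pvalues[1])   # spurious signal shrinks
```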

  1. Statistical methods for the detection of answer copying on achievement tests

    NARCIS (Netherlands)

    Sotaridona, Leonardo Sitchirita

    2003-01-01

    This thesis contains a collection of studies in which statistical methods for the detection of answer copying on achievement tests in multiple-choice format are proposed and investigated. Although all methods are suited to detecting answer copying, each method is designed to address specific characteristics…

  2. An Investigation of the Variety and Complexity of Statistical Methods Used in Current Internal Medicine Literature.

    Science.gov (United States)

    Narayanan, Roshni; Nugent, Rebecca; Nugent, Kenneth

    2015-10-01

    Accreditation Council for Graduate Medical Education guidelines require internal medicine residents to develop skills in the interpretation of medical literature and to understand the principles of research. A necessary component is the ability to understand the statistical methods used and their results, material that is not an in-depth focus of most medical school curricula and residency programs. Given the breadth and depth of the current medical literature and an increasing emphasis on complex, sophisticated statistical analyses, the statistical foundation and education necessary for residents are uncertain. We reviewed the statistical methods and terms used in 49 articles discussed at the journal club in the Department of Internal Medicine residency program at Texas Tech University between January 1, 2013 and June 30, 2013. We collected information on the study type and on the statistical methods used for summarizing and comparing samples, determining the relations between independent variables and dependent variables, and estimating models. We then identified the typical statistics education level at which each term or method is learned. A total of 14 articles came from the Journal of the American Medical Association Internal Medicine, 11 from the New England Journal of Medicine, 6 from the Annals of Internal Medicine, 5 from the Journal of the American Medical Association, and 13 from other journals. Twenty reported randomized controlled trials. Summary statistics included mean values (39 articles), category counts (38), and medians (28). Group comparisons were based on t tests (14 articles), χ2 tests (21), and nonparametric ranking tests (10). The relations between dependent and independent variables were analyzed with simple regression (6 articles), multivariate regression (11), and logistic regression (8). Nine studies reported odds ratios with 95% confidence intervals, and seven analyzed test performance using sensitivity and specificity calculations

  3. Development and Evaluation of a Hybrid Dynamical-Statistical Downscaling Method

    Science.gov (United States)

    Walton, Daniel Burton

    Regional climate change studies usually rely on downscaling of global climate model (GCM) output in order to resolve important fine-scale features and processes that govern local climate. Previous efforts have used one of two techniques: (1) dynamical downscaling, in which a regional climate model is forced at the boundaries by GCM output, or (2) statistical downscaling, which employs historical empirical relationships to go from coarse to fine resolution. Studies using these methods have been criticized because they either dynamically downscaled only a few GCMs, or used statistical downscaling on an ensemble of GCMs but missed important dynamical effects in the climate change signal. This study describes the development and evaluation of a hybrid dynamical-statistical downscaling method that utilizes aspects of both dynamical and statistical downscaling to address these concerns. The first step of the hybrid method is to use dynamical downscaling to understand the most important physical processes that contribute to the climate change signal in the region of interest. Then a statistical model is built based on the patterns and relationships identified from dynamical downscaling. This statistical model can be used to downscale an entire ensemble of GCMs quickly and efficiently. The hybrid method is first applied to a domain covering the Los Angeles region to generate projections of temperature change between the 2041-2060 and 1981-2000 periods for 32 CMIP5 GCMs. The hybrid method is also applied to a larger region covering all of California and the adjacent ocean. The hybrid method works well in both areas, primarily because a single feature, the land-sea contrast in the warming, controls the overwhelming majority of the spatial detail. Finally, the dynamically downscaled temperature change patterns are compared to those produced by two commonly used statistical methods, BCSD and BCCA. Results show that dynamical downscaling recovers important spatial features that the…
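
    The hybrid step of building a statistical model on dynamically downscaled patterns can be caricatured as pattern scaling: regress the fine-scale warming at each grid cell on a regional-mean GCM warming predictor. The sketch below is a minimal illustration with synthetic arrays; the predictor choice and linear form are assumptions, not the study's actual model.

```python
import numpy as np

rng = np.random.default_rng(2)
n_gcms, ny, nx = 8, 20, 30   # hypothetical: 8 dynamically downscaled GCMs

# Fine-scale warming patterns from dynamical downscaling (degC), one per GCM.
fine = rng.normal(2.0, 0.5, (n_gcms, ny, nx))
# Coarse predictor: each GCM's regional-mean warming.
coarse = fine.mean(axis=(1, 2)) + rng.normal(0, 0.1, n_gcms)

# Fit fine = a(x, y) * coarse + b(x, y) by least squares at every grid cell.
A = np.column_stack([coarse, np.ones(n_gcms)])            # (n_gcms, 2)
coef, *_ = np.linalg.lstsq(A, fine.reshape(n_gcms, -1), rcond=None)
a, b = coef[0].reshape(ny, nx), coef[1].reshape(ny, nx)

# Downscale a new GCM (not dynamically downscaled) from its mean warming alone.
new_gcm_mean_warming = 2.3
fine_estimate = a * new_gcm_mean_warming + b
```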

  4. The Continuized Log-Linear Method: An Alternative to the Kernel Method of Continuization in Test Equating

    Science.gov (United States)

    Wang, Tianyou

    2008-01-01

    Von Davier, Holland, and Thayer (2004) laid out a five-step framework of test equating that can be applied to various data collection designs and equating methods. In the continuization step, they presented an adjusted Gaussian kernel method that preserves the first two moments. This article proposes an alternative continuization method that…

  5. Performance comparison of three predictor selection methods for statistical downscaling of daily precipitation

    Science.gov (United States)

    Yang, Chunli; Wang, Ninglian; Wang, Shijin; Zhou, Liang

    2016-10-01

    Predictor selection is a critical factor affecting the statistical downscaling of daily precipitation. This study provides a general comparison between uncertainties in downscaled results from three commonly used predictor selection methods (correlation analysis, partial correlation analysis, and stepwise regression analysis). Uncertainty is analyzed by comparing statistical indices, including the mean, variance, and the distribution of monthly mean daily precipitation, wet spell length, and the number of wet days. The downscaled results are produced by the artificial neural network (ANN) statistical downscaling model and 50 years (1961-2010) of observed daily precipitation together with reanalysis predictors. Although results show little difference between downscaling methods, stepwise regression analysis is generally the best method for selecting predictors for the ANN statistical downscaling model of daily precipitation, followed by partial correlation analysis and then correlation analysis.
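
    As a flavor of the comparison, the sketch below contrasts simple correlation ranking with a greedy forward (stepwise) selection of predictors, using synthetic data; the variable names, selection threshold, and stopping rule are illustrative, not those of the study.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n, p = 365, 10
X = rng.standard_normal((n, p))               # candidate reanalysis predictors
y = 2.0 * X[:, 0] + 1.0 * X[:, 3] + rng.standard_normal(n)

# Method 1: correlation ranking - keep the k most correlated predictors.
corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(p)])
top_by_corr = np.argsort(corr)[::-1][:3]

# Method 2: greedy forward (stepwise) selection by p-value.
selected, remaining = [], list(range(p))
while remaining:
    pvals = {}
    for j in remaining:
        fit = sm.OLS(y, sm.add_constant(X[:, selected + [j]])).fit()
        pvals[j] = fit.pvalues[-1]            # p-value of the newly added predictor
    best = min(pvals, key=pvals.get)
    if pvals[best] > 0.05:                    # stop when no candidate is significant
        break
    selected.append(best)
    remaining.remove(best)

print("correlation picks:", sorted(top_by_corr))
print("stepwise picks:   ", sorted(selected))
```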

  6. Simulation evaluation of statistical properties of methods for indirect and mixed treatment comparisons

    Directory of Open Access Journals (Sweden)

    Song Fujian

    2012-09-01

    BACKGROUND: Indirect treatment comparison (ITC) and mixed treatment comparison (MTC) have been increasingly used in network meta-analyses. This simulation study comprehensively investigated the statistical properties and performance of commonly used ITC and MTC methods, including simple ITC (the Bucher method) and frequentist and Bayesian MTC methods. METHODS: A simple network of three sets of two-arm trials with a closed loop was simulated. Simulation scenarios varied the number of trials, the assumed treatment effects, and the extent of heterogeneity, bias, and inconsistency. The performance of the ITC and MTC methods was measured by the type I error, statistical power, observed bias, and mean squared error (MSE). RESULTS: When there are no biases in primary studies, all ITC and MTC methods investigated are on average unbiased. Depending on the extent and direction of biases in different sets of studies, ITC and MTC methods may be more or less biased than direct treatment comparison (DTC). Of the methods investigated, simple ITC has the largest MSE. DTC is superior to ITC in terms of statistical power and MSE. Under the simulated circumstances in which there are no systematic biases and inconsistencies, the MTC methods generally perform better than the corresponding DTC methods. For inconsistency detection in network meta-analysis, the methods evaluated are on average unbiased, but the statistical power of commonly used methods for detecting inconsistency is very low. CONCLUSIONS: The available methods for indirect and mixed treatment comparisons have different advantages and limitations, depending on whether the data analysed satisfy the underlying assumptions. To choose the most valid statistical methods for research synthesis, an appropriate assessment of the primary studies included in the evidence network is required.
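
    The simple ITC (Bucher) method mentioned above combines two direct estimates that share a common comparator: the indirect effect of A versus C via B is d_AC = d_AB - d_CB, with variance equal to the sum of the two variances. A minimal sketch; the log odds ratios and standard errors are made up for illustration:

```python
import math
from scipy import stats

def bucher_itc(d_ab, se_ab, d_cb, se_cb, alpha=0.05):
    """Indirect comparison of A vs C through common comparator B.

    d_ab, d_cb: direct effect estimates (e.g., log odds ratios) of A vs B
    and C vs B; se_ab, se_cb: their standard errors."""
    d_ac = d_ab - d_cb
    se_ac = math.sqrt(se_ab**2 + se_cb**2)   # variances of independent estimates add
    z = stats.norm.ppf(1 - alpha / 2)
    return d_ac, (d_ac - z * se_ac, d_ac + z * se_ac)

# Hypothetical log odds ratios from two pairwise meta-analyses.
d_ac, ci = bucher_itc(d_ab=-0.40, se_ab=0.15, d_cb=-0.10, se_cb=0.12)
print(f"indirect logOR A vs C = {d_ac:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```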

  7. 8th International Conference on Soft Methods in Probability and Statistics

    CERN Document Server

    Giordani, Paolo; Vantaggi, Barbara; Gagolewski, Marek; Gil, María; Grzegorzewski, Przemysław; Hryniewicz, Olgierd

    2017-01-01

    This proceedings volume is a collection of peer-reviewed papers presented at the 8th International Conference on Soft Methods in Probability and Statistics (SMPS 2016), held in Rome, Italy. The book is dedicated to data science, which aims at developing automated methods to analyze massive amounts of data and to extract knowledge from them. It shows how data science employs various programming techniques and methods of data wrangling, data visualization, machine learning, probability, and statistics. The soft methods proposed in this volume represent a collection of tools in these fields that can also be useful for data science.

  8. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures.

    Science.gov (United States)

    Rock, Adam J; Coventry, William L; Morgan, Methuen I; Loi, Natasha M

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology.

  9. A local and global statistics pattern analysis method and its application to process fault identification

    Institute of Scientific and Technical Information of China (English)

    Hanyuan Zhang; Xuemin Tian; Xiaogang Deng; Lianfang Cai

    2015-01-01

    Traditional principal component analysis (PCA) is a second-order method and lacks the ability to provide higher-order representations for data variables. Recently, a statistics pattern analysis (SPA) framework has been incorporated into the PCA model to make full use of various statistics of data variables effectively. However, these methods omit local information, which is also important for process monitoring and fault diagnosis. In this paper, a local and global statistics pattern analysis (LGSPA) method, which integrates the SPA framework and locality preserving projections within PCA, is proposed to utilize various statistics and preserve both local and global information in the observed data. For the purpose of fault detection, two monitoring indices are constructed based on the LGSPA model. In order to identify fault variables, an improved reconstruction based contribution (IRBC) plot based on the LGSPA model is proposed. The RBC of various statistics of the original process variables to the monitoring indices is calculated with the proposed RBC method. Based on the calculated RBC of the process variables' statistics, a new contribution of process variables is built to locate fault variables. The simulation results on a simple six-variable system and a continuous stirred tank reactor system demonstrate that the proposed fault diagnosis method can effectively detect faults and distinguish fault variables from normal variables.
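
    For orientation, the conventional PCA baseline that LGSPA extends monitors two indices: Hotelling's T^2 in the principal subspace and the squared prediction error (SPE) in the residual subspace. A minimal sketch of that baseline (not the LGSPA method itself), with synthetic data:

```python
import numpy as np

rng = np.random.default_rng(4)
train = rng.standard_normal((500, 6))            # normal operating data, 6 variables

# Standardize and fit PCA via SVD.
mu, sd = train.mean(0), train.std(0)
Z = (train - mu) / sd
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
k = 3                                            # retained principal components
P = Vt[:k].T                                     # loadings (6 x k)
lam = (S[:k] ** 2) / (len(Z) - 1)                # component variances

def t2_spe(x):
    """Hotelling's T^2 and SPE monitoring indices for one sample."""
    z = (x - mu) / sd
    t = P.T @ z                                  # scores in the principal subspace
    t2 = np.sum(t**2 / lam)
    residual = z - P @ t
    return t2, residual @ residual

# A faulty sample: variable 2 drifts far from its normal range.
fault = np.zeros(6); fault[2] = 5.0
print("normal:", t2_spe(np.zeros(6)))
print("fault: ", t2_spe(fault))
```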

  10. Optimal Alternative to the Akima's Method of Smooth Interpolation Applied in Diabetology

    Directory of Open Access Journals (Sweden)

    Emanuel Paul

    2006-12-01

    A new method of cubic piecewise smooth interpolation is presented, applied to experimental data obtained from glycemic profiles of diabetics. The method is used to create software useful in clinical diabetology. It provides an alternative to Akima's procedure for computing the derivatives at the knots [Akima, J. Assoc. Comput. Mach., 1970] and has an optimality property.
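
    For context, the Akima baseline that the paper modifies is available in SciPy; the sketch below interpolates a made-up glycemic profile with it (the times and glucose values are invented for illustration):

```python
import numpy as np
from scipy.interpolate import Akima1DInterpolator

# Hypothetical glycemic profile: hours since midnight vs glucose (mg/dL).
hours = np.array([7.0, 9.0, 11.0, 13.0, 15.0, 18.0, 21.0])
glucose = np.array([95.0, 160.0, 120.0, 150.0, 110.0, 140.0, 100.0])

akima = Akima1DInterpolator(hours, glucose)

# Evaluate the smooth curve on a fine grid, e.g. for plotting.
t = np.linspace(hours[0], hours[-1], 200)
curve = akima(t)
print(curve[:5])
```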

  11. The intermediates take it all: asymptotics of higher criticism statistics and a powerful alternative based on equal local levels.

    Science.gov (United States)

    Gontscharuk, Veronika; Landwehr, Sandra; Finner, Helmut

    2015-01-01

    The higher criticism (HC) statistic, which can be seen as a normalized version of the famous Kolmogorov-Smirnov statistic, has a long history, dating back to the mid-1970s. Originally, HC statistics were used in connection with goodness-of-fit (GOF) tests, but they recently gained attention in the context of testing the global null hypothesis in high-dimensional data. The continuing interest in HC seems to be inspired by a series of nice asymptotic properties related to this statistic. For example, unlike Kolmogorov-Smirnov tests, GOF tests based on the HC statistic are known to be asymptotically sensitive in the moderate tails, hence it is favorably applied for detecting the presence of signals in sparse mixture models. However, some questions about the asymptotic behavior of the HC statistic are still open. We focus on two of them, namely, why a specific intermediate range is crucial for GOF tests based on the HC statistic and why the convergence of the HC distribution to the limiting one is extremely slow. Moreover, the inconsistency between the asymptotic and finite-sample behavior of the HC statistic prompts us to provide a new HC test that has better finite-sample properties than the original HC test while showing the same asymptotics. This test is motivated by the asymptotic behavior of the so-called local levels related to the original HC test. By means of numerical calculations and simulations we show that the new HC test is typically more powerful than the original HC test in normal mixture models.
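
    The HC statistic itself is easy to compute from sorted p-values: a standardized comparison of the empirical distribution to the uniform, maximized over an intermediate range. A minimal sketch of the commonly used Donoho-Jin form; restricting the maximum to the smallest half of the p-values is one common convention, assumed here:

```python
import numpy as np

def higher_criticism(pvals, alpha0=0.5):
    """Donoho-Jin higher criticism statistic from a vector of p-values."""
    p = np.sort(np.asarray(pvals))
    n = len(p)
    i = np.arange(1, n + 1)
    hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1 - p))
    # Maximize over the intermediate range (the smallest alpha0*n p-values).
    k = max(1, int(alpha0 * n))
    return np.max(hc[:k])

rng = np.random.default_rng(5)
null = rng.uniform(size=1000)                   # global null: uniform p-values
sparse = null.copy()
sparse[:20] = rng.uniform(0, 0.001, 20)         # a few strong signals
print("HC under null:  ", higher_criticism(null))
print("HC with signals:", higher_criticism(sparse))
```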

  12. Statistics-based reconstruction method with high random-error tolerance for integral imaging.

    Science.gov (United States)

    Zhang, Juan; Zhou, Liqiu; Jiao, Xiaoxue; Zhang, Lei; Song, Lipei; Zhang, Bo; Zheng, Yi; Zhang, Zan; Zhao, Xing

    2015-10-01

    A three-dimensional (3D) digital reconstruction method for integral imaging with high random-error tolerance, based on statistics, is proposed. By statistically analyzing the points reconstructed by triangulation from all corresponding image points in an elemental image array, 3D reconstruction with high random-error tolerance can be realized. To simulate the impact of random errors, random offsets at different error levels were added to different numbers of elemental images in simulation and optical experiments. The results of the simulation and optical experiments showed that the proposed statistics-based reconstruction method has more stable and better reconstruction accuracy than the conventional reconstruction method, verifying that it can effectively reduce the impact of random errors on the 3D reconstruction of integral imaging. The method is simple and very helpful to the development of integral imaging technology.
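
    The core statistical idea, aggregating the cloud of candidate 3D points triangulated from all corresponding image points so that a few badly offset elemental images do not corrupt the estimate, can be sketched with a simple robust aggregation. The median-based estimator below is an assumption for illustration, not necessarily the authors' statistic:

```python
import numpy as np

rng = np.random.default_rng(6)
true_point = np.array([1.0, 2.0, 50.0])

# Candidate 3D points triangulated from 40 elemental-image correspondences.
candidates = true_point + rng.normal(0, 0.05, (40, 3))
# Random errors: 8 elemental images carry large offsets.
candidates[:8] += rng.normal(0, 5.0, (8, 3))

mean_est = candidates.mean(axis=0)            # pulled away by the bad images
median_est = np.median(candidates, axis=0)    # robust to the outliers

print("mean estimate:  ", mean_est)
print("median estimate:", median_est)
```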

  13. Experimental and statistical approaches in method cross-validation to support pharmacokinetic decisions.

    Science.gov (United States)

    Thway, Theingi M; Ma, Mark; Lee, Jean; Sloey, Bethlyn; Yu, Steven; Wang, Yow-Ming C; Desilva, Binodh; Graves, Tom

    2009-04-05

    A case study of experimental and statistical approaches for cross-validating and examining the equivalence of two ligand-binding assay (LBA) methods employed in pharmacokinetic (PK) studies is presented. The impact of changes in methodology, based on the intended use of the methods, was assessed. The cross-validation process included an experimental plan, sample size selection, and statistical analysis with a predefined criterion of method equivalence. The two methods were deemed equivalent if the 90% confidence interval for the ratio of mean concentrations fell within the equivalence interval (0.80-1.25). Statistical consideration of method imprecision was used to choose the number of incurred samples (collected from study animals) and conformance samples (spiked controls) for the equivalence tests. The difference of log-transformed mean concentrations and the 90% confidence interval for the two methods were computed using analysis of variance. The mean concentration ratios of the two methods for the incurred and spiked conformance samples were 1.63 and 1.57, respectively. The 90% confidence limits were 1.55-1.72 for the incurred samples and 1.54-1.60 for the spiked conformance samples; therefore, the 90% confidence interval was not contained within the (0.80-1.25) equivalence interval. When the PK parameters of two studies using each of these two methods were compared, we determined that the therapeutic exposure, AUC(0-168) and Cmax, from Study A/Method 1 was approximately twice that of Study B/Method 2. We concluded that the two methods were not statistically equivalent and that the magnitude of the difference was reflected in the PK parameters of the studies using each method. This paper demonstrates the need for method cross-validation whenever there is a switch in bioanalytical methods, and illustrates statistical approaches for designing cross-validation experiments, assessing results, and interpreting the impact on PK data.
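
    The equivalence criterion can be illustrated with a log-scale confidence interval for the ratio of method means: compute the 90% CI for the difference of log concentrations and exponentiate. The sketch below uses a paired design on synthetic data; the pairing, sample size, and numbers are assumptions for illustration, not the study's ANOVA:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
true_ratio = 1.6                                      # method 1 reads ~60% higher
conc2 = rng.lognormal(mean=3.0, sigma=0.3, size=30)   # method 2 concentrations
conc1 = conc2 * true_ratio * rng.lognormal(0, 0.05, 30)

# Paired analysis on the log scale: difference of logs = log of the ratio.
d = np.log(conc1) - np.log(conc2)
mean_d, se_d = d.mean(), d.std(ddof=1) / np.sqrt(len(d))
t90 = stats.t.ppf(0.95, df=len(d) - 1)                # two-sided 90% CI
ci = np.exp([mean_d - t90 * se_d, mean_d + t90 * se_d])

equivalent = 0.80 <= ci[0] and ci[1] <= 1.25
print(f"ratio CI = ({ci[0]:.2f}, {ci[1]:.2f}); equivalent: {equivalent}")
```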

  14. Alternative sintering methods compared to conventional thermal sintering for inkjet printed silver nanoparticle ink

    Energy Technology Data Exchange (ETDEWEB)

    Niittynen, Juha, E-mail: juha.niittynen@tut.fi [Department of Electronics and Communications Engineering, Tampere University of Technology, Korkeakoulunkatu 3, 33720 Tampere (Finland); Abbel, Robert [Holst Centre, High Tech Campus 31, 5656 AE Eindhoven (Netherlands); Mäntysalo, Matti [Department of Electronics and Communications Engineering, Tampere University of Technology, Korkeakoulunkatu 3, 33720 Tampere (Finland); Perelaer, Jolke; Schubert, Ulrich S. [Laboratory of Organic and Macromolecular Chemistry (IOMC), Friedrich-Schiller-University Jena, Humboldtstrasse 10, D-07743 Jena (Germany); Jena Center for Soft Matter (JCSM), Friedrich-Schiller-University Jena, Humboldtstrasse 10, D-07743 Jena (Germany); Lupo, Donald [Department of Electronics and Communications Engineering, Tampere University of Technology, Korkeakoulunkatu 3, 33720 Tampere (Finland)

    2014-04-01

    In this contribution several alternative sintering methods are compared to traditional thermal sintering, since the high temperature and long process time of thermal sintering increase the cost of inkjet printing and prevent the use of this technology in large-scale manufacturing. The alternative sintering techniques are evaluated based on the electrical and mechanical performance they enable on inkjet-printed structures, as well as their potential feasibility for large-scale manufacturing. Photonic sintering was identified as the most promising alternative to thermal sintering. Highlights: comparison of alternative sintering techniques for large-scale electronics manufacturing; laser, plasma, and photonic sintering of a nanoparticle silver ink tested; electrical and mechanical properties of the sintered inks tested; microstructure analysis used to explain the differences in electrical and mechanical properties; photonic sintering identified as the most promising alternative technique.

  15. Trends in study design and the statistical methods employed in a leading general medicine journal.

    Science.gov (United States)

    Gosho, M; Sato, Y; Nagashima, K; Takahashi, S

    2017-07-27

    Study design and statistical methods have become core components of medical research, and the methodology has become more multifaceted and complicated over time. A comprehensive study of the details and current trends in study design and statistical methods is required to support the future implementation of well-planned clinical studies providing information for evidence-based medicine. Our purpose was to illustrate the study designs and statistical methods employed in the recent medical literature. This was an extension of the study of Sato et al. (N Engl J Med 2017; 376: 1086-1087), which reviewed 238 articles published in 2015 in the New England Journal of Medicine (NEJM) and briefly summarized the statistical methods employed in NEJM. Using the same database, we performed a new investigation of the detailed trends in study design and individual statistical methods that were not reported in the Sato study. Owing to the CONSORT statement, prespecification and justification of sample size are obligatory in planning intervention studies. Although standard survival methods (e.g., the Kaplan-Meier estimator and the Cox regression model) were most frequently applied, the Gray test and the Fine-Gray proportional hazards model, which account for competing risks, were sometimes used for more valid statistical inference. With respect to handling missing data, model-based methods, which are valid for missing-at-random data, were used more frequently than single imputation methods. Single imputation methods are not recommended as a primary analysis, but they have been applied in many clinical trials. Group sequential design with interim analyses was one of the standard designs, and novel designs, such as adaptive dose selection and sample size re-estimation, were sometimes employed in NEJM. Model-based approaches for handling missing data should replace single imputation methods for primary analyses in light of the information found in some publications. Use of adaptive designs with interim analyses is increasing.

  16. Reconciling alternate methods for the determination of charge distributions: A probabilistic approach to high-dimensional least-squares approximations

    CERN Document Server

    Champagnat, Nicolas; Faou, Erwan

    2010-01-01

    We propose extensions and improvements of the statistical analysis of distributed multipoles (SADM) algorithm put forth by Chipot et al. in [6] for the derivation of distributed atomic multipoles from the quantum-mechanical electrostatic potential. The method is mathematically extended to general least-squares problems and provides an alternative approximation method in cases where the original least-squares problem is computationally not tractable, either because of its ill-posedness or its high-dimensionality. The solution is approximated employing a Monte Carlo method that takes the average of a random variable defined as the solutions of random small least-squares problems drawn as subsystems of the original problem. The conditions that ensure convergence and consistency of the method are discussed, along with an analysis of the computational cost in specific instances.
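
    The approximation strategy described, averaging the solutions of many random small least-squares subproblems drawn from the full system, is easy to sketch; the subsystem size and count below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(8)
m, p = 20000, 10                       # a tall least-squares problem A x ~= b
A = rng.standard_normal((m, p))
x_true = rng.standard_normal(p)
b = A @ x_true + 0.1 * rng.standard_normal(m)

def subsampled_lstsq(A, b, n_sub=200, size=50):
    """Average the solutions of random small least-squares subsystems."""
    m, p = A.shape
    solutions = np.empty((n_sub, p))
    for i in range(n_sub):
        rows = rng.choice(m, size=size, replace=False)  # random subsystem
        solutions[i], *_ = np.linalg.lstsq(A[rows], b[rows], rcond=None)
    return solutions.mean(axis=0)

x_mc = subsampled_lstsq(A, b)
x_full, *_ = np.linalg.lstsq(A, b, rcond=None)
print("MC vs full solution max diff:", np.abs(x_mc - x_full).max())
```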

  17. Review of statistical methods used in enhanced-oil-recovery research and performance prediction. [131 references]

    Energy Technology Data Exchange (ETDEWEB)

    Selvidge, J.E.

    1982-06-01

    Recent literature in the field of enhanced oil recovery (EOR) was surveyed to determine the extent to which researchers in EOR take advantage of statistical techniques in analyzing their data. In addition to determining the current level of reliance on statistical tools, another objective of this study is to promote by example the greater use of these tools. To serve this objective, the discussion of the techniques highlights the observed trend toward the use of increasingly more sophisticated methods and points out the strengths and pitfalls of different approaches. Several examples are also given of opportunities for extending EOR research findings by additional statistical manipulation. The search of the EOR literature, conducted mainly through computerized data bases, yielded nearly 200 articles containing mathematical analysis of the research. Of these, 21 were found to include examples of statistical approaches to data analysis and are discussed in detail in this review. The use of statistical techniques, as might be expected from their general purpose nature, extends across nearly all types of EOR research covering thermal methods of recovery, miscible processes, and micellar polymer floods. Data come from field tests, the laboratory, and computer simulation. The statistical methods range from simple comparisons of mean values to multiple non-linear regression equations and to probabilistic decision functions. The methods are applied to both engineering and economic data. The results of the survey are grouped by statistical technique and include brief descriptions of each of the 21 relevant papers. Complete abstracts of the papers are included in the bibliography. Brief bibliographic information (without abstracts) is also given for the articles identified in the initial search as containing mathematical analyses using other than statistical methods.

  18. PROBLEMS CONCERNING ALTERNATIVE EVALUATION METHODS: THE CASE OF SCIENCE AND TECHNOLOGY TEACHERS

    Directory of Open Access Journals (Sweden)

    Yasemin DEVECİOĞLU-KAYMAKÇI

    2009-11-01

    Recent changes to the Science and Technology (ST) curriculum have required the use of alternative evaluation methods in the learning and teaching process. The aim of this study is to determine the problems ST teachers face while using alternative evaluation methods in their courses. To this end, semi-structured interviews were conducted with 10 ST teachers from different parts of Trabzon during the 2008-2009 academic year. The data, analyzed qualitatively, reveal that the teachers have important difficulties in selecting, using, and evaluating these methods. The research concludes that, besides the lack of physical infrastructure, labs and libraries, and computer and other technologies in their schools, most of the teachers lack the knowledge and skills to implement these methods. The study also shows that ST teachers need an adaptation process to appropriate the aims and importance of alternative evaluation methods effectively.

  19. Classical Methods of Statistics With Applications in Fusion-Oriented Plasma Physics

    CERN Document Server

    Kardaun, Otto J W F

    2005-01-01

    Classical Methods of Statistics is a blend of theory and practical statistical methods written for graduate students and researchers interested in applications to plasma physics and its experimental aspects. It can also fruitfully be used by students majoring in probability theory and statistics. In the first part, the mathematical framework and some of the history of the subject are described. Many exercises help readers to understand the underlying concepts. In the second part, two case studies are presented exemplifying discriminant analysis and multivariate profile analysis. The introductions of these case studies outline contextual magnetic plasma fusion research. In the third part, an overview of statistical software is given and, in particular, SAS and S-PLUS are discussed. In the last chapter, several datasets with guided exercises, predominantly from the ASDEX Upgrade tokamak, are included and their physical background is concisely described. The book concludes with a list of essential keyword translations.

  20. Defining the ecological hydrology of Taiwan Rivers using multivariate statistical methods

    Science.gov (United States)

    Chang, Fi-John; Wu, Tzu-Ching; Tsai, Wen-Ping; Herricks, Edwin E.

    2009-09-01

    The identification and verification of ecohydrologic flow indicators has found new support as the importance of ecological flow regimes is recognized in modern water resources management, particularly in river restoration and reservoir management. An ecohydrologic indicator system reflecting the unique characteristics of Taiwan's water resources and hydrology has been developed: the Taiwan Ecohydrological Indicator System (TEIS). A major challenge for the water resources community is using the TEIS to provide environmental flow rules that improve existing water resources management. This paper examines data from the extensive network of flow monitoring stations in Taiwan, using TEIS statistics to define and refine environmental flow options. Multivariate statistical methods were used to examine TEIS statistics for 102 stations representing the geographic and land-use diversity of Taiwan. Pearson correlation coefficients showed high multicollinearity between the TEIS statistics. Watersheds were separated into upper- and lower-watershed locations. An analysis of variance indicated significant differences between upstream (more natural) and downstream (more developed) locations in the same basin, with hydrologic indicator redundancy in the flow change and magnitude statistics. Issues of multicollinearity were examined using a principal component analysis (PCA), with the first three components related to general flow and high/low flow statistics, frequency and time statistics, and quantity statistics. These principal components explain about 85% of the total variation. A major conclusion is that managers must be aware of differences among basins, as well as differences within basins, which will require careful selection of management procedures to achieve needed flow regimes.
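
    A minimal sketch of the dimensionality-reduction step described, PCA on a station-by-indicator matrix with components retained until about 85% of the variance is explained; the matrix here is synthetic and the indicator count is a placeholder:

```python
import numpy as np

rng = np.random.default_rng(9)
# Hypothetical matrix: 102 stations x 12 ecohydrologic indicator statistics.
X = rng.standard_normal((102, 12))
X[:, 1] = 0.9 * X[:, 0] + 0.1 * X[:, 1]      # built-in multicollinearity

# Standardize, then PCA via SVD of the centered matrix.
Z = (X - X.mean(0)) / X.std(0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
explained = S**2 / np.sum(S**2)

cum = np.cumsum(explained)
k = int(np.searchsorted(cum, 0.85)) + 1      # components explaining ~85%
print("variance explained per component:", np.round(explained, 3))
print(f"{k} components explain {cum[k-1]:.0%} of the variance")
```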