WorldWideScience

Sample records for modern statistical methods

  1. Fundamentals of modern statistical methods: substantially improving power and accuracy

    Wilcox, Rand R

    2001-01-01

    Conventional statistical methods have a very serious flaw: they routinely miss differences among groups or associations among variables that are detected by more modern techniques, even under very small departures from normality. Hundreds of journal articles have described the reasons standard techniques can be unsatisfactory, but simple, intuitive explanations are generally unavailable. Improved methods have been derived, but they are far from obvious or intuitive based on the training most researchers receive. Situations arise where even highly nonsignificant results become significant when analyzed with more modern methods. Without assuming any prior training in statistics, Part I of this book describes basic statistical principles from a point of view that makes their shortcomings intuitive and easy to understand. The emphasis is on verbal and graphical descriptions of concepts. Part II describes modern methods that address the problems covered in Part I. Using data from actual studies, many examples are include...
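
    The claim above is easy to check by simulation. Below is a minimal sketch (not code from the book) comparing the power of Student's t with Yuen's 20% trimmed-mean test on mixed-normal data, a classic "small departure from normality"; it assumes SciPy's ttest_ind, whose trim argument implements Yuen's test, and all sample sizes and effect sizes are arbitrary illustration choices.

        import numpy as np
        from scipy.stats import ttest_ind

        rng = np.random.default_rng(1)
        n, shift, reps, alpha = 40, 0.6, 2000, 0.05

        def contaminated(loc):
            # 90% N(loc, 1) and 10% N(loc, 10^2): looks nearly normal, heavy tails
            z = rng.normal(loc, 1.0, n)
            outlier = rng.random(n) < 0.10
            z[outlier] = rng.normal(loc, 10.0, outlier.sum())
            return z

        power_t = power_yuen = 0
        for _ in range(reps):
            x, y = contaminated(0.0), contaminated(shift)
            power_t += ttest_ind(x, y).pvalue < alpha
            power_yuen += ttest_ind(x, y, trim=0.2).pvalue < alpha  # Yuen's test
        print(f"Student's t power:    {power_t / reps:.2f}")
        print(f"Yuen trimmed-t power: {power_yuen / reps:.2f}")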

  2. Longitudinal data analysis: a handbook of modern statistical methods

    Fitzmaurice, Garrett; Verbeke, Geert; Molenberghs, Geert

    2008-01-01

    Although many books currently available describe statistical models and methods for analyzing longitudinal data, they do not highlight connections between various research threads in the statistical literature. Responding to this void, Longitudinal Data Analysis provides a clear, comprehensive, and unified overview of state-of-the-art theory and applications. It also focuses on the assorted challenges that arise in analyzing longitudinal data. After discussing historical aspects, leading researchers explore four broad themes: parametric modeling, nonparametric and semiparametric methods, joint

  3. An Introduction to Modern Statistical Methods in HCI

    Robertson, Judy; Kaptein, Maurits

    2016-01-01

    This chapter explains why we think statistical methodology matters so much to the HCI community and why we should attempt to improve it. It introduces some flaws in the well-accepted methodology of Null Hypothesis Significance Testing and briefly introduces some alternatives. Throughout the book we

  5. Oxygen Abundance Methods in SDSS: View from Modern Statistics ...

    Occam's razor is a principle attributed to the 14th-century English logician ... knowledge of a galaxy's metallicity in order to locate it on the appropriate branch of ... These methods try to balance the log-likelihood term (lack of fit) with a ...

  6. Basics of modern mathematical statistics

    Spokoiny, Vladimir

    2015-01-01

    This textbook provides a unified and self-contained presentation of the main approaches to and ideas of mathematical statistics. It collects the basic mathematical ideas and tools needed as a basis for more serious studies or even independent research in statistics. The majority of existing textbooks in mathematical statistics follow the classical asymptotic framework. Yet, as modern statistics has changed rapidly in recent years, new methods and approaches have appeared. The emphasis is on finite sample behavior, large parameter dimensions, and model misspecifications. The present book provides a fully self-contained introduction to the world of modern mathematical statistics, collecting the basic knowledge, concepts and findings needed for doing further research in the modern theoretical and applied statistics. This textbook is primarily intended for graduate and postdoc students and young researchers who are interested in modern statistical methods.

  7. Applications of modern statistical methods to analysis of data in physical science

    Wicker, James Eric

    Modern methods of statistical and computational analysis offer solutions to dilemmas confronting researchers in physical science. Although the ideas behind modern statistical and computational analysis methods were originally introduced in the 1970's, most scientists still rely on methods written during the early era of computing. These researchers, who analyze increasingly voluminous and multivariate data sets, need modern analysis methods to extract the best results from their studies. The first section of this work showcases applications of modern linear regression. Since the 1960's, many researchers in spectroscopy have used classical stepwise regression techniques to derive molecular constants. However, problems with thresholds of entry and exit for model variables plague this analysis method. Other criticisms of this kind of stepwise procedure include its inefficient searching method, the order in which variables enter or leave the model, and problems with overfitting data. We implement an information scoring technique that overcomes the assumptions inherent in the stepwise regression process to calculate molecular model parameters. We believe that this kind of information-based model evaluation can be applied to more general analysis situations in physical science. The second section proposes new methods of multivariate cluster analysis. The K-means algorithm and the EM algorithm, introduced in the 1960's and 1970's respectively, formed the basis of multivariate cluster analysis methodology for many years. However, several shortcomings of these methods include strong dependence on initial seed values and inaccurate results when the data seriously depart from hypersphericity. We propose new cluster analysis methods based on genetic algorithms that overcome the strong dependence on initial seed values. In addition, we propose a generalization of the Genetic K-means algorithm which can accurately identify clusters with complex hyperellipsoidal covariance
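
    The stepwise-versus-information-scoring contrast in this abstract can be illustrated with a small sketch (hypothetical data and variable names, not Wicker's actual procedure): instead of entry/exit thresholds, every candidate subset of regressors is scored by AIC and the best-scoring model is kept.

        import itertools
        import numpy as np

        def aic_ols(y, X):
            """AIC of an ordinary least-squares fit, assuming Gaussian errors."""
            n, k = X.shape
            beta, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
            rss = float(rss[0]) if rss.size else float(((y - X @ beta) ** 2).sum())
            return n * np.log(rss / n) + 2 * (k + 1)  # +1 for the error variance

        rng = np.random.default_rng(0)
        n = 200
        X = rng.normal(size=(n, 4))                # four candidate predictors
        y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + rng.normal(size=n)

        best = None
        for r in range(1, 5):
            for subset in itertools.combinations(range(4), r):
                design = np.column_stack([np.ones(n), X[:, subset]])
                score = aic_ols(y, design)
                if best is None or score < best[0]:
                    best = (score, subset)
        print(f"AIC-selected predictors: {best[1]} (AIC = {best[0]:.1f})")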

  8. Modern applied U-statistics

    Kowalski, Jeanne

    2008-01-01

    A timely and applied approach to the newly discovered methods and applications of U-statistics. Built on years of collaborative research and academic experience, Modern Applied U-Statistics successfully presents a thorough introduction to the theory of U-statistics using in-depth examples and applications that address contemporary areas of study including biomedical and psychosocial research. Utilizing a "learn by example" approach, this book provides an accessible, yet in-depth, treatment of U-statistics, as well as addresses key concepts in asymptotic theory by integrating translational and cross-disciplinary research. The authors begin with an introduction of the essential and theoretical foundations of U-statistics such as the notion of convergence in probability and distribution, basic convergence results, stochastic Os, inference theory, generalized estimating equations, as well as the definition and asymptotic properties of U-statistics. With an emphasis on nonparametric applications when and where applic...

  9. Modern Thermodynamics with Statistical Mechanics

    Helrich, Carl S

    2009-01-01

    With the aim of presenting thermodynamics in as simple and as unified a form as possible, this textbook starts with an introduction to the first and second laws and then promptly addresses the complete set of the potentials in a subsequent chapter and as a central theme throughout. Before discussing modern laboratory measurements, the book shows that the fundamental quantities sought in the laboratory are those which are required for determining the potentials. Since the subjects of thermodynamics and statistical mechanics are a seamless whole, statistical mechanics is treated as an integral part of the text. Other key topics such as irreversibility, the ideas of Ilya Prigogine, chemical reaction rates, equilibrium of heterogeneous systems, and transition-state theory serve to round out this modern treatment. An additional chapter covers quantum statistical mechanics, owing to active current research in Bose-Einstein condensation. End-of-chapter exercises, chapter summaries, and an appendix reviewing fundamental pr...

  10. Statistical methods

    Szulc, Stefan

    1965-01-01

    Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field. Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

  11. Modern applied statistics with S-plus

    Venables, W N

    1994-01-01

    S-Plus is a powerful environment for statistical and graphical analysis of data. It provides the tools to implement many statistical ideas which have been made possible by the widespread availability of workstations having good graphics and computational capabilities. This book is a guide to using S-Plus to perform statistical analyses and provides both an introduction to the use of S-Plus and a course in modern statistical methods. The aim of the book is to show how to use S-Plus as a powerful and graphical system. Readers are assumed to have a basic grounding in statistics, and so the book is intended for would-be users of S-Plus, and both students and researchers using statistics. Throughout, the emphasis is on presenting practical problems and full analyses of real data sets.

  12. Aspects of modern fracture statistics

    Tradinik, W.; Pabst, R.F.; Kromp, K.

    1981-01-01

    This contribution begins with introductory general remarks about fracture statistics. Then the fundamentals of the distribution of fracture probability are described. In the following part, the application of Weibull statistics is justified. In the fourth chapter, the microstructure of the material is considered in connection with calculations made in order to determine the fracture probability or risk of fracture.
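
    For readers unfamiliar with the Weibull statistics the abstract invokes, the standard weakest-link relation (textbook background, not a formula quoted from this report) gives the failure probability of a specimen of volume V under stress sigma as

        P_f(\sigma) = 1 - \exp\!\left[-\frac{V}{V_0}\left(\frac{\sigma}{\sigma_0}\right)^{m}\right],

    where m is the Weibull modulus governing the scatter in strength and sigma_0, V_0 are material scale constants; larger specimens fail at lower stresses because they are more likely to contain a critical flaw.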

  13. Modern applied statistics with S-PLUS

    Venables, W N

    1997-01-01

    S-PLUS is a powerful environment for the statistical and graphical analysis of data. It provides the tools to implement many statistical ideas which have been made possible by the widespread availability of workstations having good graphics and computational capabilities. This book is a guide to using S-PLUS to perform statistical analyses and provides both an introduction to the use of S-PLUS and a course in modern statistical methods. S-PLUS is available for both Windows and UNIX workstations, and both versions are covered in depth. The aim of the book is to show how to use S-PLUS as a powerful and graphical system. Readers are assumed to have a basic grounding in statistics, and so the book is intended for would-be users of S-PLUS, and both students and researchers using statistics. Throughout, the emphasis is on presenting practical problems and full analyses of real data sets. Many of the methods discussed are state-of-the-art approaches to topics such as linear and non-linear regression models, robust a...

  14. Intermediate statistics: a modern approach

    Stevens, James P

    2007-01-01

    Written for those who use statistical techniques, this text focuses on a conceptual understanding of the material. It uses definitional formulas on small data sets to provide conceptual insight into what is being measured. It emphasizes the assumptions underlying each analysis, and shows how to test the critical assumptions using SPSS or SAS.

  15. Modern Reduction Methods

    Andersson, Pher G

    2008-01-01

    With its comprehensive overview of modern reduction methods, this book features high-quality contributions allowing readers to find reliable solutions quickly and easily. The monograph treats the reduction of carbonyls, alkenes, imines and alkynes, as well as reductive aminations and cross- and Heck couplings, before finishing off with sections on kinetic resolutions and hydrogenolysis. An indispensable lab companion for every chemist.

  16. Methods in Modern Biophysics

    Nölting, Bengt

    2006-01-01

    Incorporating recent dramatic advances, this textbook presents a fresh and timely introduction to modern biophysical methods. An array of new, faster and higher-power biophysical methods now enables scientists to examine the mysteries of life at a molecular level. This innovative text surveys and explains the ten key biophysical methods, including those related to biophysical nanotechnology, scanning probe microscopy, X-ray crystallography, ion mobility spectrometry, mass spectrometry, proteomics, and protein folding and structure. Incorporating much information previously unavailable in tutorial form, Nölting employs worked examples and 267 illustrations to fully detail the techniques and their underlying mechanisms. Methods in Modern Biophysics is written for advanced undergraduate and graduate students, postdocs, researchers, lecturers and professors in biophysics, biochemistry and related fields. Special features in the 2nd edition: • Illustrates the high-resolution methods for ultrashort-living protei...

  17. Methods in Modern Biophysics

    Nölting, Bengt

    2010-01-01

    Incorporating recent dramatic advances, this textbook presents a fresh and timely introduction to modern biophysical methods. An array of new, faster and higher-power biophysical methods now enables scientists to examine the mysteries of life at a molecular level. This innovative text surveys and explains the ten key biophysical methods, including those related to biophysical nanotechnology, scanning probe microscopy, X-ray crystallography, ion mobility spectrometry, mass spectrometry, proteomics, and protein folding and structure. Incorporating much information previously unavailable in tutorial form, Nölting employs worked examples and about 270 illustrations to fully detail the techniques and their underlying mechanisms. Methods in Modern Biophysics is written for advanced undergraduate and graduate students, postdocs, researchers, lecturers, and professors in biophysics, biochemistry and related fields. Special features in the 3rd edition: Introduces rapid partial protein ladder sequencing - an important...

  18. Statistical Methods for Population Genetic Inference Based on Low-Depth Sequencing Data from Modern and Ancient DNA

    Korneliussen, Thorfinn Sand

    Due to the recent advances in DNA sequencing technology, genomic data are being generated at an unprecedented rate and we are gaining access to entire genomes at population level. The technology does, however, not give direct access to the genetic variation, and the many levels of preprocessing that are required before inferences can be made from the data introduce multiple levels of uncertainty, especially for low-depth data. Therefore, methods that take the inherent uncertainty into account are needed for making robust inferences in the downstream analysis of such data. This poses a problem for a range of key summary statistics within population genetics, where existing methods are based on the assumption that the true genotypes are known. Motivated by this I present: 1) a new method for the estimation of relatedness between pairs of individuals, 2) a new method for estimating...

  19. A modern course in statistical physics

    Reichl, Linda E

    2016-01-01

    "A Modern Course in Statistical Physics" is a textbook that illustrates the foundations of equilibrium and non-equilibrium statistical physics, and the universal nature of thermodynamic processes, from the point of view of contemporary research problems. The book treats such diverse topics as the microscopic theory of critical phenomena, superfluid dynamics, quantum conductance, light scattering, transport processes, and dissipative structures, all in the framework of the foundations of statistical physics and thermodynamics. It shows the quantum origins of problems in classical statistical physics. One focus of the book is fluctuations that occur due to the discrete nature of matter, a topic of growing importance for nanometer scale physics and biophysics. Another focus concerns classical and quantum phase transitions, in both monatomic and mixed particle systems. This fourth edition extends the range of topics considered to include, for example, entropic forces, electrochemical processes in biological syste...

  20. Nonequilibrium statistical physics: a modern perspective

    Livi, Roberto

    2017-01-01

    Statistical mechanics has been proven to be successful at describing physical systems at thermodynamic equilibrium. Since most natural phenomena occur in nonequilibrium conditions, the present challenge is to find suitable physical approaches for such conditions: this book provides a pedagogical pathway that explores various perspectives. The use of clear language, and explanatory figures and diagrams to describe models, simulations and experimental findings makes the book a valuable resource for undergraduate and graduate students, and also for lecturers organizing teaching at varying levels of experience in the field. Written in three parts, it covers basic and traditional concepts of nonequilibrium physics, modern aspects concerning nonequilibrium phase transitions, and application-orientated topics from a modern perspective. A broad range of topics is covered, including Langevin equations, Lévy processes, directed percolation, kinetic roughening and pattern formation.

  1. Statistical methods for ranking data

    Alvo, Mayer

    2014-01-01

    This book introduces advanced undergraduate and graduate students and practitioners to statistical methods for ranking data. An important branch of nonparametric statistics concerns the analysis of ranking data. Rank correlation is defined through the notion of distance functions, and the notion of compatibility is introduced to deal with incomplete data. Ranking data are also modeled using a variety of modern tools such as CART, MCMC, the EM algorithm and factor analysis. This book deals with statistical methods used for analyzing such data and provides a novel and unifying approach for hypothesis testing. The techniques described in the book are illustrated with examples, and the statistical software is provided on the authors’ website.
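
    The distance-function definition of rank correlation mentioned above can be made concrete with a short sketch (illustrative code, not from the book): for two rankings without ties, Kendall's tau is an affine function of the Kendall distance D (the number of item pairs the rankings order differently), tau = 1 - 4D / (n(n-1)).

        from itertools import combinations
        from scipy.stats import kendalltau

        def kendall_distance(r1, r2):
            """Count item pairs on which the two rankings disagree."""
            return sum(
                (r1[i] - r1[j]) * (r2[i] - r2[j]) < 0
                for i, j in combinations(range(len(r1)), 2)
            )

        r1 = [1, 2, 3, 4, 5]   # ranks from judge 1
        r2 = [2, 1, 3, 5, 4]   # ranks from judge 2
        n, d = len(r1), kendall_distance(r1, r2)
        print(1 - 4 * d / (n * (n - 1)))   # 0.6, via the distance formula
        print(kendalltau(r1, r2)[0])       # 0.6, the same value from SciPy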

  2. THE GROWTH POINTS OF STATISTICAL METHODS

    Orlov A. I.

    2014-01-01

    On the basis of a new paradigm of applied mathematical statistics, data analysis and economic-mathematical methods, we discuss five topical areas in which modern applied statistics is developing, i.e. five "growth points": nonparametric statistics, robustness, computer-statistical methods, statistics of interval data, and statistics of non-numeric data.

  3. Working females: a modern statistical approach

    Kuhlenkasper, Torben

    2010-01-01

    The thesis analyzes the changing employment and economic situation of females when they become mothers. Two major questions are addressed in the thesis: First, when do mothers return to their previous employment after bearing a child? Second, what are the individual economic consequences after having returned to the labor market? Both questions are analyzed empirically with the latest statistical methods. The first major part of the thesis, corresponding to the first question above, ...

  4. Methods of modern mathematical physics

    Reed, Michael

    1980-01-01

    This book is the first of a multivolume series devoted to an exposition of functional analysis methods in modern mathematical physics. It describes the fundamental principles of functional analysis and is essentially self-contained, although there are occasional references to later volumes. We have included a few applications when we thought that they would provide motivation for the reader. Later volumes describe various advanced topics in functional analysis and give numerous applications in classical physics, modern physics, and partial differential equations.

  5. Methods of statistical physics

    Akhiezer, Aleksandr I

    1981-01-01

    Methods of Statistical Physics is an exposition of the tools of statistical mechanics, which evaluates the kinetic equations of classical and quantized systems. The book also analyzes the equations of macroscopic physics, such as the equations of hydrodynamics for normal and superfluid liquids and macroscopic electrodynamics. The text gives particular attention to the study of quantum systems. This study begins with a discussion of problems of quantum statistics with a detailed description of the basics of quantum mechanics along with the theory of measurement. An analysis of the asymptotic be

  6. Parametric statistical inference: basic theory and modern approaches

    Zacks, Shelemyahu; Tsokos, C P

    1981-01-01

    Parametric Statistical Inference: Basic Theory and Modern Approaches presents the developments and modern trends in statistical inference to students who do not have advanced mathematical and statistical preparation. The topics discussed in the book are basic and common to many fields of statistical inference and thus serve as a jumping board for in-depth study. The book is organized into eight chapters. Chapter 1 provides an overview of how the theory of statistical inference is presented in subsequent chapters. Chapter 2 briefly discusses statistical distributions and their properties. Chapt

  7. International Conference on Modern Problems of Stochastic Analysis and Statistics

    2017-01-01

    This book brings together the latest findings in the area of stochastic analysis and statistics. The individual chapters cover a wide range of topics from limit theorems, Markov processes, nonparametric methods, actuarial science, population dynamics, and many others. The volume is dedicated to Valentin Konakov, head of the International Laboratory of Stochastic Analysis and its Applications, on the occasion of his 70th birthday. Contributions were prepared by the participants of the international conference “Modern problems of stochastic analysis and statistics”, held at the Higher School of Economics in Moscow from May 29 to June 2, 2016. It offers a valuable reference resource for researchers and graduate students interested in modern stochastics.

  8. Permutation statistical methods an integrated approach

    Berry, Kenneth J; Johnston, Janis E

    2016-01-01

    This research monograph provides a synthesis of a number of statistical tests and measures, which, at first consideration, appear disjoint and unrelated. Numerous comparisons of permutation and classical statistical methods are presented, and the two methods are compared via probability values and, where appropriate, measures of effect size. Permutation statistical methods, compared to classical statistical methods, do not rely on theoretical distributions, avoid the usual assumptions of normality and homogeneity of variance, and depend only on the data at hand. This text takes a unique approach to explaining statistics by integrating a large variety of statistical methods, and establishing the rigor of a topic that to many may seem to be a nascent field in statistics. This topic is new in that it took modern computing power to make permutation methods available to people working in the mainstream of research. This research monograph addresses a statistically-informed audience, and can also easily serve as a ...
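
    A minimal two-sample permutation test in the spirit described above, depending only on the data at hand rather than on a theoretical distribution (an illustrative sketch with invented data):

        import numpy as np

        def permutation_pvalue(x, y, reps=10_000, seed=0):
            """Two-sided p-value for a mean difference, by random relabeling."""
            rng = np.random.default_rng(seed)
            pooled = np.concatenate([x, y])
            observed = abs(x.mean() - y.mean())
            hits = 0
            for _ in range(reps):
                rng.shuffle(pooled)
                diff = abs(pooled[: len(x)].mean() - pooled[len(x):].mean())
                hits += diff >= observed
            return (hits + 1) / (reps + 1)   # add-one rule avoids p = 0

        x = np.array([12.1, 9.8, 11.4, 10.2, 13.0])
        y = np.array([9.1, 8.7, 10.0, 8.2, 9.5])
        print(permutation_pvalue(x, y))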

  9. Discussion of "Modern statistics for spatial point processes"

    Jensen, Eva Bjørn Vedel; Prokesová, Michaela; Hellmund, Gunnar

    2007-01-01

    The paper ‘Modern statistics for spatial point processes’ by Jesper Møller and Rasmus P. Waagepetersen is based on a special invited lecture given by the authors at the 21st Nordic Conference on Mathematical Statistics, held at Rebild, Denmark, in June 2006. At the conference, Antti...

  10. [Modern methods of diagnosis of dyslipidemia].

    Sukhorukov, V N; Karagodin, V P; Orekhov, A N

    2016-01-01

    Dyslipidemia is an abnormality of lipid and lipoprotein metabolism. Most dyslipidemias are hyperlipidemias, that is, an abnormally high level of lipids and/or lipoproteins in the blood. Lipid and lipoprotein abnormalities are common in the general population and are regarded as a modifiable risk factor for cardiovascular disease due to their influence on atherosclerosis. Primary dyslipidemia is usually due to genetic causes, while secondary dyslipidemia arises due to other underlying causes such as diabetes mellitus. Dyslipidemia is thus an important factor in the development of atherosclerosis and cardiovascular diseases; it is therefore important to diagnose it in time. This review focuses on the modern methods of diagnosis of dyslipidemia.

  11. Statistical methods for forecasting

    Abraham, Bovas

    2009-01-01

    The Wiley-Interscience Paperback Series consists of selected books that have been made more accessible to consumers in an effort to increase global appeal and general circulation. With these new unabridged softcover volumes, Wiley hopes to extend the lives of these works by making them available to future generations of statisticians, mathematicians, and scientists. "This book, it must be said, lives up to the words on its advertising cover: 'Bridging the gap between introductory, descriptive approaches and highly advanced theoretical treatises, it provides a practical, intermediate level discussion of a variety of forecasting tools, and explains how they relate to one another, both in theory and practice.' It does just that!" - Journal of the Royal Statistical Society. "A well-written work that deals with statistical methods and models that can be used to produce short-term forecasts, this book has wide-ranging applications. It could be used in the context of a study of regression, forecasting, and time series ...

  12. WE-A-201-02: Modern Statistical Modeling

    Niemierko, A.

    2016-06-15

    Regulatory Commission and may be remembered for his critique of the National Academy of Sciences BEIR III report (stating that their methodology “imposes a Delphic quality to the ... risk estimates”). This led to his appointment as a member of the BEIR V committee. Don presented refresher courses at the AAPM, ASTRO and RSNA meetings and was active in the AAPM as a member or chair of several committees. He was the principal author of AAPM Report 43, which is essentially a critique of established clinical studies prior to 1992. He was co-editor of the proceedings of many symposia on Time, Dose and Fractionation held in Madison, Wisconsin. He received the AAPM Lifetime Achievement Award in 2004. Don’s second wife of 46 years, Ann, predeceased him and he is survived by daughters Hillary and Emily, son John and two grandsons. Don was a true gentleman with a unique and erudite writing style illuminated by pithy quotations. If he had a fault it was, perhaps, that he did not realize how much smarter he was than the rest of us. This presentation draws heavily on a biography and video interview in the History and Heritage section of the AAPM website. The quote is his own. Andrzej Niemierko: Statistical modeling plays an essential role in modern medicine for quantitative evaluation of the effect of treatment. This session will feature an overview of statistical modeling techniques used for analyzing the many types of research data and an exploration of recent advances in new statistical modeling methodologies. Learning objectives: to learn the basics of statistical modeling methodology; to discuss statistical models that are frequently used in radiation oncology; to discuss advanced modern statistical modeling methods and applications.

  14. Understanding advanced statistical methods

    Westfall, Peter

    2013-01-01

    Contents: Introduction: Probability, Statistics, and Science; Reality, Nature, Science, and Models; Statistical Processes: Nature, Design and Measurement, and Data; Models; Deterministic Models; Variability; Parameters; Purely Probabilistic Statistical Models; Statistical Models with Both Deterministic and Probabilistic Components; Statistical Inference; Good and Bad Models; Uses of Probability Models; Random Variables and Their Probability Distributions; Introduction; Types of Random Variables: Nominal, Ordinal, and Continuous; Discrete Probability Distribution Functions; Continuous Probability Distribution Functions; Some Calculus: Derivatives and Least Squares; More Calculus: Integrals and Cumulative Distribution Functions; Probability Calculation and Simulation; Introduction; Analytic Calculations, Discrete and Continuous Cases; Simulation-Based Approximation; Generating Random Numbers; Identifying Distributions; Introduction; Identifying Distributions from Theory Alone; Using Data: Estimating Distributions via the Histogram; Quantiles: Theoretical and Data-Based Estimate...

  15. Advanced statistical methods in data science

    Chen, Jiahua; Lu, Xuewen; Yi, Grace; Yu, Hao

    2016-01-01

    This book gathers invited presentations from the 2nd Symposium of the ICSA-CANADA Chapter, held at the University of Calgary from August 4-6, 2015. The aim of this symposium was to promote advanced statistical methods in big-data sciences, to allow researchers to exchange ideas on statistics and data science, and to embrace the challenges and opportunities of statistics and data science in the modern world. It addresses diverse themes in advanced statistical analysis in big-data sciences, including methods for administrative data analysis, survival data analysis, missing data analysis, high-dimensional and genetic data analysis, longitudinal and functional data analysis, the design and analysis of studies with response-dependent and multi-phase designs, time series and robust statistics, and statistical inference based on likelihood, empirical likelihood and estimating functions. The editorial group selected 14 high-quality presentations from this successful symposium and invited the presenters to prepare a fu...

  16. Statistical methods and materials characterisation

    Wallin, K.R.W.

    2010-01-01

    Statistics is a wide mathematical area, which covers a myriad of analysis and estimation options, some of which suit special cases better than others. A comprehensive coverage of the whole area of statistics would be an enormous effort and would also be outside the capabilities of this author. This work is therefore not intended to be a textbook on statistical methods available for general data analysis and decision making. Instead, it highlights a certain special statistical case applicable to mechanical materials characterization. The methods presented here do not in any way rule out other statistical methods by which to analyze mechanical property material data.

  17. Statistical methods in quality assurance

    Eckhard, W.

    1980-01-01

    During the different phases of a production process - planning, development and design, manufacturing, assembling, etc. - most decisions rest on a basis of statistics: the collection, analysis and interpretation of data. Statistical methods can be thought of as a kit of tools to help solve problems in the quality functions of the quality loop, with the aim of producing quality products and reducing quality costs. Various statistical methods are presented, and typical examples of their practical application are demonstrated.

  18. Statistical methods for quality improvement

    Ryan, Thomas P

    2011-01-01

    ...."-TechnometricsThis new edition continues to provide the most current, proven statistical methods for quality control and quality improvementThe use of quantitative methods offers numerous benefits...

  19. Modern teaching methods in economic subjects.

    Maxa, Radek

    2014-01-01

    The main objective of this thesis is a comprehensive assessment of the practical usability and effectiveness of modern activating teaching methods in economic subjects in fulfilling the RVP Economics and Business and the RVP Business Academy, in comparison with traditional (standard) methods. To achieve this goal, a systematic clarification and evaluation of key elements of the choice of adequate teaching methods, and a presentation and comparison of traditional, modern activating and comprehensive t...

  20. Statistical Methods in Integrative Genomics

    Richardson, Sylvia; Tseng, George C.; Sun, Wei

    2016-01-01

    Statistical methods in integrative genomics aim to answer important biology questions by jointly analyzing multiple types of genomic data (vertical integration) or aggregating the same type of data across multiple studies (horizontal integration). In this article, we introduce different types of genomic data and data resources, and then review statistical methods of integrative genomics, with emphasis on the motivation and rationale of these methods. We conclude with some summary points and future research directions. PMID:27482531

  1. Statistical methods in nonlinear dynamics

    Sensitivity to initial conditions in nonlinear dynamical systems leads to exponential divergence of trajectories that are initially arbitrarily close, and hence to unpredictability. Statistical methods have been found to be helpful in extracting useful information about such systems. In this paper, we review briefly some statistical ...

  2. Statistical Methods in Psychology Journals.

    Wilkinson, Leland

    1999-01-01

    Proposes guidelines for revising the American Psychological Association (APA) publication manual or other APA materials to clarify the application of statistics in research reports. The guidelines are intended to induce authors and editors to recognize the thoughtless application of statistical methods. Contains 54 references.

  3. Modern methods of thyroid diagnosis

    Hehrmann, R

    1980-11-01

    An attempt is made to provide a systematic overview of the diagnostic methods and facilities which are nowadays applied in the case of thyroid diseases. The reasoned and accurate application of the various possible methods is of decisive importance for thyroid diagnostics. This planned application requires exact knowledge of the case history and of the findings, as well as knowledge of the preconditions, the advantages and disadvantages, and the possible errors of the methods applied. Proposals and guidelines for a planned step-by-step application of methods of diagnosing thyroid diseases were published by the Thyroid Section of the Society for Endocrinology in Wiesbaden in December 1978. These publications may be of help for the step-by-step diagnosis of thyroid diseases.

  4. Statistical methods for physical science

    Stanford, John L

    1994-01-01

    This volume of Methods of Experimental Physics provides an extensive introduction to probability and statistics in many areas of the physical sciences, with an emphasis on the emerging area of spatial statistics. The scope of topics covered is wide-ranging: the text discusses a variety of the most commonly used classical methods and addresses newer methods that are applicable or potentially important. The chapter authors motivate readers with their insightful discussions, augmenting their material with key features: examines basic probability, including coverage of standard distributions, time s...

  5. Statistical methods in nuclear theory

    Shubin, Yu.N.

    1974-01-01

    The paper outlines statistical methods which are widely used for describing properties of excited states of nuclei and nuclear reactions. It discusses the physical assumptions underlying the known distributions of spacings between levels (the Wigner and Poisson distributions) and of the widths of highly excited states (the Porter-Thomas distribution), as well as the assumptions used in the statistical theory of nuclear reactions and in fluctuation analysis. The author considers the random matrix method, which consists in replacing the matrix elements of a residual interaction by random variables with a simple statistical distribution. Experimental data are compared with the results of calculations using the statistical model. The superfluid nucleus model is considered with regard to superconducting-type pair correlations.

  6. Robust statistical methods with R

    Jureckova, Jana

    2005-01-01

    Robust statistical methods were developed to supplement the classical procedures when the data violate classical assumptions. They are ideally suited to applied research across a broad spectrum of study, yet most books on the subject are narrowly focused, overly theoretical, or simply outdated. Robust Statistical Methods with R provides a systematic treatment of robust procedures with an emphasis on practical application.The authors work from underlying mathematical tools to implementation, paying special attention to the computational aspects. They cover the whole range of robust methods, including differentiable statistical functions, distance of measures, influence functions, and asymptotic distributions, in a rigorous yet approachable manner. Highlighting hands-on problem solving, many examples and computational algorithms using the R software supplement the discussion. The book examines the characteristics of robustness, estimators of real parameter, large sample properties, and goodness-of-fit tests. It...
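
    The book's examples use R; as a language-neutral illustration of the same robustness idea (a sketch, not code from the book), the following computes a Huber M-estimate of location by iteratively reweighted averaging and compares it with the sample mean on data containing one gross outlier:

        import numpy as np

        def huber_location(x, c=1.345, tol=1e-8, max_iter=100):
            """Huber M-estimate of location via iteratively reweighted averaging."""
            mu = np.median(x)
            scale = np.median(np.abs(x - mu)) / 0.6745   # MAD-based robust scale
            for _ in range(max_iter):
                r = (x - mu) / scale
                w = np.where(np.abs(r) <= c, 1.0, c / np.abs(r))  # Huber weights
                mu_new = np.sum(w * x) / np.sum(w)
                if abs(mu_new - mu) < tol:
                    break
                mu = mu_new
            return mu

        x = np.array([4.8, 5.1, 4.9, 5.3, 5.0, 4.7, 50.0])   # one gross outlier
        print(f"mean  = {x.mean():.2f}")            # dragged toward the outlier
        print(f"huber = {huber_location(x):.2f}")   # stays near the data bulk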

  7. Statistical Process Control in a Modern Production Environment

    Windfeldt, Gitte Bjørg

    Paper 1 is aimed at practitioners, to help them test the assumption that the observations in a sample are independent and identically distributed, an assumption that is essential when using classical Shewhart charts. The test can easily be performed in the control chart setup using the samples gathered here and standard statistical software. In Paper 2 a new method for process monitoring is introduced. The method uses a statistical model of the quality characteristic and a sliding window of observations to estimate the probability that the next item will not respect the specifications. If the estimated probability exceeds a pre-determined threshold, the process will be stopped. The method is flexible, allowing a complexity in modeling that remains invisible to the end user. Furthermore, the method allows building diagnostic plots based on the parameter estimates that can provide valuable insight...
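
    A toy version of the sliding-window idea attributed to Paper 2 (a sketch under a simple normal model with invented specification limits, not the thesis' actual method): estimate the mean and standard deviation from the most recent observations and stop when the estimated probability that the next item falls outside the specification limits exceeds a threshold.

        import numpy as np
        from scipy.stats import norm

        def monitor(stream, lsl, usl, window=25, threshold=0.01):
            """Stop when the estimated out-of-spec probability gets too high."""
            buf = []
            for i, obs in enumerate(stream):
                buf = (buf + [obs])[-window:]          # sliding window
                if len(buf) < window:
                    continue
                mu, sigma = np.mean(buf), np.std(buf, ddof=1)
                p_out = norm.cdf(lsl, mu, sigma) + norm.sf(usl, mu, sigma)
                if p_out > threshold:
                    print(f"stop at item {i}: P(out of spec) ~ {p_out:.3f}")
                    return
            print("process ran to completion")

        rng = np.random.default_rng(0)
        stream = np.concatenate([rng.normal(10.0, 0.10, 200),
                                 rng.normal(10.0, 0.25, 100)])  # dispersion shift
        monitor(stream, lsl=9.7, usl=10.3)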

  8. Statistical Methods for Fuzzy Data

    Viertl, Reinhard

    2011-01-01

    Statistical data are not always precise numbers, or vectors, or categories. Real data are frequently what is called fuzzy. Examples where this fuzziness is obvious are quality of life data, environmental, biological, medical, sociological and economics data. Also the results of measurements can be best described by using fuzzy numbers and fuzzy vectors respectively. Statistical analysis methods have to be adapted for the analysis of fuzzy data. In this book, the foundations of the description of fuzzy data are explained, including methods on how to obtain the characterizing function of fuzzy m

  9. Statistical methods in spatial genetics

    Guillot, Gilles; Leblois, Raphael; Coulon, Aurelie

    2009-01-01

    The joint analysis of spatial and genetic data is rapidly becoming the norm in population genetics. More and more studies explicitly describe and quantify the spatial organization of genetic variation and try to relate it to underlying ecological processes. As it has become increasingly difficult to keep abreast of the latest methodological developments, we review the statistical toolbox available to analyse population genetic data in a spatially explicit framework. We mostly focus on statistical concepts but also discuss practical aspects of the analytical methods, highlighting not only...

  10. Statistical methods in physical mapping

    Nelson, D.O.

    1995-05-01

    One of the great success stories of modern molecular genetics has been the ability of biologists to isolate and characterize the genes responsible for serious inherited diseases like fragile X syndrome, cystic fibrosis and myotonic muscular dystrophy. This dissertation concentrates on constructing high-resolution physical maps. It demonstrates how probabilistic modeling and statistical analysis can aid molecular geneticists in the tasks of planning, execution, and evaluation of physical maps of chromosomes and large chromosomal regions. The dissertation is divided into six chapters. Chapter 1 provides an introduction to the field of physical mapping, describing the role of physical mapping in gene isolation and in past efforts at mapping chromosomal regions. The next two chapters review and extend known results on predicting progress in large mapping projects. Such predictions help project planners decide between various approaches and tactics for mapping large regions of the human genome. Chapter 2 shows how probability models have been used in the past to predict progress in mapping projects. Chapter 3 presents new results, based on stationary point process theory, for progress measures for mapping projects based on directed mapping strategies. Chapter 4 describes in detail the construction of an initial high-resolution physical map for human chromosome 19. This chapter introduces the probability and statistical models involved in map construction in the context of a large, ongoing physical mapping project. Chapter 5 concentrates on one such model, the trinomial model. This chapter contains new results on the large-sample behavior of this model, including distributional results, asymptotic moments, and detection error rates. In addition, it contains an optimality result concerning experimental procedures based on the trinomial model. The last chapter explores unsolved problems and describes future work.

  12. Statistical Approaches Accomodating Uncertainty in Modern Genomic Data

    Skotte, Line

    The first of the four papers included in this thesis describes a new method for association mapping that accommodates uncertain genotypes from low-coverage re-sequencing data. The method allows uncertain genotypes using a score statistic based on the joint likelihood of the observed phenotypes and the observed sequencing data. This joint likelihood accounts for the genotype uncertainties via the posterior probabilities of each genotype given the observed sequencing data, and the phenotype distributions are modelled using a generalised linear model framework, which makes the contributed method applicable to case-control studies as well as to mapping of quantitative traits. The contributed method provides a needed association test for quantitative traits in the presence of uncertain genotypes, and it further allows correction for population structure in association tests for disease...
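
    A heavily simplified sketch of the idea (not the thesis' actual score statistic): summarize each individual's genotype uncertainty as an expected allele dosage under the posterior genotype probabilities, then test association by regressing the phenotype on that dosage. All data and probabilities below are simulated placeholders.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        n = 500
        true_geno = rng.binomial(2, 0.3, n)              # 0/1/2 copies of the allele
        pheno = 0.3 * true_geno + rng.normal(size=n)     # modest additive effect

        # Posterior genotype probabilities, flattened toward uniform as a stand-in
        # for the uncertainty left by low-depth sequencing (rows sum to 1).
        post = np.full((n, 3), 0.15)
        post[np.arange(n), true_geno] = 0.70

        dosage = post @ np.array([0.0, 1.0, 2.0])        # E[genotype | data]
        result = stats.linregress(dosage, pheno)
        print(f"effect ~ {result.slope:.2f}, p = {result.pvalue:.2g}")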

  13. Nonequilibrium statistical mechanics ensemble method

    Eu, Byung Chan

    1998-01-01

    In this monograph, nonequilibrium statistical mechanics is developed by means of ensemble methods on the basis of the Boltzmann equation, the generic Boltzmann equations for classical and quantum dilute gases, and a generalised Boltzmann equation for dense simple fluids. The theories are developed in forms parallel with the equilibrium Gibbs ensemble theory, in a way fully consistent with the laws of thermodynamics. The generalised hydrodynamics equations are an integral part of the theory and describe the evolution of macroscopic processes in accordance with the laws of thermodynamics of systems far removed from equilibrium. Audience: this book will be of interest to researchers in the fields of statistical mechanics, condensed matter physics, gas dynamics, fluid dynamics, rheology, irreversible thermodynamics and nonequilibrium phenomena.

  14. MODERN METHODS OF FOOD INTOLERANCE TESTING

    M. Yu. Rosensteyn

    2016-01-01

    An analytical review of modern methods of food intolerance diagnostics, based on interpretation of the markers used in the various tests, is presented. It is shown that tests based on observing the reaction of specific immune-system antibodies to the food antigens tested are the most accurate, reliable and representative for the diagnosis of food intolerance.

  15. Diabetic osteoarthropathy: modern methods of therapy

    Irina Nikolaevna Ul'yanova

    2010-12-01

    This paper is focused on the main aspects of the pathogenesis of diabetic osteoarthropathy (DOAP), underlain by motor and sensory neuropathies, injuries (including microfractures and joint disintegration), and inflammation accompanied by enhanced cytokine expression. The role of osteopenia, and of the decoupling of bone resorption and formation, at different stages of DOAP is discussed. Modern methods of DOAP treatment are considered, with special reference to immobilization, drug therapy, and surgical and orthopedic care.

  16. Statistical methods for quality assurance

    Rinne, H.; Mittag, H.J.

    1989-01-01

    This is the first German-language textbook on quality assurance and the fundamental statistical methods that is suitable for private study. The material for this book has been developed from a course of the Hagen Open University and is characterized by a particularly careful didactical design, which is achieved and supported by numerous illustrations and photographs, more than 100 exercises with complete problem solutions, many fully displayed calculation examples, surveys fostering a comprehensive approach, and a bibliography with comments. The textbook has an eye to practice and applications, and great care has been taken by the authors to avoid abstraction wherever appropriate, to explain the proper conditions of application of the testing methods described, and to give guidance for suitable interpretation of results. The testing methods explained also include the latest developments and research results, in order to foster their adoption in practice.

  17. Order statistics & inference: estimation methods

    Balakrishnan, N

    1991-01-01

    The literature on order statistics and inference is quite extensive and covers a large number of fields, but most of it is dispersed throughout numerous publications. This volume is the consolidation of the most important results and places an emphasis on estimation. Both theoretical and computational procedures are presented to meet the needs of researchers, professionals, and students. The methods of estimation discussed are well illustrated with numerous practical examples from both the physical and life sciences, including sociology, psychology, and electrical and chemical engineering. A co...

  18. Vortex methods and vortex statistics

    Chorin, A.J.

    1993-05-01

    Vortex methods originated from the observation that in incompressible, inviscid, isentropic flow vorticity (or, more accurately, circulation) is a conserved quantity, as can be readily deduced from the absence of tangential stresses. Thus if the vorticity is known at time t = 0, one can deduce the flow at a later time by simply following it around. In this narrow context, a vortex method is a numerical method that makes use of this observation. Even more generally, the analysis of vortex methods leads to problems that are closely related to problems in quantum physics and field theory, as well as in harmonic analysis. A broad enough definition of vortex methods ends up encompassing much of science. Even the purely computational aspects of vortex methods encompass a range of ideas for which vorticity may not be the best unifying theme. The author restricts himself in these lectures to a special class of numerical vortex methods: those that are based on a Lagrangian transport of vorticity in hydrodynamics by smoothed particles ("blobs") and those whose understanding contributes to the understanding of blob methods. Vortex methods for inviscid flow lead to systems of ordinary differential equations that can be readily clothed in Hamiltonian form, both in three and two space dimensions, and they can preserve exactly a number of invariants of the Euler equations, including topological invariants. Their viscous versions resemble Langevin equations. As a result, they provide a very useful cartoon of statistical hydrodynamics, i.e., of turbulence, one that can to some extent be analyzed analytically and, more importantly, explored numerically, with important implications also for superfluids, superconductors, and even polymers. In the author's view, vortex "blob" methods provide the most promising path to the understanding of these phenomena.
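
    To make the "blob" idea concrete, here is a minimal 2D vortex-blob sketch (a standard textbook regularization, not code from these lectures): each particle carries a fixed circulation, induced velocities come from a smoothed Biot-Savart kernel, and particle positions are advanced in time, transporting the vorticity in Lagrangian fashion.

        import numpy as np

        def blob_velocities(pos, gamma, delta=0.1):
            """Velocities at the blobs from the delta-smoothed 2D Biot-Savart law."""
            dx = pos[:, 0][:, None] - pos[:, 0][None, :]
            dy = pos[:, 1][:, None] - pos[:, 1][None, :]
            r2 = dx**2 + dy**2 + delta**2     # smoothing removes the singularity
            u = -(gamma[None, :] * dy / r2).sum(axis=1) / (2 * np.pi)
            v = (gamma[None, :] * dx / r2).sum(axis=1) / (2 * np.pi)
            return np.column_stack([u, v])

        # Two co-rotating blobs of equal circulation orbit their common centroid.
        pos = np.array([[-0.5, 0.0], [0.5, 0.0]])
        gamma = np.array([1.0, 1.0])
        dt = 0.01
        for _ in range(1000):                 # forward-Euler time stepping
            pos = pos + dt * blob_velocities(pos, gamma)
        print(pos)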

  19. Bayes linear statistics, theory & methods

    Goldstein, Michael

    2007-01-01

    Bayesian methods combine information available from data with any prior information available from expert knowledge. The Bayes linear approach follows this path, offering a quantitative structure for expressing beliefs and systematic methods for adjusting these beliefs given observational data. The methodology differs from the full Bayesian methodology in that it establishes simpler approaches to belief specification and analysis, based around expectation judgements. Bayes Linear Statistics presents an authoritative account of this approach, explaining the foundations, theory, methodology, and practicalities of this important field. The text provides a thorough coverage of Bayes linear analysis, from the development of the basic language to the collection of algebraic results needed for efficient implementation, with detailed practical examples. The book covers: the importance of partial prior specifications for complex problems where it is difficult to supply a meaningful full prior probability specification...

  20. The statistical mind in modern society. The Netherlands 1850-1940. Volume I: Official statistics, social progress and modern enterprise

    Maarseveen, J.G.S.J. van; Klep, P.M.M.; Stamhuis, I.H.

    2008-01-01

    In the period 1850-1940 statistics developed as a new combination of theory and practice. A wide range of phenomena were looked at in a novel way and this statistical mindset had a pervasive influence in contemporary society. This development of statistics is closely interlinked with the process of

  1. Computed tomography: from photon statistics to modern cone-beam CT

    Buzug, T M

    2008-01-01

    This book provides an overview of X-ray technology and the historic developmental milestones of modern CT systems, and gives a comprehensive insight into the main reconstruction methods used in computed tomography. The basis of reconstruction is, undoubtedly, mathematics. However, the beauty of computed tomography cannot be understood without a detailed knowledge of X-ray generation, photon-matter interaction, X-ray detection, photon statistics, as well as fundamental signal processing concepts and dedicated measurement systems. Therefore, the reader will find a number of references to these basic d...

  2. MODERN METHODS OF REASONABLE PRODUCT SUPPLY

    Anna Kulik

    2016-11-01

    The objective of this thesis is to study modern methods of product supply, with the purpose of determining optimal ways to rationalize them, since the use of reasonable practices, taking into account external and internal factors under the specific conditions of moving products from the supplier to the buyer, makes the process of product supply economically viable: low costs for product transportation, fast movement of products, their safety and, ultimately, a reduction of distribution costs. Methodology. The study is based on theoretical methods for examining this problem; the system analysis method and simulation of ways to improve were also used. Results. Addressing these issues, the concept, forms and stages of organizing the product supply process, depending on the type of product, have been studied, as well as product supply management methods based on the logistics concept of "demand response". Practical significance. Optimization of the principles and methods of product supply, and of the factors affecting its organization, will in practice contribute to the development of reasonable product delivery systems featuring the economic efficiency of advanced product supply technologies. Value/originality. The analyzed methods of product supply management based on the logistics concept of "demand response" can ensure maximum reduction of response time to changes in demand by rapid stocktaking at those points of the market where demand is expected to increase, which will reduce the costs of bringing the product to the consumer.

  3. Statistical methods in radiation physics

    Turner, James E; Bogard, James S

    2012-01-01

    This statistics textbook, with particular emphasis on radiation protection and dosimetry, deals with statistical solutions to problems inherent in health physics measurements and decision making. The authors begin with a description of our current understanding of the statistical nature of physical processes at the atomic level, including radioactive decay and interactions of radiation with matter. Examples are taken from problems encountered in health physics, and the material is presented such that health physicists and most other nuclear professionals will more readily understand the application of statistical principles in the familiar context of the examples. Problems are presented at the end of each chapter, with solutions to selected problems provided online. In addition, numerous worked examples are included throughout the text.

  4. Statistical inference via fiducial methods

    Salomé, Diemer

    1998-01-01

    In this thesis the attention is restricted to inductive reasoning using a mathematical probability model. A statistical procedure prescribes, for every theoretically possible set of data, the inference about the unknown of interest. ... See: Summary

  5. Modern methods in analytical acoustics lecture notes

    Crighton, D G; Williams, J E Ffowcs; Heckl, M; Leppington, F G

    1992-01-01

    Modern Methods in Analytical Acoustics considers topics fundamental to the understanding of noise, vibration and fluid mechanisms. The series of lectures on which this material is based began some twenty-five years ago and has been developed and expanded ever since. Acknowledged experts in the field have given this course many times in Europe and the USA. Although the scope of the course has widened considerably, the primary aim of teaching analytical techniques of acoustics alongside specific areas of wave motion and unsteady fluid mechanisms remains. The distinguished authors of this volume are drawn from Departments of Acoustics, Engineering and Applied Mathematics in Berlin, Cambridge and London. Their intention is to reach a wider audience of all those concerned with acoustic analysis than has been able to attend the course.

  6. Register-based statistics statistical methods for administrative data

    Wallgren, Anders

    2014-01-01

    This book provides a comprehensive and up-to-date treatment of theory and practical implementation in register-based statistics. It begins by defining the area, before explaining how to structure such systems, as well as detailing alternative approaches. It explains how to create statistical registers, how to implement quality assurance, and the use of IT systems for register-based statistics. Further to this, clear details are given about the practicalities of implementing such statistical methods, such as protection of privacy and the coordination and coherence of such an undertaking. Thi

  7. Hydrobalneological methods in modern medical treatment

    Włodzisław Kuliński

    2015-03-01

    Full Text Available Introduction: Therapeutic methods combining balneology and hydrotherapy have been used in treatment and prevention for a long time. Their influence on the skin, based on mechanical, thermal, and hydrostatic stimuli, results in a reaction of the internal organs as well as the whole body. The most important effects of such procedures are changes within the cardiovascular system. Aim of the research: The use of hydrobalneological methods in modern medical treatment. Material and methods: The analysis focused on the influence of water jets at alternating temperatures in the treatment of functional cardiovascular disturbances with the use of non-invasive methods of autonomic nervous system function work-up based on the analysis of heart rate variability. The effect of the jets on heart rate and blood pressure was observed in 50 patients with first-degree hypertension, which was accompanied by radioelectrocardiographic (RECG) assessment of the influence of underwater massage and carbonic acid baths on the cardiovascular system in patients undergoing these procedures due to Da Costa’s syndrome. Results: Water jets at alternating temperatures successfully modulate the tension within the autonomic nervous system and stimulate its parasympathetic part. Underwater massage is a gentle procedure and does not cause significant changes in heart rate and RECG tracing. Carbonic acid baths decrease autonomic nervous system excitability. Conclusions: The study results show a possibility of regulating autonomic nervous system function with the use of selected balneological and hydrotherapeutic methods, and thus influencing the functional level of the human body which is most appropriate for the requirements created by the internal and external environment of the body.

  8. Complex Data Modeling and Computationally Intensive Statistical Methods

    Mantovan, Pietro

    2010-01-01

    Recent years have seen the advent and development of many devices able to record and store an ever-increasing amount of complex and high-dimensional data: 3D images generated by medical scanners or satellite remote sensing, DNA microarrays, real-time financial data, and system control datasets. The analysis of these data poses new challenging problems and requires the development of novel statistical models and computational methods, fueling many fascinating and fast-growing research areas of modern statistics. The book offers a wide variety of statistical methods and is addressed to statistici

  9. Statistical indicators and trends in juvenile delinquency in modern Russia

    Yuzikhanova E.G.

    2014-12-01

    Full Text Available Statistics on juvenile delinquency in Russia over a ten-year period are presented, allowing its current trends to be determined. It is noted that earlier the proportion of juveniles among all offenders was about 11-12%. During the period from 2003 to 2013 the proportion of juveniles in the total number of identified offenders decreased to 6%. Despite the reduction in the number of crimes committed by this category of persons, for several years the greatest criminal activity has been maintained in the age group of 16-17 years (70%). The age group of 14-15 years makes up a smaller proportion, and the number of crimes it commits has fallen from 49,300 in 2000 to 19,700 in 2013. Over the same period, the number of reported crimes committed by minors or with their complicity decreased almost three times. With all the ambiguity of attitudes to the problem considered, the author defines the role of the state’s criminal law policy in responding to trends in juvenile crime, taking into account its specificity, caused by the complex of interrelated factors related to the age, social, and psychological characteristics of juveniles as a special social group and the distinctiveness of their social status. A legislative novelty is considered: punishment in the form of arrest may not be imposed on persons under the age of eighteen at the time of the court verdict. It is concluded that the problems of juvenile delinquency are only partly solved by the humanization of the state’s criminal law policy aimed at restoring social justice, correcting convicts, and preventing the commission of new crimes.

  10. Statistical learning methods: Basics, control and performance

    Zimmermann, J. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)]. E-mail: zimmerm@mppmu.mpg.de

    2006-04-01

    The basics of statistical learning are reviewed with a special emphasis on general principles and problems for all different types of learning methods. Different aspects of controlling these methods in a physically adequate way will be discussed. All principles and guidelines will be exercised on examples for statistical learning methods in high energy and astrophysics. These examples prove in addition that statistical learning methods very often lead to a remarkable performance gain compared to the competing classical algorithms.
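
    The abstract's claim of a performance gain over classical algorithms can be made concrete with a toy comparison. The sketch below (entirely illustrative, with simulated "signal" and "background" data rather than anything from the paper) contrasts a classical rectangular-cut selection with a boosted-tree classifier from scikit-learn.

```python
# Toy comparison of a classical rectangular cut vs. a statistical learning
# method on simulated two-variable signal/background data (illustrative only).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 5000
signal = rng.normal(loc=[1.0, 1.0], scale=0.8, size=(n, 2))
background = rng.normal(loc=[0.0, 0.0], scale=1.2, size=(n, 2))
X = np.vstack([signal, background])
y = np.r_[np.ones(n), np.zeros(n)]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Classical approach: accept events with both variables above a fixed cut.
cut_pred = (X_te[:, 0] > 0.5) & (X_te[:, 1] > 0.5)
print("rectangular cut:", accuracy_score(y_te, cut_pred))

# Statistical learning approach: boosted decision trees.
clf = GradientBoostingClassifier().fit(X_tr, y_tr)
print("boosted trees  :", accuracy_score(y_te, clf.predict(X_te)))
```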

  12. Multivariate statistical methods a primer

    Manly, Bryan FJ

    2004-01-01

    Contents: THE MATERIAL OF MULTIVARIATE ANALYSIS: Examples of Multivariate Data; Preview of Multivariate Methods; The Multivariate Normal Distribution; Computer Programs; Graphical Methods; Chapter Summary; References. MATRIX ALGEBRA: The Need for Matrix Algebra; Matrices and Vectors; Operations on Matrices; Matrix Inversion; Quadratic Forms; Eigenvalues and Eigenvectors; Vectors of Means and Covariance Matrices; Further Reading; Chapter Summary; References. DISPLAYING MULTIVARIATE DATA: The Problem of Displaying Many Variables in Two Dimensions; Plotting Index Variables; The Draftsman's Plot; The Representation of Individual Data Points; Profiles o

  13. Statistical data analysis using SAS intermediate statistical methods

    Marasinghe, Mervyn G

    2018-01-01

    The aim of this textbook (previously titled SAS for Data Analytics) is to teach the use of SAS for statistical analysis of data for advanced undergraduate and graduate students in statistics, data science, and disciplines involving analyzing data. The book begins with an introduction beyond the basics of SAS, illustrated with non-trivial, real-world, worked examples. It proceeds to SAS programming and applications, SAS graphics, statistical analysis of regression models, analysis of variance models, analysis of variance with random and mixed effects models, and then takes the discussion beyond regression and analysis of variance to conclude. Pedagogically, the authors introduce theory and methodological basis topic by topic, present a problem as an application, followed by a SAS analysis of the data provided and a discussion of results. The text focuses on applied statistical problems and methods. Key features include: end of chapter exercises, downloadable SAS code and data sets, and advanced material suitab...

  14. Radar Scan Methods in Modern Multifunctional Radars

    V. N. Skosyrev

    2014-01-01

    Full Text Available The paper considers the urgent task of organizing the survey of space in modern multifunctional radar systems (MfRLS), which must cover elevation angles from minus 5 to 60-80 degrees and the full 360 degrees in azimuth. An MfRLS of this type should provide a survey of the zone within a limited time (2-3 sec) while detecting a wide range of subtle high- and low-flying targets. The latter circumstance requires selecting targets against the background of reflections from the underlying surface and local objects (MP). In providing the survey of space, the need to increase not only noise immunity but also survivability is taken into account. Two variants of the survey of space in the elevation plane in a solid-state AESA radar are considered. In the first case, the space is surveyed by a single narrow beam. In the second, a transmit pattern is formed that covers the whole sector of responsibility in elevation, and the receive beams are formed in a special-purpose computer (CB) by processing the digitized signals from the emitters of the antenna web. Estimates are given of the parameters specific to multifunction radars for air and missile defence. It is shown that in a number of practically important cases, preference should clearly be given to one of the described methods of surveying space. Functional schemes of the AESA radar are given for both variants of the survey, and their differences are analyzed. The problem of the increasing cost of an MfRLS with digital beamforming as the bandwidth of the processed probing signal grows is also considered. Drawbacks of MfRLS with digital beamforming are noted, including reduced accuracy of coordinate measurement at low elevation angles and the complexity of maintaining the thermal regime of the solid-state components when a quasi-continuous signal with a low duty cycle is used; these drawbacks are shown to be fundamentally unavoidable in steppe and desert areas with uneven terrain (Kazakhstan, China, the Middle East). It is shown that for an MfRLS working in strong clutter, more preferably

  15. Statistical methods for nuclear material management

    Bowen W.M.; Bennett, C.A. (eds.)

    1988-12-01

    This book is intended as a reference manual of statistical methodology for nuclear material management practitioners. It describes statistical methods currently or potentially important in nuclear material management, explains the choice of methods for specific applications, and provides examples of practical applications to nuclear material management problems. Together with the accompanying training manual, which contains fully worked out problems keyed to each chapter, this book can also be used as a textbook for courses in statistical methods for nuclear material management. It should provide increased understanding and guidance to help improve the application of statistical methods to nuclear material management problems.

  17. Introduction to the basic concepts of modern physics special relativity, quantum and statistical physics

    Becchi, Carlo Maria

    2007-01-01

    These notes are designed as a textbook for a course on Modern Physics Theory for undergraduate students. The purpose is to provide a rigorous and self-contained presentation of the simplest theoretical framework using elementary mathematical tools. A number of examples of relevant applications and an appropriate list of exercises and answered questions are also given. The first part is devoted to Special Relativity, concerning in particular space-time relativity and relativistic kinematics. The second part deals with Schroedinger's formulation of quantum mechanics. The presentation concerns mainly one-dimensional problems, in particular the tunnel effect, discrete energy levels and band spectra. The third part concerns the application of Gibbs statistical methods to quantum systems and in particular to Bose and Fermi gases.

  18. Modern

    A.V. Bagrov

    2014-06-01

    Full Text Available The article gives an overview of the most important problems of modern meteoric astronomy and briefly describes ways and methods of solving them. Particular attention is paid to the construction and arrangement of meteor video cameras intended for recording meteoric phenomena, as the main method of obtaining the reliable and objective observational data on whose basis the described tasks can be solved.

  19. WE-A-201-00: Anne and Donald Herbert Distinguished Lectureship On Modern Statistical Modeling

    NONE

    2016-06-15

    Regulatory Commission and may be remembered for his critique of the National Academy of Sciences BEIR III report (stating that their methodology “imposes a Delphic quality to the .. risk estimates”.) This led to his appointment as a member of the BEIR V committee. Don presented refresher courses at the AAPM, ASTRO and RSNA meetings and was active in the AAPM as a member or chair of several committees. He was the principal author of AAPM Report 43, which is essentially a critique of established clinical studies prior to 1992. He was co-editor of the Proceedings of many symposia on Time, Dose and Fractionation held in Madison, Wisconsin. He received the AAPM lifetime Achievement award in 2004. Don’s second wife of 46 years, Ann, predeceased him and he is survived by daughters Hillary and Emily, son John and two grandsons. Don was a true gentleman with a unique and erudite writing style illuminated by pithy quotations. If he had a fault it was, perhaps, that he did not realize how much smarter he was than the rest of us. This presentation draws heavily on a biography and video interview in the History and Heritage section of the AAPM website. The quote is his own. Andrzej Niemierko: Statistical modeling plays an essential role in modern medicine for quantitative evaluation of the effect of treatment. This session will feature an overview of statistical modeling techniques used for analyzing the many types of research data and an exploration of recent advances in new statistical modeling methodologies. Learning Objectives: To learn basics of statistical modeling methodology. To discuss statistical models that are frequently used in radiation oncology To discuss advanced modern statistical modeling methods and applications.

  1. Modern Instrumental Methods in Forensic Toxicology*

    Smith, Michael L.; Vorce, Shawn P.; Holler, Justin M.; Shimomura, Eric; Magluilo, Joe; Jacobs, Aaron J.; Huestis, Marilyn A.

    2009-01-01

    This article reviews modern analytical instrumentation in forensic toxicology for identification and quantification of drugs and toxins in biological fluids and tissues. A brief description of the theory and inherent strengths and limitations of each methodology is included. The focus is on new technologies that address current analytical limitations. A goal of this review is to encourage innovations to improve our technological capabilities and to encourage use of these analytical techniques in forensic toxicology practice. PMID:17579968

  2. Statistical Models and Methods for Lifetime Data

    Lawless, Jerald F

    2011-01-01

    Praise for the First Edition: "An indispensable addition to any serious collection on lifetime data analysis and . . . a valuable contribution to the statistical literature. Highly recommended . . ." (Choice); "This is an important book, which will appeal to statisticians working on survival analysis problems." (Biometrics); "A thorough, unified treatment of statistical models and methods used in the analysis of lifetime data . . . this is a highly competent and agreeable statistical textbook." (Statistics in Medicine). The statistical analysis of lifetime or response time data is a key tool in engineering,

  3. Multivariate statistical methods a first course

    Marcoulides, George A

    2014-01-01

    Multivariate statistics refer to an assortment of statistical methods that have been developed to handle situations in which multiple variables or measures are involved. Any analysis of more than two variables or measures can loosely be considered a multivariate statistical analysis. An introductory text for students learning multivariate statistical methods for the first time, this book keeps mathematical details to a minimum while conveying the basic principles. One of the principal strategies used throughout the book--in addition to the presentation of actual data analyses--is poin

  4. Use of Modern Birth Control Methods Among Rural Communities in ...

    elearning

    ABSTRACT. This paper studied the extent of utilization of Modern Birth Control Methods (MBCM) among rural dwellers in ... respondents used MBCM while 57% of them used traditional birth control methods. ... School of Public Health.

  5. Statistical Methods for Environmental Pollution Monitoring

    Gilbert, Richard O. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    1987-01-01

    The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedures, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.
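
    One of the problems the abstract singles out, a confidence interval for the mean of lognormal pollution data, can be illustrated generically. The book's Section 13.2 presents dedicated lognormal methods; the sketch below uses a plain bootstrap interval as a simple stand-in, on made-up concentration data.

```python
# Hedged sketch: bootstrap CI for the mean of lognormal-looking pollution
# data (a generic stand-in for the book's dedicated lognormal methods).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
concentrations = rng.lognormal(mean=1.0, sigma=0.8, size=60)  # fake data

res = stats.bootstrap((concentrations,), np.mean,
                      confidence_level=0.95, random_state=rng)
print("sample mean:", concentrations.mean())
print("95% CI:", res.confidence_interval)
```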

  6. Statistical methods and challenges in connectome genetics

    Pluta, Dustin; Yu, Zhaoxia; Shen, Tong; Chen, Chuansheng; Xue, Gui; Ombao, Hernando

    2018-01-01

    The study of genetic influences on brain connectivity, known as connectome genetics, is an exciting new direction of research in imaging genetics. We here review recent results and current statistical methods in this area, and discuss some of the persistent challenges and possible directions for future work.

  7. Statistical methods in personality assessment research.

    Schinka, J A; LaLone, L; Broeckel, J A

    1997-06-01

    Emerging models of personality structure and advances in the measurement of personality and psychopathology suggest that research in personality and personality assessment has entered a stage of advanced development. In this article we examine whether researchers in these areas have taken advantage of new and evolving statistical procedures. We conducted a review of articles published in the Journal of Personality Assessment during the past 5 years. Of the 449 articles that included some form of data analysis, 12.7% used only descriptive statistics, most employed only univariate statistics, and fewer than 10% used multivariate methods of data analysis. We discuss the cost of using limited statistical methods, the possible reasons for the apparent reluctance to employ advanced statistical procedures, and potential solutions to this technical shortcoming.

  8. Barriers to utilization of modern methods of family planning amongst ...

    Barriers to utilization of modern methods of family planning amongst women in a ... is recognized by the World Health Organization (WHO) as a universal human right. ... Conclusion: The study finds numerous barriers to utilization of family ...

  9. Chinese Cyber Espionage: A Complementary Method to Aid PLA Modernization

    2015-12-01

    Master's thesis by Jamie M. Ellis, December 2015 (thesis advisor: Wade L. Huntley; second reader: Christopher R. Twomey). Abstract: In 2013, Mandiant published a report linking one People's Liberation Army (PLA) unit to the

  10. Spatial analysis statistics, visualization, and computational methods

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...

  11. Workshop on Analytical Methods in Statistics

    Jurečková, Jana; Maciak, Matúš; Pešta, Michal

    2017-01-01

    This volume collects authoritative contributions on analytical methods and mathematical statistics. The methods presented include resampling techniques; the minimization of divergence; estimation theory and regression, eventually under shape or other constraints or long memory; and iterative approximations when the optimal solution is difficult to achieve. It also investigates probability distributions with respect to their stability, heavy-tailedness, Fisher information and other aspects, both asymptotically and non-asymptotically. The book not only presents the latest mathematical and statistical methods and their extensions, but also offers solutions to real-world problems including option pricing. The selected, peer-reviewed contributions were originally presented at the workshop on Analytical Methods in Statistics, AMISTAT 2015, held in Prague, Czech Republic, November 10-13, 2015.

  12. Modern methods in cereal grain mycology

    Olsson, Johan

    2000-01-01

    A simple rapid DNA extraction method, equally suitable for spores and mycelia, is proposed. Heating samples in NaOH and SDS provides DNA of high purity, suitable for Polymerase Chain Reaction (PCR) analysis. For Penicillium roqueforti the detection limit was 6 x 10^3 conidia and 1 mg (fresh weight) mycelium in the extraction liquid. The method proved efficient with Aspergillus flavus, Fusarium graminearum, Rhizopus stolonifer, Eurotium herbariorum, and Cladosporium herbarum as well. An optimis...

  13. Modern methods of wine quality analysis

    Галина Зуфарівна Гайда

    2015-06-01

    Full Text Available In this paper, physical-chemical and enzymatic methods for the quantitative analysis of the basic wine components are reviewed. The results of the authors' own experiments on the development of enzyme- and cell-based amperometric sensors for ethanol, lactate, glucose, and arginine are presented

  14. Statistical Methods for Stochastic Differential Equations

    Kessler, Mathieu; Sorensen, Michael

    2012-01-01

    The seventh volume in the SemStat series, Statistical Methods for Stochastic Differential Equations presents current research trends and recent developments in statistical methods for stochastic differential equations. Written to be accessible to both new students and seasoned researchers, each self-contained chapter starts with introductions to the topic at hand and builds gradually towards discussing recent research. The book covers Wiener-driven equations as well as stochastic differential equations with jumps, including continuous-time ARMA processes and COGARCH processes. It presents a sp

  15. Modern recycling methods in metallurgical industry

    M. Maj

    2010-04-01

    Full Text Available The contamination of the environment caused by increased industrial activity is a main topic of discussion in Poland and in the world. The possibilities of waste recovery and recycling vary in different sectors of industry, and the specific methods, developed and improved all the time, depend on the type of the waste. In this study, attention has been focused mainly on waste from the metallurgical industry and on the available techniques for its recycling

  16. Statistical methods for spatio-temporal systems

    Finkenstadt, Barbel

    2006-01-01

    Statistical Methods for Spatio-Temporal Systems presents current statistical research issues on spatio-temporal data modeling and will promote advances in research and a greater understanding between the mechanistic and the statistical modeling communities.Contributed by leading researchers in the field, each self-contained chapter starts with an introduction of the topic and progresses to recent research results. Presenting specific examples of epidemic data of bovine tuberculosis, gastroenteric disease, and the U.K. foot-and-mouth outbreak, the first chapter uses stochastic models, such as point process models, to provide the probabilistic backbone that facilitates statistical inference from data. The next chapter discusses the critical issue of modeling random growth objects in diverse biological systems, such as bacteria colonies, tumors, and plant populations. The subsequent chapter examines data transformation tools using examples from ecology and air quality data, followed by a chapter on space-time co...

  17. Knowledge attitude to modern family planning methods in Abraka ...

    Objective: To assess the level of regard and misconceptions of modern family planning methods in Abraka communities. Methods: The interviewer-administered questionnaire method was used to gather the required information from 657 respondents randomly chosen from PO, Ajalomi, Erho, Oria, Otorho, Umeghe, ...

  18. Statistical methods and challenges in connectome genetics

    Pluta, Dustin

    2018-03-12

    The study of genetic influences on brain connectivity, known as connectome genetics, is an exciting new direction of research in imaging genetics. We here review recent results and current statistical methods in this area, and discuss some of the persistent challenges and possible directions for future work.

  19. Statistical methods for searching for inundated radioactive entities

    Dubasov, Yu.V.; Krivokhatskij, A.S.; Khramov, N.N.

    1993-01-01

    The problem of searching for a flooded radioactive object in a given area is considered. Various models of plotting the search route are discussed. It is shown that a spiral route through random points starting from the centre of the examined area is the most efficient one. The conclusion is made that, when searching for flooded radioactive objects, it is advisable to use multidimensional statistical methods of classification

  20. Application of Turchin's method of statistical regularization

    Zelenyi, Mikhail; Poliakova, Mariia; Nozik, Alexander; Khudyakov, Alexey

    2018-04-01

    During analysis of experimental data, one usually needs to restore a signal after it has been convoluted with some kind of apparatus function. According to Hadamard's definition this problem is ill-posed and requires regularization to provide sensible results. In this article we describe an implementation of Turchin's method of statistical regularization based on the Bayesian approach to the regularization strategy.
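
    To see why the regularization is needed, consider a sketch of the deconvolution problem the abstract describes. Turchin's method is fully Bayesian; the closely related Tikhonov-style estimate below (simulated data, penalty on the second difference of the signal) is a simplified illustration of the same ill-posedness, not the authors' implementation.

```python
# Illustrative sketch: regularized deconvolution of a signal blurred by a
# Gaussian apparatus function. Naive inversion amplifies noise; a smoothing
# penalty (here Tikhonov-style, a simpler cousin of Turchin's Bayesian
# prior on smoothness) stabilizes the solution.
import numpy as np

rng = np.random.default_rng(1)
n = 100
x = np.linspace(0, 1, n)
true_signal = np.exp(-((x - 0.5) / 0.1) ** 2)

# Discretized convolution with a Gaussian apparatus function, plus noise.
A = np.exp(-((x[:, None] - x[None, :]) / 0.05) ** 2)
A /= A.sum(axis=1, keepdims=True)
data = A @ true_signal + rng.normal(scale=0.01, size=n)

L = np.diff(np.eye(n), 2, axis=0)        # second-difference operator
lam = 1e-3                               # regularization strength
estimate = np.linalg.solve(A.T @ A + lam * L.T @ L, A.T @ data)
print("max reconstruction error:", np.abs(estimate - true_signal).max())
```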

  1. Modern Geometric Methods of Distance Determination

    Thévenin, Frédéric; Falanga, Maurizio; Kuo, Cheng Yu; Pietrzyński, Grzegorz; Yamaguchi, Masaki

    2017-11-01

    Building a 3D picture of the Universe at any distance is one of the major challenges in astronomy, from the nearby Solar System to distant quasars and galaxies. This goal has forced astronomers to develop techniques to estimate or to measure the distance of point sources on the sky. While most distance estimates used since the beginning of the 20th century are based on our understanding of the physics of objects of the Universe: stars, galaxies, QSOs, the direct measures of distance are based on the geometric methods developed in ancient Greece: the parallax, which was applied to stars for the first time in the mid-19th century. In this review, different techniques of geometrical astrometry applied to various stellar and cosmological (megamaser) objects are presented. They consist of parallax measurements from ground-based equipment or from space missions, but also of the study of binary stars or, as we shall see, of binary systems in distant extragalactic sources using radio telescopes. The Gaia mission will be presented in the context of stellar physics and galactic structure, because this key space mission in astronomy will bring a breakthrough in our understanding of stars, galaxies and the Universe, in their nature and evolution with time. Measuring the distance to a star is the starting point for an unbiased description of its physics and the estimation of its fundamental parameters, such as its age. Applying these studies to candles such as the Cepheids will impact our large-distance studies and the calibration of other candles. The text is constructed as follows: after introducing the parallax concept and its measurement, we briefly present the Gaia satellite, which will be the base catalogue of stellar astronomy in the near future. Cepheids are discussed just after, to demonstrate the state of the art in distance measurements in the Universe with these variable stars, with the objective of a 1% error in distances that could be applied to our closest

  2. Statistical Methods for Unusual Count Data

    Guthrie, Katherine A.; Gammill, Hilary S.; Kamper-Jørgensen, Mads

    2016-01-01

    microchimerism data present challenges for statistical analysis, including a skewed distribution, excess zero values, and occasional large values. Methods for comparing microchimerism levels across groups while controlling for covariates are not well established. We compared statistical models for quantitative...... microchimerism values, applied to simulated data sets and 2 observed data sets, to make recommendations for analytic practice. Modeling the level of quantitative microchimerism as a rate via Poisson or negative binomial model with the rate of detection defined as a count of microchimerism genome equivalents per...
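
    The modeling choice the abstract describes, treating skewed, zero-heavy counts as a rate via Poisson or negative binomial models, can be sketched generically. The snippet below uses statsmodels on simulated overdispersed counts; all variable names are mine, and it is an illustration of the model family, not the authors' analysis.

```python
# Sketch: Poisson vs. negative binomial GLM fits for overdispersed,
# zero-heavy count data (simulated; illustrative only).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n_obs = 200
group = rng.integers(0, 2, size=n_obs)                 # covariate of interest
counts = rng.negative_binomial(n=1, p=0.5 / (1 + group))  # overdispersed data
X = sm.add_constant(group)

poisson_fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
nb_fit = sm.GLM(counts, X, family=sm.families.NegativeBinomial()).fit()
print("Poisson AIC:", poisson_fit.aic, " NB AIC:", nb_fit.aic)
```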

  3. Method and computer program product for maintenance and modernization backlogging

    Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M

    2013-02-19

    According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
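
    The patent's stated relationship is a simple sum, which can be transcribed directly; the function and sample numbers below are mine, added only to make the formula concrete.

```python
# Direct transcription of the claimed relationship (names and values are
# illustrative, not from the patent text).
def future_facility_conditions(maintenance_cost: float,
                               modernization_factor: float,
                               backlog_factor: float) -> float:
    """Future facility conditions for one time period: the time-period-specific
    maintenance cost plus modernization factor plus backlog factor."""
    return maintenance_cost + modernization_factor + backlog_factor

print(future_facility_conditions(1.2e6, 3.5e5, 8.0e4))
```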

  4. [Watsu: a modern method in physiotherapy, body regeneration, and sports].

    Weber-Nowakowska, Katarzyna; Gebska, Magdalena; Zyzniewska-Banaszak, Ewelina

    2013-01-01

    Progress in existing methods of physiotherapy and body regeneration and introduction of new methods has made it possible to precisely select the techniques according to patient needs. The modern therapist is capable of improving the physical and mental condition of the patient. Watsu helps the therapist eliminate symptoms from the locomotor system and reach the psychic sphere at the same time.

  5. The role of nonequilibrium thermo-mechanical statistics in modern technologies and industrial processes: an overview

    Rodrigues, Clóves G.; Silva, Antônio A. P.; Silva, Carlos A. B.; Vasconcellos, Áurea R.; Ramos, J. Galvão; Luzzi, Roberto

    2010-01-01

    The notable present-day development of all modern technology, fundamental for the progress and well-being of world society, imposes a great deal of stress on the realm of basic physics, more precisely on thermo-statistics. We face situations in electronics and optoelectronics involving physical-chemical systems far removed from equilibrium, where ultrafast (on the pico- and femtosecond scale) and non-linear processes are present. Further, we need to be aware of the rapid unfolding of nano-te...

  6. The statistical process control methods - SPC

    Floreková Ľubica

    1998-03-01

    Full Text Available Methods of statistical quality evaluation – SPC (item 20 of the ISO 9000-series quality control documentation system) for various processes, products and services belong among the basic quality-control methods that enable us to analyse and compare data pertaining to various quantitative parameters. They also enable us, based on the latter, to propose suitable interventions with the aim of improving these processes, products and services. The theoretical basis and applicability of the principles of: - cause-and-effect diagnostics, - Pareto analysis and the Lorenz curve, - number distributions and frequency curves of random variable distributions, - Shewhart control charts, are presented in the contribution.
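
    The last item in that list, the Shewhart chart, reduces to a short computation. The sketch below derives the usual 3-sigma control limits from simulated subgroup data (ignoring the small-sample c4 correction for brevity); it illustrates the general SPC idea, not the paper's own worked example.

```python
# Minimal Shewhart X-bar chart computation on simulated subgroup data.
import numpy as np

rng = np.random.default_rng(3)
subgroups = rng.normal(loc=10.0, scale=0.2, size=(25, 5))  # 25 samples of 5

xbar = subgroups.mean(axis=1)                    # subgroup means
grand_mean = xbar.mean()                         # centre line
sigma_within = subgroups.std(axis=1, ddof=1).mean()  # avg within-subgroup s
ucl = grand_mean + 3 * sigma_within / np.sqrt(subgroups.shape[1])
lcl = grand_mean - 3 * sigma_within / np.sqrt(subgroups.shape[1])
print(f"CL={grand_mean:.3f}  UCL={ucl:.3f}  LCL={lcl:.3f}")
print("out-of-control points:", np.where((xbar > ucl) | (xbar < lcl))[0])
```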

  7. Statistical methods towards more efficient infiltration measurements.

    Franz, T; Krebs, P

    2006-01-01

    A comprehensive knowledge of the infiltration situation in a catchment is required for operation and maintenance. Due to the high expenditures involved, an optimisation of the necessary measurement campaigns is essential. Methods based on multivariate statistics were developed to improve the information yield of measurements by identifying appropriate gauge locations. The methods allow a high degree of freedom with respect to data requirements. They were successfully tested on real and artificial data. For suitable catchments, it is estimated that the optimisation potential amounts to up to a 30% accuracy improvement compared to non-optimised gauge distributions. Besides this, a correlation between independent reach parameters and dependent infiltration rates could be identified, which is not dominated by the groundwater head.

  8. Modern Statistical Methods in Oceanography: A Hierarchical Perspective

    Wikle, Christopher K.; Milliff, Ralph F.; Herbei, Radu; Leeds, William B.

    2013-01-01

    Processes in ocean physics, air-sea interaction and ocean biogeochemistry span enormous ranges in spatial and temporal scales, that is, from molecular to planetary and from seconds to millennia. Identifying and implementing sustainable human practices depend critically on our understandings of key aspects of ocean physics and ecology within these scale ranges. The set of all ocean data is distorted such that three- and four-dimensional (i.e., time-dependent) in situ data are very sparse, whil...

  9. Statistical trend analysis methods for temporal phenomena

    Lehtinen, E.; Pulkkinen, U.; Poern, K.

    1997-04-01

    We consider point events occurring in a random way in time. In many applications the pattern of occurrence is of intrinsic interest as indicating a trend or some other systematic feature in the rate of occurrence. The purpose of this report is to survey briefly different statistical trend analysis methods and illustrate their applicability to temporal phenomena in particular. The trend testing of point events is usually seen as the testing of the hypotheses concerning the intensity of the occurrence of events. When the intensity function is parametrized, the testing of trend is a typical parametric testing problem. In industrial applications the operational experience generally does not suggest any specified model and method in advance. Therefore, and particularly, if the Poisson process assumption is very questionable, it is desirable to apply tests that are valid for a wide variety of possible processes. The alternative approach for trend testing is to use some non-parametric procedure. In this report we have presented four non-parametric tests: The Cox-Stuart test, the Wilcoxon signed ranks test, the Mann test, and the exponential ordered scores test. In addition to the classical parametric and non-parametric approaches we have also considered the Bayesian trend analysis. First we discuss a Bayesian model, which is based on a power law intensity model. The Bayesian statistical inferences are based on the analysis of the posterior distribution of the trend parameters, and the probability of trend is immediately seen from these distributions. We applied some of the methods discussed in an example case. It should be noted, that this report is a feasibility study rather than a scientific evaluation of statistical methods, and the examples can only be seen as demonstrations of the methods
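
    One of the four non-parametric tests the report names, the Cox-Stuart trend test, is simple enough to sketch in a few lines: pair each observation in the first half of the series with its counterpart in the second half and apply a sign test. The implementation below is a generic illustration (on simulated inter-event times), not the report's own code.

```python
# Hedged sketch of the Cox-Stuart trend test on inter-event times.
import numpy as np
from scipy.stats import binomtest

def cox_stuart(series):
    """Two-sided Cox-Stuart trend test: sign test on second-half minus
    first-half pairwise differences (ties dropped)."""
    series = np.asarray(series, dtype=float)
    half = len(series) // 2
    diffs = series[-half:] - series[:half]
    diffs = diffs[diffs != 0]
    n_pos = int((diffs > 0).sum())
    return binomtest(n_pos, n=len(diffs), p=0.5).pvalue

# Simulated event times from a constant-rate process (so no true trend).
times = np.cumsum(np.random.default_rng(5).exponential(1.0, 40))
print("two-sided p-value:", cox_stuart(np.diff(times)))
```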

  11. Mathematical methods in quantum and statistical mechanics

    Fishman, L.

    1977-01-01

    The mathematical structure and closed-form solutions pertaining to several physical problems in quantum and statistical mechanics are examined in some detail. The J-matrix method, introduced previously for s-wave scattering and based upon well-established Hilbert space theory and related generalized integral transformation techniques, is extended to treat the l-th partial wave kinetic energy and Coulomb Hamiltonians within the context of square integrable (L²), Laguerre (Slater), and oscillator (Gaussian) basis sets. The theory of relaxation in statistical mechanics within the context of the theory of linear integro-differential equations of the Master Equation type and their corresponding Markov processes is examined. Several topics of a mathematical nature concerning various computational aspects of the L² approach to quantum scattering theory are discussed

  12. Statistical methods for assessment of blend homogeneity

    Madsen, Camilla

    2002-01-01

    In this thesis the use of various statistical methods to address some of the problems related to assessment of the homogeneity of powder blends in tablet production is discussed. It is not straightforward to assess the homogeneity of a powder blend. The reason is partly that in bulk materials......, it is shown how to set up parametric acceptance criteria for the batch that give high confidence that future samples will, with a probability larger than a specified value, pass the USP three-class criteria. Properties and robustness of proposed changes to the USP test for content uniformity are investigated...

  13. Identifying User Profiles from Statistical Grouping Methods

    Francisco Kelsen de Oliveira

    2018-02-01

    Full Text Available This research aimed to group users into subgroups according to their levels of knowledge about technology. Statistical hierarchical and non-hierarchical clustering methods were studied, compared, and used to create the subgroups based on the similarities of users' technology skill levels. The research sample consisted of teachers who answered online questionnaires about their skills in using software and hardware with an educational bias. The statistical grouping methods were applied and showed possible groupings of the users, and the analysis of these groups allowed the common characteristics among the individuals of each subgroup to be identified. It was therefore possible to define two subgroups of users, one skilled with technology and another with little skill, and the partial results of the research showed that the two main grouping algorithms agreed with 92% similarity in forming the group of users skilled with technology and the group with little skill, confirming the accuracy of the techniques for discriminating between individuals.
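
    The two families of methods the article compares, hierarchical and non-hierarchical clustering, can be sketched side by side. The snippet below uses made-up Likert-style skill scores and scikit-learn; it illustrates the comparison generically and is not the article's analysis.

```python
# Sketch: hierarchical vs. non-hierarchical clustering of simulated
# skill-survey scores, with the agreement between the two partitions.
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering

rng = np.random.default_rng(11)
skilled = rng.normal(4.0, 0.5, size=(30, 6))   # 6 Likert-style items
novice = rng.normal(2.0, 0.5, size=(30, 6))
scores = np.vstack([skilled, novice])

kmeans_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
hier_labels = AgglomerativeClustering(n_clusters=2).fit_predict(scores)

# Labels are arbitrary, so take the better of the two alignments.
agreement = max(np.mean(kmeans_labels == hier_labels),
                np.mean(kmeans_labels != hier_labels))
print(f"agreement between methods: {agreement:.0%}")
```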

  14. Statistical sampling method for releasing decontaminated vehicles

    Lively, J.W.; Ware, J.A.

    1996-01-01

    Earth moving vehicles (e.g., dump trucks, belly dumps) commonly haul radiologically contaminated materials from a site being remediated to a disposal site. Traditionally, each vehicle must be surveyed before being released. The logistical difficulties of implementing the traditional approach on a large scale demand that an alternative be devised. A statistical method (MIL-STD-105E, "Sampling Procedures and Tables for Inspection by Attributes") for assessing product quality from a continuous process was adapted to the vehicle decontamination process. This method produced a sampling scheme that automatically compensates and accommodates fluctuating batch sizes and changing conditions without the need to modify or rectify the sampling scheme in the field. Vehicles are randomly selected (sampled) upon completion of the decontamination process to be surveyed for residual radioactive surface contamination. The frequency of sampling is based on the expected number of vehicles passing through the decontamination process in a given period and the confidence level desired. This process has been successfully used for 1 year at the former uranium mill site in Monticello, Utah (a CERCLA regulated clean-up site). The method forces improvement in the quality of the decontamination process and results in a lower likelihood that vehicles exceeding the surface contamination standards are offered for survey. Implementation of this statistical sampling method on Monticello Projects has resulted in more efficient processing of vehicles through decontamination and radiological release, saved hundreds of hours of processing time, provided a high level of confidence that release limits are met, and improved the radiological cleanliness of vehicles leaving the controlled site
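
    The attribute-sampling logic behind MIL-STD-105E can be illustrated with an operating-characteristic calculation. The standard itself tabulates the sample size n and acceptance number c; the plan in the sketch below is a made-up example, not the one used at Monticello.

```python
# Sketch: probability of accepting a batch under a binomial attribute
# sampling plan (illustrative n and c; MIL-STD-105E tabulates real plans).
from scipy.stats import binom

n, c = 32, 1  # survey 32 vehicles, accept the batch if at most 1 fails
for p in (0.01, 0.02, 0.05, 0.10):  # true fraction of contaminated vehicles
    p_accept = binom.cdf(c, n, p)
    print(f"defect rate {p:>4.0%} -> P(accept batch) = {p_accept:.3f}")
```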

  15. Modern Methods of Voice Authentication in Mobile Devices

    Vladimir Leonovich Evseev

    2016-03-01

    Full Text Available Modern methods of voice authentication in mobile devices are considered. Estimates of the probabilities of errors of the first and second kind are proposed for multimodal methods of voice authentication. The advantage of multimodal multivariate methods over authentication carried out in several stages is that they are one-stage, which means convenience for customers. Further development of multimodal authentication methods will build on the significantly increased computing power of mobile devices, the growing number and improved accuracy of sensors built into mobile devices, and improved signal processing algorithms.

  16. Statistical Software for State Space Methods

    Jacques J. F. Commandeur

    2011-05-01

    Full Text Available In this paper we review the state space approach to time series analysis and establish the notation that is adopted in this special volume of the Journal of Statistical Software. We first provide some background on the history of state space methods for the analysis of time series. This is followed by a concise overview of linear Gaussian state space analysis including the modelling framework and appropriate estimation methods. We discuss the important class of unobserved component models which incorporate a trend, a seasonal, a cycle, and fixed explanatory and intervention variables for the univariate and multivariate analysis of time series. We continue the discussion by presenting methods for the computation of different estimates for the unobserved state vector: filtering, prediction, and smoothing. Estimation approaches for the other parameters in the model are also considered. Next, we discuss how the estimation procedures can be used for constructing confidence intervals, detecting outlier observations and structural breaks, and testing model assumptions of residual independence, homoscedasticity, and normality. We then show how ARIMA and ARIMA components models fit in the state space framework to time series analysis. We also provide a basic introduction for non-Gaussian state space models. Finally, we present an overview of the software tools currently available for the analysis of time series with state space methods as they are discussed in the other contributions to this special volume.
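
    A minimal concrete instance of the unobserved-components models described above is the local level model: a random-walk trend observed with noise, estimated by the Kalman filter and smoother. The sketch below uses statsmodels on simulated data; it is a generic illustration, not code from the special volume.

```python
# Sketch: a local level (unobserved components) model fitted with
# statsmodels' state space machinery on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
level = np.cumsum(rng.normal(0, 0.1, 200))   # random-walk trend
y = level + rng.normal(0, 0.5, 200)          # noisy observations

model = sm.tsa.UnobservedComponents(y, level="local level")
result = model.fit(disp=False)
smoothed_level = result.smoothed_state[0]    # Kalman-smoothed trend
print(result.summary().tables[1])
```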

  17. Application of pedagogy reflective in statistical methods course and practicum statistical methods

    Julie, Hongki

    2017-08-01

    The subjects Elementary Statistics, Statistical Methods, and Statistical Methods Practicum aim to equip students of Mathematics Education with descriptive and inferential statistics. Understanding descriptive and inferential statistics is important for students in the Mathematics Education Department, especially for those whose final task involves quantitative research. In quantitative research, students are required to present and describe quantitative data in an appropriate manner, to draw conclusions from their quantitative data, and to establish relationships between the independent and dependent variables defined in their research. In fact, when students carried out their final projects involving quantitative research, it was not rare to find them making mistakes in the steps of drawing conclusions and errors in choosing the hypothesis-testing procedure; as a result, they reached incorrect conclusions. This is a fatal mistake for those doing quantitative research. Several things were gained from the implementation of reflective pedagogy in the teaching-learning process of the Statistical Methods and Statistical Methods Practicum courses, namely: 1. Twenty-two students passed the course and one student did not. 2. The highest grade achieved was an A, earned by 18 students. 3. According to all the students, the course allowed them to develop their critical stance and to build care for each other through the learning process. 4. All students agreed that, through the learning process they underwent in the course, they could build care for each other.

  18. The application of statistical methods to assess economic assets

    D. V. Dianov

    2017-01-01

    out precisely in the boundary of the typological group to which the object is identified. The rationale for the comprehensive application of statistical methods in the implementation of the cost and comparative approaches to the assessment of economic assets, and its practical component, are a primary result of this research. Methodological developments in assessment activities are not enough, under modern conditions of market development and the current scientific and technical level, for the large-scale evaluation of all available material resources of the economy and their total potential. The application of the mathematical-statistical apparatus is therefore an objective necessity for obtaining general indicators of the size of the national wealth. In conclusion, we can mention the methodical approaches and the building of model algorithms for the application of statistical methods in solving scientific and practical problems, depending on the typological identification of the valued objects. It is premature to talk about formalizing the application of statistical methods to the point where their results could be transformed into standard reporting. This requires, at a minimum, resolving the question of a census of fixed assets at the level of the regions and subjects of the Russian Federation.

  19. Introduction to the basic concepts of modern physics special relativity, quantum and statistical physics

    Becchi, Carlo Maria

    2016-01-01

    This is the third edition of a well-received textbook on modern physics theory. This book provides an elementary but rigorous and self-contained presentation of the simplest theoretical framework that will meet the needs of undergraduate students. In addition, a number of examples of relevant applications and an appropriate list of solved problems are provided.Apart from a substantial extension of the proposed problems, the new edition provides more detailed discussion on Lorentz transformations and their group properties, a deeper treatment of quantum mechanics in a central potential, and a closer comparison of statistical mechanics in classical and in quantum physics. The first part of the book is devoted to special relativity, with a particular focus on space-time relativity and relativistic kinematics. The second part deals with Schrödinger's formulation of quantum mechanics. The presentation concerns mainly one-dimensional problems, but some three-dimensional examples are discussed in detail. The third...

  20. The Monte Carlo method: the method of statistical trials

    Shreider, YuA

    1966-01-01

    The Monte Carlo Method: The Method of Statistical Trials is a systematic account of the fundamental concepts and techniques of the Monte Carlo method, together with its range of applications. Some of these applications include the computation of definite integrals, neutron physics, and in the investigation of servicing processes. This volume is comprised of seven chapters and begins with an overview of the basic features of the Monte Carlo method and typical examples of its application to simple problems in computational mathematics. The next chapter examines the computation of multi-dimensio

  1. On two methods of statistical image analysis

    Missimer, J; Knorr, U; Maguire, RP; Herzog, H; Seitz, RJ; Tellman, L; Leenders, K.L.

    1999-01-01

    The computerized brain atlas (CBA) and statistical parametric mapping (SPM) are two procedures for voxel-based statistical evaluation of PET activation studies. Each includes spatial standardization of image volumes, computation of a statistic, and evaluation of its significance. In addition,

  2. International Conference on Modern Mathematical Methods and High Performance Computing in Science and Technology

    Srivastava, HM; Venturino, Ezio; Resch, Michael; Gupta, Vijay

    2016-01-01

    The book discusses important results in modern mathematical models and high performance computing, such as applied operations research, simulation of operations, statistical modeling and applications, invisibility regions and regular meta-materials, unmanned vehicles, modern radar techniques/SAR imaging, satellite remote sensing, coding, and robotic systems. Furthermore, it is valuable as a reference work and as a basis for further study and research. All contributing authors are respected academicians, scientists and researchers from around the globe. All the papers were presented at the international conference on Modern Mathematical Methods and High Performance Computing in Science & Technology (M3HPCST 2015), held at Raj Kumar Goel Institute of Technology, Ghaziabad, India, from 27–29 December 2015, and peer-reviewed by international experts. The conference provided an exceptional platform for leading researchers, academicians, developers, engineers and technocrats from a broad range of disciplines ...

  3. Bayesian statistical methods and their application in probabilistic simulation models

    Sergio Iannazzo

    2007-03-01

    Full Text Available Bayesian statistical methods are facing a rapidly growing level of interest and acceptance in the field of health economics. The reasons for this success are probably to be found in the theoretical foundations of the discipline, which make these techniques more appealing for decision analysis. To this should be added modern IT progress, which has produced several flexible and powerful statistical software frameworks. Among them, probably one of the most notable is the BUGS language project and its standalone application for MS Windows, WinBUGS. The scope of this paper is to introduce the subject and to show some interesting applications of WinBUGS in developing complex economic models based on Markov chains. The advantages of this approach lie in the elegance of the code produced and in its capability to easily develop probabilistic simulations. Moreover, an example of the integration of Bayesian inference models in a Markov model is shown. This last feature lets the analyst conduct statistical analyses on the available sources of evidence and exploit them directly as inputs to the economic model.
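    The pattern the abstract describes can be illustrated without WinBUGS. The following minimal sketch (ours, not the author's BUGS code; the evidence counts and model horizon are invented) shows a Beta-Binomial Bayesian update of a transition probability, whose posterior draws are then propagated through a simple two-state Markov cohort model:

        # Minimal sketch in Python rather than BUGS; all numbers are hypothetical.
        import numpy as np

        rng = np.random.default_rng(42)

        events, patients = 12, 100                           # hypothetical trial evidence
        a_post, b_post = 1 + events, 1 + patients - events   # Beta(1,1) prior -> Beta posterior

        # Propagate posterior uncertainty through a two-state (alive/dead) Markov model.
        p = rng.beta(a_post, b_post, size=5000)   # posterior draws of the annual death risk
        alive = np.ones_like(p)
        life_years = np.zeros_like(p)
        for _ in range(10):                       # ten annual cycles
            life_years += alive
            alive *= 1.0 - p                      # survivors entering the next cycle

        lo, hi = np.percentile(life_years, [2.5, 97.5])
        print(f"expected life-years: {life_years.mean():.2f} (95% CrI {lo:.2f}-{hi:.2f})")

    Because every economic output is computed per posterior draw, the simulation is probabilistic by construction, which is exactly the advantage the abstract attributes to the Bayesian approach.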

  4. Statistical methods for astronomical data analysis

    Chattopadhyay, Asis Kumar

    2014-01-01

    This book introduces “Astrostatistics” as a subject in its own right with rewarding examples, including work by the authors with galaxy and Gamma Ray Burst data to engage the reader. This includes a comprehensive blending of Astrophysics and Statistics. The first chapter’s coverage of preliminary concepts and terminologies for astronomical phenomenon will appeal to both Statistics and Astrophysics readers as helpful context. Statistics concepts covered in the book provide a methodological framework. A unique feature is the inclusion of different possible sources of astronomical data, as well as software packages for converting the raw data into appropriate forms for data analysis. Readers can then use the appropriate statistical packages for their particular data analysis needs. The ideas of statistical inference discussed in the book help readers determine how to apply statistical tests. The authors cover different applications of statistical techniques already developed or specifically introduced for ...

  5. Seasonal UK Drought Forecasting using Statistical Methods

    Richardson, Doug; Fowler, Hayley; Kilsby, Chris; Serinaldi, Francesco

    2016-04-01

    In the UK, drought is a recurrent feature of the climate with potentially large impacts on public water supply. Water companies' ability to mitigate the impacts of drought by managing diminishing availability depends on forward planning, and it would be extremely valuable to improve forecasts of drought on monthly to seasonal time scales. By focusing on statistical forecasting methods, this research aims to provide techniques that are simpler, faster and computationally cheaper than physically based models. In general, statistical forecasting is done by relating the variable of interest (some hydro-meteorological variable such as rainfall or streamflow, or a drought index) to one or more predictors via some formal dependence. These predictors are generally antecedent values of the response variable or external factors such as teleconnections. A candidate model class is Generalised Additive Models for Location, Scale and Shape (GAMLSS). GAMLSS is a very flexible class allowing for more general distribution functions (e.g. highly skewed and/or kurtotic distributions) and the modelling of not just the location parameter but also the scale and shape parameters. Additionally, GAMLSS permits the forecasting of an entire distribution, allowing the output to be assessed in probabilistic terms rather than simply as the mean and confidence intervals. Exploratory analysis of the relationship between long-memory processes (e.g. large-scale atmospheric circulation patterns, sea surface temperatures and soil moisture content) and drought should result in the identification of suitable predictors to be included in the forecasting model, and should further our understanding of the drivers of UK drought.
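    For readers less familiar with GAMLSS, its structure can be stated in one display (standard GAMLSS notation, assumed here rather than quoted from the abstract): the response follows a four-parameter distribution, and each parameter has its own link function and additive predictor, which is what makes forecasting an entire distribution possible.

        % GAMLSS structure (standard notation; the x_{j,t} stand in for the
        % antecedent values and teleconnection indices mentioned above)
        \[
          y_t \sim \mathcal{D}(\mu_t, \sigma_t, \nu_t, \tau_t), \qquad
          g_k(\theta_{k,t}) = \beta_{k,0} + \textstyle\sum_j s_{k,j}(x_{j,t}),
          \quad \theta_k \in \{\mu, \sigma, \nu, \tau\}.
        \]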

  6. Introduction to modern theoretical physics. Volume II. Quantum theory and statistical physics

    Harris, E.G.

    1975-01-01

    The topics discussed include the history and principles, some solvable problems, and symmetry in quantum mechanics, interference phenomena, approximation methods, some applications of nonrelativistic quantum mechanics, relativistic wave equations, quantum theory of radiation, second quantization, elementary particles and their interactions, thermodynamics, equilibrium statistical mechanics and its applications, the kinetic theory of gases, and collective phenomena

  7. Modern methods to improve the accuracy in fast neutron dosimetry

    Baers, B.; Karnani, H.; Seren, T.

    1985-01-01

    In order to improve the quality of fast neutron dose estimates at the reactor pressure vessel (PV), some modern methods are presented. In addition to basic principles, some error-reduction procedures are also presented, based on the combined use of relative measurements, direct sample taking from the pressure vessel, and the use of iron and niobium as dosimeters. The influence of large systematic errors can be significantly reduced by carrying out relative measurements. This report also presents the successful use of niobium as a dosimeter through destructive treatment of PV samples. (author)

  8. Modern leadership and management methods for development organizations

    Samosudova Natalia V.

    2017-01-01

    Full Text Available The following article represents an overview of the basic theoretical concepts of leadership and management in the framework of the organization. The main scientific approaches to leadership are described in conjunction with various leadership styles and their correlation with different levels of effectiveness of the organization's activity. Certain characteristics applicable to leaders and managers are mentioned. The attitude and obligations of a modern construction project manager are discussed, along with the challenges the construction industry faces these days. Ideas about methods of complex analysis for further research, identifying leadership tactics and their impact on the success of the development organization, are suggested.

  9. Modern Methods for Cost Management in Construction Enterprises

    Mesároš Peter

    2015-06-01

    Full Text Available Cost management should be seen as an essential function of enterprises performing their activities in the current market environment. One of the main factors affecting the level of achieved profit and a favourable market position is the cost structure. A company's ability to obtain necessary and reliable information on its own costs, to process it, and to manage costs effectively is crucial for achieving success. This study focuses on cost management and the use of modern cost management methods in construction enterprises. The aim of this paper is to identify approaches to cost management in Slovak construction enterprises, based on our own empirical research.

  10. The statistical mind in modern society. The Netherlands 1850-1940. Volume II: statistics and scientific work

    Stamhuis, I.H.; Klep, P.M.M.; Maarseveen, J.G.S.J. van

    2008-01-01

    In the period 1850-1940 statistics developed as a new combination of theory and practice. A wide range of phenomena were looked at in a novel way and this statistical mindset had a pervasive influence in contemporary society. This development of statistics is closely interlinked with the process of

  11. Applying contemporary statistical techniques

    Wilcox, Rand R

    2003-01-01

    Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.* Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods* Covers the latest developments on multiple comparisons * Includes recent advanc

  12. METHODS OF TRAINING OF MODERN AIRCRAFT FLIGHT CREWS FOR INFLIGHT ABNORMAL CIRCUMSTANCES

    Yurii Hryshchenko

    2017-03-01

    Full Text Available Purpose: The purpose of this article is the theoretical justification of existing methods, and the development of new methods, for training the crews of modern aircraft for in-flight abnormal circumstances. Methods: The article uses research methods of engineering psychology, mathematical statistics and the analysis of correlation functions. Results: The problem statement is illustrated by two accidents involving aircraft with modern avionics, in which the pilot made a sharp movement of the control wheel during a go-around, leading to a sharp dive from which recovery was impossible. It was shown that the anti-stress training methods developed make it possible to train a human operator to prevent such events. It is suggested that the theoretical problem of optimizing the flight on final approach, considering the human factor, be solved using analysis of the autocorrelation function. Conclusions: Methods of teaching the counteraction of factor overlaps should additionally be implemented into the training course using complex modern flight simulators. It is sufficient to analyze the autocorrelation function of a single pitch-angle curve to detect amplification of the pilot's integral-differential motor dynamic stereotype.
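    As a purely illustrative companion to the proposed autocorrelation analysis (our sketch; the synthetic signal and sampling rate are invented, not taken from the study), the empirical autocorrelation of a recorded pitch-angle curve could be computed as follows:

        import numpy as np

        def autocorr(x, max_lag):
            """Biased sample autocorrelation of x for lags 0..max_lag."""
            x = np.asarray(x, dtype=float) - np.mean(x)
            var = np.dot(x, x) / len(x)
            return np.array([np.dot(x[: len(x) - k], x[k:]) / (len(x) * var)
                             for k in range(max_lag + 1)])

        # Synthetic stand-in for a pitch-angle recording: 60 s sampled at 10 Hz.
        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 60.0, 600)
        pitch = 2.0 * np.sin(2 * np.pi * 0.2 * t) + rng.normal(0.0, 0.5, t.size)

        acf = autocorr(pitch, max_lag=100)
        print("ACF at lags 0-4:", np.round(acf[:5], 3))

    A slowly decaying, strongly oscillating ACF would indicate the kind of reinforced periodic control movements the authors associate with an amplified motor dynamic stereotype.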

  13. Modern acupuncture-like stimulation methods: a literature review

    Min-Ho Jun

    2015-12-01

    Full Text Available Acupuncture therapy has been proved to be effective for diverse diseases, symptoms, and conditions in numerous clinical trials. The growing popularity of acupuncture therapy has triggered the development of modern acupuncture-like stimulation devices (ASDs), which are equivalent or superior to manual acupuncture with respect to safety, decreased risk of infection, and facilitation of clinical trials. Here, we aim to summarize the research on modern ASDs, with a focus on featured devices undergoing active research and their effectiveness and target symptoms, along with annual publication rates. We searched the popular electronic databases Medline, PubMed, the Cochrane Library, and Web of Science, and analyzed English-language studies on humans. Thereby, a total of 728 studies were identified, of which 195 studies met our inclusion criteria. Electrical stimulators were found to be the earliest and most widely studied devices (133 articles), followed by laser (44 articles), magnetic (16 articles), and ultrasound (2 articles) stimulators. A total of 114 studies used randomized controlled trials, and 109 studies reported therapeutic benefits. The majority of the studies (32%) focused on analgesia and pain-relief effects, followed by effects on brain activity (16%). All types of the reviewed ASDs were associated with increasing annual publication trends; specifically, the annual growth in publications regarding noninvasive stimulation methods was more rapid than that regarding invasive methods. Based on this observation, we anticipate that the noninvasive or minimally invasive ASDs will become more popular in acupuncture therapy.

  14. Modern methods of studying surfaces and their application to glasses

    Rauschenbach, B.; Haehnert, M.

    1977-05-01

    In this work, modern methods for the study of solid surfaces and their application to glasses are demonstrated. Study of the interaction of ions, electrons and photons with the glass surface provides information about the composition of the surface and its structure on an atomic scale. A qualitative analysis of a surface can be made with the aid of Auger electron spectroscopy (AES) and electron spectroscopy for chemical analysis (ESCA), and with ion scattering (ISS and RBS) and secondary ion mass spectrometry (SIMS). The structure of a surface can be studied by means of ion scattering and low-energy electron diffraction (LEED), and the topography of a surface by means of scanning electron microscopy (SEM). Ellipsometry is generally confined to measuring the thickness of very thin layers. The application of these methods to glass surfaces is demonstrated in a series of examples. (author)

  15. Fibonacci’s Computation Methods vs Modern Algorithms

    Ernesto Burattini

    2013-12-01

    Full Text Available In this paper we discuss some computational procedures given by Leonardo Pisano Fibonacci in his famous Liber Abaci, and we propose their translation into a modern computer language (C++). Among others, we describe the method of "cross" multiplication, evaluate its computational complexity in algorithmic terms, and show the output of a C++ code that traces the development of the method applied to the product of two integers. In a similar way we show the operations performed on fractions introduced by Fibonacci. The possibility of reproducing Fibonacci's various computational procedures on a computer made it possible to identify some calculation errors present in the different versions of the original text.
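    The "cross" multiplication of two two-digit numbers is simple enough to reconstruct here (our own sketch, not the authors' C++ listing): multiply the units digits, then the two "crossed" digit pairs, then the tens digits, and recombine by place value.

        def cross_multiply(a, b):
            """Product of two two-digit integers via Fibonacci's cross method."""
            a1, a0 = divmod(a, 10)
            b1, b0 = divmod(b, 10)
            units = a0 * b0            # units x units
            cross = a1 * b0 + a0 * b1  # the "cross" step
            tens = a1 * b1             # tens x tens
            return 100 * tens + 10 * cross + units

        assert cross_multiply(47, 86) == 47 * 86   # 3200 + 800 + 42 = 4042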

  16. Statistical learning methods in high-energy and astrophysics analysis

    Zimmermann, J. [Forschungszentrum Juelich GmbH, Zentrallabor fuer Elektronik, 52425 Juelich (Germany) and Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)]. E-mail: zimmerm@mppmu.mpg.de; Kiesling, C. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)

    2004-11-21

    We discuss several popular statistical learning methods used in high-energy and astrophysics analysis. After a short motivation for statistical learning, we present the most popular algorithms and discuss several examples from current research in particle and astrophysics. The statistical learning methods are compared with each other and with standard methods for the respective application.

  17. Development of a Research Methods and Statistics Concept Inventory

    Veilleux, Jennifer C.; Chapman, Kate M.

    2017-01-01

    Research methods and statistics are core courses in the undergraduate psychology major. To assess learning outcomes, it would be useful to have a measure that assesses research methods and statistical literacy beyond course grades. In two studies, we developed and provided initial validation results for a research methods and statistical knowledge…

  19. INNOVATIVE METHODS OF TEACHING HISTORY AT MODERN UNIVERSITIES

    A. Yu. Suslov

    2017-01-01

    Full Text Available Introduction. As a discipline, History holds a specific place among the humanities subjects of higher-education programmes, regardless of university speciality. History plays an important role in forming a citizen and in developing the critical thinking of a personality as an element of a common culture. However, new federal standards require a drastic reduction of the classroom hours allotted to the History course for students of non-humanitarian specialties and, at the same time, an enhancement of the content of the discipline (its reorientation from the History of Russia towards World History). Therefore, History programmes and courses demand up-to-date approaches, methods and didactic means to support the formation of a holistic worldview in future experts. The aim of the article is to consider the application of innovative methods in teaching History in higher education, taking modernization processes into consideration. Methodology and research methods. The research is based on activity and competence-based approaches. The methods of analysis and synthesis of the academic literature on the research topic were used; methods of reflection on and generalization of the teaching activities of the Department of Humanitarian Disciplines of the Kazan National Research Technological University were applied as well. Results and scientific novelty. A modern view of historical education is proposed as a means of forming students' systems thinking, shaping students' ideas about the world historical process, the mission of Russia in this process, and the evolution of Russia as a part of modern civilization. It is stated that a university History course is designed not only to give students strong subject knowledge, but also to create axiological orientations and abilities on the basis of the analysis of historical collisions and the objective and subjective factors of society's development. Moreover

  20. A chronicle of permutation statistical methods 1920–2000, and beyond

    Berry, Kenneth J; Mielke Jr , Paul W

    2014-01-01

    The focus of this book is on the birth and historical development of permutation statistical methods from the early 1920s to the near present. Beginning with the seminal contributions of R.A. Fisher, E.J.G. Pitman, and others in the 1920s and 1930s, permutation statistical methods were initially introduced to validate the assumptions of classical statistical methods. Permutation methods have advantages over classical methods in that they are optimal for small data sets and non-random samples, are data-dependent, and are free of distributional assumptions. Permutation probability values may be exact, or estimated via moment- or resampling-approximation procedures. Because permutation methods are inherently computationally-intensive, the evolution of computers and computing technology that made modern permutation methods possible accompanies the historical narrative. Permutation analogs of many well-known statistical tests are presented in a historical context, including multiple correlation and regression, ana...
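    The elementary construction underlying all of these methods fits in a few lines (a generic sketch with invented data, not an example from the book): a test statistic is recomputed under random relabelings of the pooled observations, and the p-value is the fraction of permutations at least as extreme as the observed statistic.

        import numpy as np

        rng = np.random.default_rng(1)
        x = np.array([4.1, 5.0, 6.2, 5.5, 4.8])   # group 1 (invented data)
        y = np.array([5.9, 6.4, 7.1, 6.8])        # group 2 (invented data)

        observed = y.mean() - x.mean()
        pooled = np.concatenate([x, y])
        n_perm, extreme = 10000, 0
        for _ in range(n_perm):
            rng.shuffle(pooled)                    # random relabeling of all observations
            extreme += pooled[len(x):].mean() - pooled[:len(x)].mean() >= observed
        print(f"one-sided permutation p-value: {(extreme + 1) / (n_perm + 1):.4f}")

    With small samples like these, the full permutation distribution could even be enumerated exactly, which is the sense in which permutation p-values may be exact rather than approximated.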

  1. The "Diagnostic and Statistical Manual of Mental Disorders" as a Major Form of Dehumanization in the Modern World

    Gambrill, Eileen

    2014-01-01

    The "Diagnostic and Statistical Manual of Mental Disorders" (DSM) is one of the most successful technologies in modern times. In spite of well-argued critiques, the DSM and the idea of "mental illness" on which it is based flourish, with ever more (mis)behaviors labeled as brain diseases. Problems in living and related distress…

  2. Statistical models and methods for reliability and survival analysis

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Huber -Carol, Catherine; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  3. Modern methodic of power cardio training in students’ physical education

    A.Yu. Osipov

    2016-12-01

    Full Text Available Purpose: to significantly increase students' physical condition and health through the application of a modern power-cardio training methodology. Material: 120 students (60 boys and 60 girls) participated in the research. The age of those tested was 19 years. The research took one year. We used a methodology of power and functional impact on the trainees' organism (HOT IRON): a system of physical exercises with weights (mini-barbells) performed to the accompaniment of specially selected music. Results: we showed the advantages of power-cardio and fitness training in improving students' health and in eliminating obesity. Control tests showed that experimental-group students achieved significantly higher physical indicators. Boys demonstrated increased physical strength and general endurance; girls had significantly better indicators of physical strength, flexibility and general endurance. The increase in control-group students' body mass can be explained by insufficient physical activity in trainings conducted under the traditional program. Conclusions: training students with the power-cardio methodology using HOT IRON exercises facilitates the development of strength and endurance in boys, and of strength, flexibility and endurance in girls. Besides, it was found that such systems of exercises facilitate the normalization of boys' body mass and the correction of girls' body constitution.

  4. Pair Programming as a Modern Method of Teaching Computer Science

    Irena Nančovska Šerbec

    2008-10-01

    Full Text Available At the Faculty of Education, University of Ljubljana, we educate future computer science teachers. Besides didactical, pedagogical, mathematical and other interdisciplinary knowledge, students gain knowledge and skills in programming that are crucial for computer science teachers. For all courses, the main emphasis is the absorption of professional competences related to the teaching profession and the programming profile. The latter are selected according to the well-known ACM Computing Curricula document. Professional knowledge is therefore associated and combined with teaching knowledge and skills. In the paper we present how to achieve competences related to programming by using different didactical models (the semiotic ladder, the taxonomy of cognitive objectives, problem solving) and the modern teaching method of "pair programming". Pair programming differs from standard methods (individual work, seminars, projects, etc.). It belongs to extreme programming as a discipline of software development and is known to have positive effects on teaching a first programming language. We have experimentally observed pair programming in the introductory programming course. The paper presents and analyzes the results of using this method: aspects of satisfaction during programming and the level of knowledge gained. The results are in general positive and demonstrate the promise of this teaching method.

  5. Statistical methods of estimating mining costs

    Long, K.R.

    2011-01-01

    Until it was defunded in 1995, the U.S. Bureau of Mines maintained a Cost Estimating System (CES) for prefeasibility-type economic evaluations of mineral deposits and estimating costs at producing and non-producing mines. This system had a significant role in mineral resource assessments to estimate costs of developing and operating known mineral deposits and predicted undiscovered deposits. For legal reasons, the U.S. Geological Survey cannot update and maintain CES. Instead, statistical tools are under development to estimate mining costs from basic properties of mineral deposits such as tonnage, grade, mineralogy, depth, strip ratio, distance from infrastructure, rock strength, and work index. The first step was to reestimate "Taylor's Rule" which relates operating rate to available ore tonnage. The second step was to estimate statistical models of capital and operating costs for open pit porphyry copper mines with flotation concentrators. For a sample of 27 proposed porphyry copper projects, capital costs can be estimated from three variables: mineral processing rate, strip ratio, and distance from nearest railroad before mine construction began. Of all the variables tested, operating costs were found to be significantly correlated only with strip ratio.
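    For orientation, the commonly quoted textbook form of Taylor's Rule is shown below (our statement of the classical rule; the coefficients re-estimated in this study are not reproduced here):

        % Classical Taylor's Rule (approximate; T is expected ore tonnage in tonnes)
        \[
          \text{mine life (years)} \;\approx\; 0.2 \times T^{0.25},
        \]

    so a deposit of 10^8 t suggests a life of roughly 0.2 x 100 = 20 years, with the operating rate then following as tonnage divided by the operating days over that life.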

  6. Innovative statistical methods for public health data

    Wilson, Jeffrey

    2015-01-01

    The book brings together experts working in public health and multi-disciplinary areas to present recent issues in statistical methodological development and their applications. This timely book will impact model development and data analyses of public health research across a wide spectrum of analysis. Data and software used in the studies are available for the reader to replicate the models and outcomes. The fifteen chapters range in focus from techniques for dealing with missing data with Bayesian estimation, health surveillance and population definition and implications in applied latent class analysis, to multiple comparison and meta-analysis in public health data. Researchers in biomedical and public health research will find this book to be a useful reference, and it can be used in graduate level classes.

  7. Methods of contemporary mathematical statistical physics

    2009-01-01

    This volume presents a collection of courses introducing the reader to the recent progress with attention being paid to laying solid grounds and developing various basic tools. An introductory chapter on lattice spin models is useful as a background for other lectures of the collection. The topics include new results on phase transitions for gradient lattice models (with introduction to the techniques of the reflection positivity), stochastic geometry reformulation of classical and quantum Ising models, the localization/delocalization transition for directed polymers. A general rigorous framework for theory of metastability is presented and particular applications in the context of Glauber and Kawasaki dynamics of lattice models are discussed. A pedagogical account of several recently discussed topics in nonequilibrium statistical mechanics with an emphasis on general principles is followed by a discussion of kinetically constrained spin models that are reflecting important peculiar features of glassy dynamic...

  8. MSD Recombination Method in Statistical Machine Translation

    Gros, Jerneja Žganec

    2008-11-01

    Freely available tools and language resources were used to build the VoiceTRAN statistical machine translation (SMT) system. Various configuration variations of the system are presented and evaluated. The VoiceTRAN SMT system outperformed the baseline conventional rule-based MT system in all English-Slovenian in-domain test setups. To further increase the generalization capability of the translation model for lower-coverage out-of-domain test sentences, an "MSD-recombination" approach was proposed. This approach not only allows a better exploitation of conventional translation models, but also performs well in the more demanding translation direction; that is, into a highly inflectional language. Using this approach in the out-of-domain setup of the English-Slovenian JRC-ACQUIS task, we have achieved significant improvements in translation quality.

  9. Statistical physics

    Sadovskii, Michael V

    2012-01-01

    This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

  10. Statistical methods for categorical data analysis

    Powers, Daniel

    2008-01-01

    This book provides a comprehensive introduction to methods and models for categorical data analysis and their applications in social science research. Companion website also available, at https://webspace.utexas.edu/dpowers/www/

  11. An overview of recent developments in genomics and associated statistical methods.

    Bickel, Peter J; Brown, James B; Huang, Haiyan; Li, Qunhua

    2009-11-13

    The landscape of genomics has changed drastically in the last two decades. Increasingly inexpensive sequencing has shifted the primary focus from the acquisition of biological sequences to the study of biological function. Assays have been developed to study many intricacies of biological systems, and publicly available databases have given rise to integrative analyses that combine information from many sources to draw complex conclusions. Such research was the focus of the recent workshop at the Isaac Newton Institute, 'High dimensional statistics in biology'. Many computational methods from modern genomics and related disciplines were presented and discussed. Using, as much as possible, the material from these talks, we give an overview of modern genomics: from the essential assays that make data-generation possible, to the statistical methods that yield meaningful inference. We point to current analytical challenges, where novel methods, or novel applications of extant methods, are presently needed.

  12. Statistical methods and computing for big data

    Wang, Chun; Chen, Ming-Hui; Schifano, Elizabeth; Wu, Jing

    2016-01-01

    Big data are data on a massive scale in terms of volume, intensity, and complexity that exceed the capacity of standard analytic tools. They present opportunities as well as challenges to statisticians. The role of computational statisticians in scientific discovery from big data analyses has been under-recognized even by peer statisticians. This article summarizes recent methodological and software developments in statistics that address the big data challenges. Methodologies are grouped into three classes: subsampling-based, divide and conquer, and online updating for stream data. As a new contribution, the online updating approach is extended to variable selection with commonly used criteria, and their performances are assessed in a simulation study with stream data. Software packages are summarized with a focus on the open-source R and its packages, covering recent tools that help break the barriers of computer memory and computing power. Some of the tools are illustrated in a case study with a logistic regression for the chance of airline delay. PMID:27695593
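    The online-updating idea for stream data can be illustrated for ordinary least squares (a generic sketch with simulated blocks, not the article's code): only the accumulated sufficient statistics X'X and X'y are kept, so each data block can be discarded after it arrives.

        import numpy as np

        rng = np.random.default_rng(7)
        p = 3
        XtX = np.zeros((p, p))                       # running X'X
        Xty = np.zeros(p)                            # running X'y

        for _ in range(50):                          # 50 arriving data blocks
            X = rng.normal(size=(200, p))
            y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(0.0, 1.0, 200)
            XtX += X.T @ X
            Xty += X.T @ y                           # the block can now be discarded

        beta_hat = np.linalg.solve(XtX, Xty)         # identical to full-data OLS
        print("streaming OLS estimate:", np.round(beta_hat, 3))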

  14. Simple statistical methods for software engineering data and patterns

    Pandian, C Ravindranath

    2015-01-01

    Although there are countless books on statistics, few are dedicated to the application of statistical methods to software engineering. Simple Statistical Methods for Software Engineering: Data and Patterns fills that void. Instead of delving into overly complex statistics, the book details simpler solutions that are just as effective and connect with the intuition of problem solvers.Sharing valuable insights into software engineering problems and solutions, the book not only explains the required statistical methods, but also provides many examples, review questions, and case studies that prov

  15. Cratering statistics on asteroids: Methods and perspectives

    Chapman, C.

    2014-07-01

    Crater size-frequency distributions (SFDs) on the surfaces of solid-surfaced bodies in the solar system have provided valuable insights about planetary surface processes and about impactor populations since the first spacecraft images were obtained in the 1960s. They can be used to determine relative age differences between surficial units, to obtain absolute model ages if the impactor flux and scaling laws are understood, to assess various endogenic planetary or asteroidal processes that degrade craters or resurface units, as well as assess changes in impactor populations across the solar system and/or with time. The first asteroid SFDs were measured from Galileo images of Gaspra and Ida (cf. Chapman 2002). Despite the superficial simplicity of these studies, they are fraught with many difficulties, including confusion by secondary and/or endogenic cratering and poorly understood aspects of varying target properties (including regoliths, ejecta blankets, and nearly-zero-g rubble piles), widely varying attributes of impactors, and a host of methodological problems including recognizability of degraded craters, which is affected by illumination angle and by the "personal equations" of analysts. Indeed, controlled studies (Robbins et al. 2014) demonstrate crater-density differences of a factor of two or more between experienced crater counters. These inherent difficulties have been especially apparent in divergent results for Vesta from different members of the Dawn Science Team (cf. Russell et al. 2013). Indeed, they have been exacerbated by misuse of a widely available tool (Craterstats: hrscview.fu-berlin.de/craterstats.html), which incorrectly computes error bars for proper interpretation of cumulative SFDs, resulting in derived model ages specified to three significant figures and interpretations of statistically insignificant kinks. They are further exacerbated, and for other small-body crater SFDs analyzed by the Berlin group, by stubbornly adopting

  16. Statistical methods for handling incomplete data

    Kim, Jae Kwang

    2013-01-01

    ""… this book nicely blends the theoretical material and its application through examples, and will be of interest to students and researchers as a textbook or a reference book. Extensive coverage of recent advances in handling missing data provides resources and guidelines for researchers and practitioners in implementing the methods in new settings. … I plan to use this as a textbook for my teaching and highly recommend it.""-Biometrics, September 2014

  17. Teaching biology through statistics: application of statistical methods in genetics and zoology courses.

    Colon-Berlingeri, Migdalisel; Burrowes, Patricia A

    2011-01-01

    Incorporation of mathematics into biology curricula is critical to underscore for undergraduate students the relevance of mathematics to most fields of biology and the usefulness of developing quantitative process skills demanded in modern biology. At our institution, we have made significant changes to better integrate mathematics into the undergraduate biology curriculum. The curricular revision included changes in the suggested course sequence, addition of statistics and precalculus as prerequisites to core science courses, and incorporating interdisciplinary (math-biology) learning activities in genetics and zoology courses. In this article, we describe the activities developed for these two courses and the assessment tools used to measure the learning that took place with respect to biology and statistics. We distinguished the effectiveness of these learning opportunities in helping students improve their understanding of the math and statistical concepts addressed and, more importantly, their ability to apply them to solve a biological problem. We also identified areas that need emphasis in both biology and mathematics courses. In light of our observations, we recommend best practices that biology and mathematics academic departments can implement to train undergraduates for the demands of modern biology.

  18. Chinese cyber espionage: a complementary method to aid PLA modernization

    Ellis, Jamie M.

    2015-01-01

    Approved for public release; distribution is unlimited. In 2013, Mandiant published a report linking one People’s Liberation Army (PLA) unit to the virtual exploitation of 11 modern U.S. military platforms. In the last two decades, Chinese cyber espionage has cultivated a significant reputation in cyberspace for its high-volume, illicit exploitation of defense technology. At the same time, the PLA has also rapidly modernized its naval, fighter jet, and air defense technologies. This thesis ...

  19. Statistical methods and their applications in constructional engineering

    1977-01-01

    An introduction to the basic terms of statistics is followed by a discussion of elements of probability theory, customary discrete and continuous distributions, simulation methods, statistical supporting-framework dynamics, and a cost-benefit analysis of the methods introduced. (RW) [de

  20. Online Statistics Labs in MSW Research Methods Courses: Reducing Reluctance toward Statistics

    Elliott, William; Choi, Eunhee; Friedline, Terri

    2013-01-01

    This article presents results from an evaluation of an online statistics lab as part of a foundations research methods course for master's-level social work students. The article discusses factors that contribute to an environment in social work that fosters attitudes of reluctance toward learning and teaching statistics in research methods…

  1. Application of a nonparametric statistical method to DNBR limit calculation

    Dong Bo; Kuang Bo; Zhu Xuenong

    2013-01-01

    Background: Nonparametric statistical methods are a kind of statistical inference that does not depend on an assumed distribution; they calculate tolerance limits at a given probability level and confidence through sampling methods. The DNBR margin is an important parameter of NPP (Nuclear Power Plant) design, representing the safety level of the NPP. Purpose and Methods: This paper uses a nonparametric statistical method based on the Wilks formula and the VIPER-01 subchannel analysis code to calculate the DNBR design limits (DL) of a 300 MW NPP during a complete loss-of-flow accident, and compares them with the DL of DNBR obtained by means of ITDP to quantify the DNBR margin. Results: The results indicate that this method gains 2.96% more DNBR margin than that obtained by the ITDP methodology. Conclusions: Because of the reduced conservatism in the analysis process, the nonparametric statistical method can provide a greater DNBR margin, and the increased DNBR margin benefits the upgrading of the core refuelling scheme. (authors)
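    The first-order, one-sided Wilks criterion used in analyses of this kind is short enough to state here (the generic formula, not the paper's coupling to VIPER-01): with n random code runs, the largest observed output bounds the 95th percentile with 95% confidence once 1 - 0.95**n >= 0.95.

        import math

        def wilks_n(coverage=0.95, confidence=0.95):
            """Smallest n such that the sample maximum is a one-sided
            coverage/confidence tolerance bound (first-order Wilks formula)."""
            return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

        print(wilks_n())   # 59 code runs for the classic 95/95 criterion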

  2. Statistical Diversions

    Petocz, Peter; Sowey, Eric

    2008-01-01

    In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…

  4. The estimation of the measurement results with using statistical methods

    Velychko, O.; Gordiyenko, T.

    2015-02-01

    A number of international standards and guides describe various statistical methods that apply to the management, control and improvement of processes for the purpose of analysing technical measurement results. An analysis of international standards and guides on statistical methods for the estimation of measurement results, with recommendations for their application in laboratories, is described. For this analysis, cause-and-effect Ishikawa diagrams concerning the application of statistical methods to the estimation of measurement results were constructed.

  5. Application of modern tests for stationarity to single-trial MEG data: transferring powerful statistical tools from econometrics to neuroscience.

    Kipiński, Lech; König, Reinhard; Sielużycki, Cezary; Kordecki, Wojciech

    2011-10-01

    Stationarity is a crucial yet rarely questioned assumption in the analysis of time series of magneto- (MEG) or electroencephalography (EEG). One key drawback of the commonly used tests for stationarity of encephalographic time series is the fact that conclusions on stationarity are only indirectly inferred either from the Gaussianity (e.g. the Shapiro-Wilk test or Kolmogorov-Smirnov test) or the randomness of the time series and the absence of trend using very simple time-series models (e.g. the sign and trend tests by Bendat and Piersol). We present a novel approach to the analysis of the stationarity of MEG and EEG time series by applying modern statistical methods which were specifically developed in econometrics to verify the hypothesis that a time series is stationary. We report our findings of the application of three different tests of stationarity--the Kwiatkowski-Phillips-Schmidt-Shin (KPSS) test for trend or mean stationarity, the Phillips-Perron (PP) test for the presence of a unit root and the White test for homoscedasticity--on an illustrative set of MEG data. For five stimulation sessions, we found already for short epochs of duration of 250 and 500 ms that, although the majority of the studied epochs of single MEG trials were usually mean-stationary (KPSS test and PP test), they were classified as nonstationary due to their heteroscedasticity (White test). We also observed that the presence of external auditory stimulation did not significantly affect the findings regarding the stationarity of the data. We conclude that the combination of these tests allows a refined analysis of the stationarity of MEG and EEG time series.
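    Two of the three tests are directly available in Python's statsmodels, while a Phillips-Perron implementation can be found in the separate arch package. The sketch below (our illustration on synthetic data, not the authors' MEG pipeline) applies the KPSS and White tests to one 250-sample epoch:

        import numpy as np
        import statsmodels.api as sm
        from statsmodels.tsa.stattools import kpss
        from statsmodels.stats.diagnostic import het_white

        rng = np.random.default_rng(3)
        epoch = rng.normal(0.0, 1.0, 250)          # stand-in for one single-trial epoch

        kpss_stat, kpss_p, _, _ = kpss(epoch, regression="c", nlags="auto")

        # White's test operates on regression residuals; regress the epoch on time.
        X = sm.add_constant(np.arange(len(epoch), dtype=float))
        resid = sm.OLS(epoch, X).fit().resid
        white_stat, white_p, _, _ = het_white(resid, X)

        print(f"KPSS p = {kpss_p:.3f} (mean stationarity), "
              f"White p = {white_p:.3f} (homoscedasticity)")

    A small White p-value on an otherwise mean-stationary epoch would reproduce the paper's central observation: heteroscedasticity, not drift in the mean, is what makes short MEG epochs nonstationary.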

  6. Computational methods in the pricing and risk management of modern financial derivatives

    Deutsch, Hans-Peter

    1999-09-01

    In the last 20 years, modern finance has developed into a complex, mathematically challenging field. Very complicated risks exist in financial markets which need very advanced methods to measure and/or model them. The financial instruments invented by the market participants to trade these risks, the so-called derivatives, are usually even more complicated than the risks themselves and also sometimes generate new risks. Topics like random walks, stochastic differential equations, martingale measures, time series analysis, implied correlations, etc. are of common use in the field. This is why more and more people with a science background, such as physicists, mathematicians, or computer scientists, are entering the field of finance. The measurement and management of all these risks is the key to the continuing success of banks. This talk gives insight into today's common methods of modern market risk management such as variance-covariance, historical simulation, Monte Carlo, “Greek” ratios, etc., including the statistical concepts on which they are based. Derivatives are at the same time the main reason for and the most effective means of conducting risk management. As such, they stand at the beginning and end of risk management. The valuation of derivatives and structured financial instruments is therefore the prerequisite, the condition sine qua non, for all risk management. This talk introduces some of the important valuation methods used in modern derivatives pricing such as present value, Black-Scholes, binomial trees, Monte Carlo, etc. In summary, this talk highlights an area outside physics where there is a lot of interesting work to do, especially for physicists. Or as one of our consultants said: The fascinating thing about this job is that Arthur Andersen hired me not ALTHOUGH I am a physicist but BECAUSE I am a physicist.

  7. Statistical methods for accurately determining criticality code bias

    Trumble, E.F.; Kimball, K.D.

    1997-01-01

    A system of statistically treating validation calculations for the purpose of determining computer code bias is provided in this paper. The following statistical treatments are described: weighted regression analysis, lower tolerance limit, lower tolerance band, and lower confidence band. These methods meet the criticality code validation requirements of ANS 8.1. 8 refs., 5 figs., 4 tabs
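    Of the treatments listed, the lower tolerance limit is the easiest to sketch (the standard normal-theory formula, not the paper's weighted-regression variant; the bias data are invented): LTL = mean - k*s, with the factor k taken from the noncentral t distribution.

        import numpy as np
        from scipy.stats import nct, norm

        def lower_tolerance_limit(x, coverage=0.95, confidence=0.95):
            """One-sided lower tolerance limit covering `coverage` of the
            population with probability `confidence` (normal theory)."""
            n = len(x)
            k = nct.ppf(confidence, df=n - 1,
                        nc=norm.ppf(coverage) * np.sqrt(n)) / np.sqrt(n)
            return np.mean(x) - k * np.std(x, ddof=1)

        biases = np.random.default_rng(5).normal(0.002, 0.004, 30)  # invented keff biases
        print(f"95/95 lower tolerance limit: {lower_tolerance_limit(biases):.5f}")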

  8. Review of Statistical Learning Methods in Integrated Omics Studies (An Integrated Information Science).

    Zeng, Irene Sui Lan; Lumley, Thomas

    2018-01-01

    Integrated omics is becoming a new channel for investigating the complex molecular system in modern biological science and sets a foundation for systematic learning for precision medicine. The statistical/machine learning methods that have emerged in the past decade for integrated omics are not only innovative but also multidisciplinary, with integrated knowledge in biology, medicine, statistics, machine learning, and artificial intelligence. Here, we review the nontrivial classes of learning methods from the statistical aspects and streamline these learning methods within the statistical learning framework. The intriguing findings from the review are that the methods used are generalizable to other disciplines with complex systematic structure, and that integrated omics is part of an integrated information science which has collated and integrated different types of information for inferences and decision making. We review the statistical learning methods of exploratory and supervised learning from 42 publications. We also discuss the strengths and limitations of the extended principal component analysis, cluster analysis, network analysis, and regression methods. Statistical techniques such as penalization for sparsity induction when there are fewer observations than the number of features, and using a Bayesian approach when there is prior knowledge to be integrated, are also included in the commentary. For the completeness of the review, a table of currently available software and packages from 23 publications for omics is summarized in the appendix.

  9. [Applications of mathematical statistics methods on compatibility researches of traditional Chinese medicines formulae].

    Mai, Lan-Yin; Li, Yi-Xuan; Chen, Yong; Xie, Zhen; Li, Jie; Zhong, Ming-Yu

    2014-05-01

    The compatibility of traditional Chinese medicine (TCM) formulae, containing enormous information, is a complex component system. Applications of mathematical statistics methods to compatibility research on traditional Chinese medicine formulae have great significance for promoting the modernization of traditional Chinese medicine and for improving the clinical efficacy and optimization of formulae. As a tool for quantitative analysis, data inference and the exploration of the inherent rules of substances, mathematical statistics can be used to reveal, qualitatively and quantitatively, the working mechanisms of the compatibility of traditional Chinese medicine formulae. By reviewing studies based on the application of mathematical statistics methods, this paper summarizes the field from the perspectives of dosage optimization, efficacy, changes of chemical components, and the rules of incompatibility and contraindication of formulae, and will provide references for further studying and revealing the working mechanisms and connotations of traditional Chinese medicines.

  10. Statistical power analysis a simple and general model for traditional and modern hypothesis tests

    Murphy, Kevin R; Wolach, Allen

    2014-01-01

    Noted for its accessible approach, this text applies the latest approaches of power analysis to both null hypothesis and minimum-effect testing using the same basic unified model. Through the use of a few simple procedures and examples, the authors show readers with little expertise in statistical analysis how to obtain the values needed to carry out the power analysis for their research. Illustrations of how these analyses work and how they can be used to choose the appropriate criterion for defining statistically significant outcomes are sprinkled throughout. The book presents a simple and g

  11. Use of modern and reversible methods of contraception ...

    This six-year descriptive cross-sectional study, running from 1 January 1999 to 10 May 2005 at the obstetrics and gynaecology clinic of Donka Hospital of the Conakry CHU (university teaching hospital), established the level of use of modern contraceptive methods and allowed recommendations to be drawn up for the improvement of the ...

  12. Statistics of Monte Carlo methods used in radiation transport calculation

    Datta, D.

    2009-01-01

    Radiation transport calculations can be carried out using either deterministic or statistical methods. Radiation transport calculation based on statistical methods is the basic theme of the Monte Carlo method. The aim of this lecture is to describe the fundamental statistics required to build the foundations of the Monte Carlo technique for radiation transport calculation. The lecture notes are organized as follows. Section (1) introduces basic Monte Carlo and its classification within the respective field. Section (2) describes random sampling methods, a key component of Monte Carlo radiation transport calculation. Section (3) presents the statistical uncertainty of Monte Carlo estimates. Section (4) briefly describes the importance of variance reduction techniques when sampling particles, such as photons or neutrons, in the process of radiation transport
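    The statistical uncertainty of Section (3) follows from the fact that a Monte Carlo estimate is a sample mean. The sketch below (a definite integral rather than a transport tally; entirely illustrative) shows the estimate together with its one-sigma standard error, which shrinks like 1/sqrt(N):

        import numpy as np

        rng = np.random.default_rng(11)
        N = 100_000
        u = rng.uniform(0.0, 1.0, N)
        f = np.exp(-u * u)                       # integrand exp(-x^2) on [0, 1]

        estimate = f.mean()                      # Monte Carlo estimate of the integral
        std_err = f.std(ddof=1) / np.sqrt(N)     # statistical uncertainty of the estimate
        print(f"integral ~ {estimate:.5f} +/- {std_err:.5f}")   # true value ~ 0.74682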

  13. Statistical methods for evaluating the attainment of cleanup standards

    Gilbert, R.O.; Simpson, J.C.

    1992-12-01

    This document is the third volume in a series of volumes sponsored by the US Environmental Protection Agency (EPA), Statistical Policy Branch, that provide statistical methods for evaluating the attainment of cleanup standards at Superfund sites. Volume 1 (USEPA 1989a) provides sampling designs and tests for evaluating attainment of risk-based standards for soils and solid media. Volume 2 (USEPA 1992) provides designs and tests for evaluating attainment of risk-based standards for groundwater. The purpose of this third volume is to provide statistical procedures for designing sampling programs and conducting statistical tests to determine whether pollution parameters in remediated soils and solid media at Superfund sites attain site-specific reference-based standards. This document is written for individuals who may not have extensive training or experience with statistical methods. The intended audience includes EPA regional remedial project managers, Superfund-site potentially responsible parties, state environmental protection agencies, and contractors for these groups.
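    A minimal version of such an attainment test (a generic one-sample t-test against a fixed standard with invented data, not the document's reference-based procedures) looks as follows: attainment is demonstrated when the hypothesis "mean concentration >= standard" can be rejected.

        import numpy as np
        from scipy import stats

        standard = 50.0                           # hypothetical cleanup standard, mg/kg
        samples = np.array([38.2, 44.1, 41.5, 47.3, 39.8, 43.0, 45.6, 40.9])

        t_stat, p_value = stats.ttest_1samp(samples, popmean=standard,
                                            alternative="less")
        if p_value < 0.05:
            print(f"p = {p_value:.4f}: attainment demonstrated at alpha = 0.05")
        else:
            print(f"p = {p_value:.4f}: attainment not demonstrated")

    Framing non-attainment as the null hypothesis places the burden of proof on the remediated site, which is the conservative convention in this setting.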

  14. Method for statistical data analysis of multivariate observations

    Gnanadesikan, R

    1997-01-01

    A practical guide for multivariate statistical techniques-- now updated and revised In recent years, innovations in computer technology and statistical methodologies have dramatically altered the landscape of multivariate data analysis. This new edition of Methods for Statistical Data Analysis of Multivariate Observations explores current multivariate concepts and techniques while retaining the same practical focus of its predecessor. It integrates methods and data-based interpretations relevant to multivariate analysis in a way that addresses real-world problems arising in many areas of inte

  15. Statistical limitations in functional neuroimaging. I. Non-inferential methods and statistical models.

    Petersson, K M; Nichols, T E; Poline, J B; Holmes, A P

    1999-01-01

    Functional neuroimaging (FNI) provides experimental access to the intact living brain making it possible to study higher cognitive functions in humans. In this review and in a companion paper in this issue, we discuss some common methods used to analyse FNI data. The emphasis in both papers is on assumptions and limitations of the methods reviewed. There are several methods available to analyse FNI data indicating that none is optimal for all purposes. In order to make optimal use of the methods available it is important to know the limits of applicability. For the interpretation of FNI results it is also important to take into account the assumptions, approximations and inherent limitations of the methods used. This paper gives a brief overview over some non-inferential descriptive methods and common statistical models used in FNI. Issues relating to the complex problem of model selection are discussed. In general, proper model selection is a necessary prerequisite for the validity of the subsequent statistical inference. The non-inferential section describes methods that, combined with inspection of parameter estimates and other simple measures, can aid in the process of model selection and verification of assumptions. The section on statistical models covers approaches to global normalization and some aspects of univariate, multivariate, and Bayesian models. Finally, approaches to functional connectivity and effective connectivity are discussed. In the companion paper we review issues related to signal detection and statistical inference. PMID:10466149

  16. Analysis of Statistical Methods Currently used in Toxicology Journals.

    Na, Jihye; Yang, Hyeri; Bae, SeungJin; Lim, Kyung-Min

    2014-09-01

    Statistical methods are frequently used in toxicology, yet it is not clear whether the methods employed by published studies are used consistently and conducted on sound statistical grounds. The purpose of this paper is to describe the statistical methods used in top toxicology journals. More specifically, we sampled 30 papers published in 2014 from Toxicology and Applied Pharmacology, Archives of Toxicology, and Toxicological Sciences and described the methodologies used to provide descriptive and inferential statistics. One hundred thirteen endpoints were observed in those 30 papers, and most studies had sample sizes of less than 10, with the median and the mode being 6 and 3 & 6, respectively. The mean (105/113, 93%) was dominantly used to measure central tendency, and the standard error of the mean (64/113, 57%) and standard deviation (39/113, 34%) were used to measure dispersion, while few studies provided justification for why these methods were selected. Inferential statistics were frequently conducted (93/113, 82%), with one-way ANOVA being the most popular (52/93, 56%), yet few studies conducted either normality or equal-variance tests. These results suggest that more consistent and appropriate use of statistical methods is necessary, which may enhance the role of toxicology in public health.
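
    To make the rarely-reported checks concrete, here is a minimal sketch (not taken from the paper; the group data, the sample size of n = 6 and the scipy-based workflow are illustrative assumptions) of a one-way ANOVA preceded by the normality and equal-variance tests the survey found were seldom conducted:

```python
# Hypothetical toxicology endpoint in three dose groups, n = 6 each
# (the median sample size reported in the survey).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.normal(100, 10, size=6)
dose_low = rng.normal(95, 10, size=6)
dose_high = rng.normal(85, 10, size=6)
groups = [control, dose_low, dose_high]

# Normality check per group (Shapiro-Wilk)
for name, g in zip(["control", "low", "high"], groups):
    w, p = stats.shapiro(g)
    print(f"Shapiro ({name}): p = {p:.3f}")

# Equal-variance check (Levene)
lev, p_lev = stats.levene(*groups)
print(f"Levene: p = {p_lev:.3f}")

# One-way ANOVA, defensible only if the checks above raise no alarm
f, p = stats.f_oneway(*groups)
print(f"one-way ANOVA: F = {f:.2f}, p = {p:.4f}")
```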

  17. Modern statistics for the social and behavioral sciences a practical introduction

    Wilcox, Rand

    2011-01-01

    Relative advantages/disadvantages of various techniques are presented so that the reader can be helped to understand the choices they make on using the techniques. … A considerable number of illustrations are included and the book focuses on using R for its computer software application. … A useful text for … postgraduate students in the social science disciplines. - Susan Starkings, International Statistical Review, 2012. This is an interesting and valuable book … By gathering a mass of results on that topic into a single volume with references, alternative procedures, and supporting software, th

  18. Brief guidelines for methods and statistics in medical research

    Ab Rahman, Jamalludin

    2015-01-01

    This book serves as a practical guide to methods and statistics in medical research. It includes step-by-step instructions on using SPSS software for statistical analysis, as well as relevant examples to help those readers who are new to research in health and medical fields. Simple texts and diagrams are provided to help explain the concepts covered, and print screens for the statistical steps and the SPSS outputs are provided, together with interpretations and examples of how to report on findings. Brief Guidelines for Methods and Statistics in Medical Research offers a valuable quick reference guide for healthcare students and practitioners conducting research in health related fields, written in an accessible style.

  19. Problematic Methods in teaching Modern History. An alternative or a necessity?

    Yohany Peralta Pérez

    2012-06-01

    Full Text Available The article presents the results of a theoretical study on the use of problematic methods in the teaching-learning process of the Modern History course at Rafael Maria de Mendive University of Pinar del Rio. The use of problematic methods in teaching the Modern History course is analysed starting from the definition of method, taking into account the theoretical assumptions of scholars of the subject matter and the advantages and disadvantages of using these methods in the teaching-learning process of the Modern History course.

  20. Quantitative EEG Applying the Statistical Recognition Pattern Method

    Engedal, Knut; Snaedal, Jon; Hoegh, Peter

    2015-01-01

    BACKGROUND/AIM: The aim of this study was to examine the discriminatory power of quantitative EEG (qEEG) applying the statistical pattern recognition (SPR) method to separate Alzheimer's disease (AD) patients from elderly individuals without dementia and from other dementia patients. METHODS...

  1. An Overview of Short-term Statistical Forecasting Methods

    Elias, Russell J.; Montgomery, Douglas C.; Kulahci, Murat

    2006-01-01

    An overview of statistical forecasting methodology is given, focusing on techniques appropriate to short- and medium-term forecasts. Topics include basic definitions and terminology, smoothing methods, ARIMA models, regression methods, dynamic regression models, and transfer functions. Techniques for evaluating and monitoring forecast performance are also summarized.
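
    As a concrete companion to the smoothing-methods topic, here is a minimal sketch (under assumed data and a hypothetical smoothing constant alpha; it is not code from the overview) of simple exponential smoothing, the most basic member of the smoothing family:

```python
# Simple exponential smoothing: each forecast is an exponentially
# weighted average of all past observations.
def simple_exponential_smoothing(series, alpha=0.3):
    """Return one-step-ahead forecasts: forecasts[i] predicts series[i+1],
    and the last element forecasts the period after the sample ends."""
    forecasts = [series[0]]          # initial forecast: the first observation
    for y in series[1:]:
        # F_{t+1} = alpha * y_t + (1 - alpha) * F_t
        forecasts.append(alpha * y + (1 - alpha) * forecasts[-1])
    return forecasts

demand = [112, 118, 132, 129, 121, 135, 148, 148]   # hypothetical series
print(simple_exponential_smoothing(demand))
```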

  2. Hierarchical modelling for the environmental sciences statistical methods and applications

    Clark, James S

    2006-01-01

    New statistical tools are changing the way in which scientists analyze and interpret data and models. Hierarchical Bayes and Markov Chain Monte Carlo methods for analysis provide a consistent framework for inference and prediction where information is heterogeneous and uncertain, processes are complicated, and responses depend on scale. Nowhere are these methods more promising than in the environmental sciences.
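
    Since the hierarchical Bayes machinery described here ultimately rests on Markov chain Monte Carlo, a minimal Metropolis sampler may help fix ideas. This is an illustrative sketch, not the book's code; the one-parameter target (standard normal prior, one normal observation) and the step size are assumptions:

```python
import math
import random

def log_post(theta):
    # Unnormalised log-posterior: N(0, 1) prior times one N(theta, 1)
    # likelihood for an observation y = 1.0; the posterior is N(0.5, 0.5)
    return -0.5 * theta ** 2 - 0.5 * (theta - 1.0) ** 2

def metropolis(n_steps=10_000, step=1.0, seed=42):
    random.seed(seed)
    theta, samples = 0.0, []
    for _ in range(n_steps):
        proposal = theta + random.gauss(0.0, step)
        # Accept with probability min(1, post(proposal) / post(theta))
        if math.log(random.random()) < log_post(proposal) - log_post(theta):
            theta = proposal
        samples.append(theta)
    return samples

draws = metropolis()
print(sum(draws) / len(draws))   # close to the posterior mean of 0.5
```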

  3. Zubarev's Nonequilibrium Statistical Operator Method in the Generalized Statistics of Multiparticle Systems

    Glushak, P. A.; Markiv, B. B.; Tokarchuk, M. V.

    2018-01-01

    We present a generalization of Zubarev's nonequilibrium statistical operator method based on the principle of maximum Renyi entropy. In the framework of this approach, we obtain transport equations for the basic set of parameters of the reduced description of nonequilibrium processes in a classical system of interacting particles using Liouville equations with fractional derivatives. For a classical system of particles in a medium with a fractal structure, we obtain a non-Markovian diffusion equation with fractional spatial derivatives. For a concrete model of the frequency dependence of a memory function, we obtain a generalized Cattaneo-type diffusion equation with the spatial and temporal fractality taken into account. We present a generalization of nonequilibrium thermofield dynamics in Zubarev's nonequilibrium statistical operator method in the framework of Renyi statistics.

  4. Descriptive and inferential statistical methods used in burns research.

    Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars

    2010-05-01

    Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11(22%) were randomised controlled trials, 18(35%) were cohort studies, 11(22%) were case control studies and 11(22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49(96%) articles. Data dispersion was calculated by standard deviation in 30(59%). Standard error of the mean was quoted in 19(37%). The statistical software product was named in 33(65%). Of the 49 articles that used inferential statistics, the tests were named in 47(96%). The 6 most common tests used (Student's t-test (53%), analysis of variance/co-variance (33%), chi-squared test (27%), Wilcoxon & Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43(88%) and the exact significance levels were reported in 28(57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. Advice should be sought from professionals
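
    To illustrate two of the most-used tests from the list above, here is a hedged sketch on a hypothetical 2x2 table (say, complication counts in two treatment groups); it is illustrative only and not drawn from the surveyed articles:

```python
import numpy as np
from scipy import stats

# Hypothetical 2x2 contingency table: rows = treatment groups,
# columns = complication / no complication
table = np.array([[12, 5],
                  [7, 15]])

chi2, p_chi2, dof, expected = stats.chi2_contingency(table)
print(f"chi-squared: chi2 = {chi2:.2f}, dof = {dof}, p = {p_chi2:.3f}")

# Fisher's exact test is the usual fallback when expected counts are small
odds_ratio, p_fisher = stats.fisher_exact(table)
print(f"Fisher's exact: OR = {odds_ratio:.2f}, p = {p_fisher:.3f}")
```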

  5. Statistical-mechanical entropy by the thin-layer method

    Feng, He; Kim, Sung Won

    2003-01-01

    't Hooft first studied the statistical-mechanical entropy of a scalar field in a Schwarzschild black hole background by the brick-wall method and hinted that the statistical-mechanical entropy is the statistical origin of the Bekenstein-Hawking entropy of the black hole. However, according to our viewpoint, the statistical-mechanical entropy is only a quantum correction to the Bekenstein-Hawking entropy of the black hole. The brick-wall method, based on thermal equilibrium at a large scale, cannot be applied to cases out of equilibrium, such as a nonstationary black hole. The statistical-mechanical entropy of a scalar field in a nonstationary black hole background is calculated by the thin-layer method. The condition of local equilibrium near the horizon of the black hole is used as a working postulate and is maintained for a black hole which evaporates slowly enough and whose mass is far greater than the Planck mass. The statistical-mechanical entropy is also proportional to the area of the black hole horizon. The difference from the stationary black hole is that the result relies on a time-dependent cutoff.

  6. Classical and modern numerical analysis theory, methods and practice

    Ackleh, Azmy S; Kearfott, R Baker; Seshaiyer, Padmanabhan

    2009-01-01

    Mathematical Review and Computer Arithmetic: Mathematical Review; Computer Arithmetic; Interval Computations. Numerical Solution of Nonlinear Equations of One Variable: Introduction; Bisection Method; The Fixed Point Method; Newton's Method (Newton-Raphson Method); The Univariate Interval Newton Method; Secant Method and Müller's Method; Aitken Acceleration and Steffensen's Method; Roots of Polynomials; Additional Notes and Summary. Numerical Linear Algebra: Basic Results from Linear Algebra; Normed Linear Spaces; Direct Methods for Solving Linear Systems; Iterative Methods for Solving Linear Systems; The Singular Value Decomposition. Approximation Theory: Introduction; Norms, Projections, Inner Product Spaces, and Orthogonalization in Function Spaces; Polynomial Approximation; Piecewise Polynomial Approximation; Trigonometric Approximation; Rational Approximation; Wavelet Bases; Least Squares Approximation on a Finite Point Set. Eigenvalue-Eigenvector Computation: Basic Results from Linear Algebra; The Power Method; The Inverse Power Method; Deflation T...

  7. Academic Training Lecture: Statistical Methods for Particle Physics

    PH Department

    2012-01-01

    2, 3, 4 and 5 April 2012 Academic Training Lecture  Regular Programme from 11:00 to 12:00 -  Bldg. 222-R-001 - Filtration Plant Statistical Methods for Particle Physics by Glen Cowan (Royal Holloway) The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena.  Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties.  The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.
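
    As a taste of the first lecture topic (statistical tests applied to a search for new phenomena), here is a minimal counting-experiment sketch; the background expectation, the observed count and the Poisson-tail p-value convention are illustrative assumptions, not material from the lectures:

```python
from scipy import stats

b = 3.2        # expected background events (assumed known exactly)
n_obs = 10     # observed event count

# p-value: probability of n_obs or more events from background alone
p_value = stats.poisson.sf(n_obs - 1, b)

# Convert to the equivalent one-sided Gaussian significance Z
z = stats.norm.isf(p_value)
print(f"p = {p_value:.2e}, Z = {z:.2f} sigma")
```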

  8. Methods library of embedded R functions at Statistics Norway

    Øyvind Langsrud

    2017-11-01

    Full Text Available Statistics Norway is modernising its production processes. An important element in this work is a library of functions for statistical computations. In principle, the functions in such a methods library can be programmed in several languages. A modernised production environment demands that these functions can be reused for different statistics products, and that they are embedded within a common IT system. The embedding should be done in such a way that the users of the methods do not need to know the underlying programming language. As a proof of concept, Statistics Norway has now established a methods library offering a limited number of methods for macro-editing, imputation and confidentiality. This is done within an area of municipal statistics with R as the only programming language. This paper presents the details and experiences from this work. The problem of fitting real-world applications to simple and strict standards is discussed and exemplified by the development of solutions to regression imputation and table suppression.

  9. Application of blended learning in teaching statistical methods

    Barbara Dębska

    2012-12-01

    Full Text Available The paper presents the application of a hybrid method (blended learning, linking traditional education with on-line education) to teach selected problems of mathematical statistics. This includes teaching the application of mathematical statistics to evaluate laboratory experimental results. An on-line statistics course was developed to form an integral part of the module 'methods of statistical evaluation of experimental results'. The course complies with the principles outlined in the Polish National Framework of Qualifications with respect to the scope of knowledge, skills and competencies that students should have acquired at course completion. The paper presents the structure of the course and the educational content provided through multimedia lessons made accessible on the Moodle platform. Following courses which used the traditional method of teaching and courses which used the hybrid method, students' test results were compared and discussed to evaluate the effectiveness of the hybrid method of teaching compared to that of the traditional method.

  10. Statistical Methods for Particle Physics (4/4)

    CERN. Geneva

    2012-01-01

    The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena. Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties. The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.

  11. Statistical Methods for Particle Physics (1/4)

    CERN. Geneva

    2012-01-01

    The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena. Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties. The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.

  12. Statistical Methods for Particle Physics (2/4)

    CERN. Geneva

    2012-01-01

    The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena. Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties. The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.

  13. Statistical Methods for Particle Physics (3/4)

    CERN. Geneva

    2012-01-01

    The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena. Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties. The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.

  14. Understanding common statistical methods, Part I: descriptive methods, probability, and continuous data.

    Skinner, Carl G; Patel, Manish M; Thomas, Jerry D; Miller, Michael A

    2011-01-01

    Statistical methods are pervasive in medical research and general medical literature. Understanding general statistical concepts will enhance our ability to critically appraise the current literature and ultimately improve the delivery of patient care. This article intends to provide an overview of the common statistical methods relevant to medicine.

  15. Methods and statistics for combining motif match scores.

    Bailey, T L; Gribskov, M

    1998-01-01

    Position-specific scoring matrices are useful for representing and searching for protein sequence motifs. A sequence family can often be described by a group of one or more motifs, and an effective search must combine the scores for matching a sequence to each of the motifs in the group. We describe three methods for combining match scores and estimating the statistical significance of the combined scores and evaluate the search quality (classification accuracy) and the accuracy of the estimate of statistical significance of each. The three methods are: 1) sum of scores, 2) sum of reduced variates, 3) product of score p-values. We show that method 3) is superior to the other two methods in both regards, and that combining motif scores indeed gives better search accuracy. The MAST sequence homology search algorithm utilizing the product of p-values scoring method is available for interactive use and downloading at URL http://www.sdsc.edu/MEME.
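
    For k independent p-values, the product has a closed-form combined p-value, which suggests how method 3 can be scored. The sketch below uses that standard formula and is only an illustration; whether MAST uses exactly this expression is not stated in the abstract:

```python
import math

def combined_p(pvalues):
    """Combined p-value for the product of k independent uniform p-values:
    P(prod <= x) = x * sum_{i=0}^{k-1} (-ln x)^i / i!"""
    x = math.prod(pvalues)
    log_x = math.log(x)
    return x * sum((-log_x) ** i / math.factorial(i)
                   for i in range(len(pvalues)))

# Hypothetical match p-values of one sequence against three motifs
print(combined_p([0.01, 0.2, 0.05]))   # ~0.0053, stronger than any single match
```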

  16. Advances in Statistical Methods for Substance Abuse Prevention Research

    MacKinnon, David P.; Lockwood, Chondra M.

    2010-01-01

    The paper describes advances in statistical methods for prevention research with a particular focus on substance abuse prevention. Standard analysis methods are extended to the typical research designs and characteristics of the data collected in prevention research. Prevention research often includes longitudinal measurement, clustering of data in units such as schools or clinics, missing data, and categorical as well as continuous outcome variables. Statistical methods to handle these features of prevention data are outlined. Developments in mediation, moderation, and implementation analysis allow for the extraction of more detailed information from a prevention study. Advancements in the interpretation of prevention research results include more widespread calculation of effect size and statistical power, the use of confidence intervals as well as hypothesis testing, detailed causal analysis of research findings, and meta-analysis. The increased availability of statistical software has contributed greatly to the use of new methods in prevention research. It is likely that the Internet will continue to stimulate the development and application of new methods. PMID:12940467

  17. Statistical methods of parameter estimation for deterministically chaotic time series

    Pisarenko, V. F.; Sornette, D.

    2004-03-01

    We discuss the possibility of applying some standard statistical methods (the least-squares method, the maximum likelihood method, and the method of statistical moments for estimation of parameters) to a deterministically chaotic low-dimensional dynamic system (the logistic map) containing observational noise. A “segmentation fitting” maximum likelihood (ML) method is suggested to estimate the structural parameter of the logistic map along with the initial value x1, considered as an additional unknown parameter. The segmentation fitting method, called “piece-wise” ML, is similar in spirit to, but simpler than and with smaller bias than, the “multiple shooting” method previously proposed. Comparisons with different previously proposed techniques on simulated numerical examples give favorable results (at least for the investigated combinations of sample size N and noise level). Besides, unlike some suggested techniques, our method does not require a priori knowledge of the noise variance. We also clarify the nature of the inherent difficulties in the statistical analysis of deterministically chaotic time series and the status of previously proposed Bayesian approaches. We note the trade-off between the need to use a large number of data points in the ML analysis to decrease the bias (to guarantee consistency of the estimation) and the unstable nature of dynamical trajectories with exponentially fast loss of memory of the initial condition. The method of statistical moments for the estimation of the parameter of the logistic map is discussed. This method seems to be the only method whose consistency for deterministically chaotic time series has so far been proved theoretically (not only numerically).
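
    A hedged sketch of the estimation problem follows: it recovers the logistic-map parameter r from a noisy orbit by one-step least squares, which is conceptually close to, but much simpler than, the paper's segmentation-fitting ML method. The series length, noise level and optimizer are assumptions:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
r_true, x = 3.8, 0.3
orbit = []
for _ in range(50):
    x = r_true * x * (1 - x)          # logistic map: x_{n+1} = r x_n (1 - x_n)
    orbit.append(x)
y = np.array(orbit) + rng.normal(0, 0.01, size=50)   # observational noise

def sse(r):
    # One-step prediction errors, conditioning on the noisy observations
    pred = r * y[:-1] * (1 - y[:-1])
    return np.sum((y[1:] - pred) ** 2)

res = minimize_scalar(sse, bounds=(3.5, 4.0), method="bounded")
print(f"estimated r = {res.x:.3f} (true value {r_true})")
```

    Conditioning on the noisy observations is exactly what introduces the bias that the paper's piece-wise ML formulation is designed to reduce.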

  18. Statistical methods with applications to demography and life insurance

    Khmaladze, Estáte V

    2013-01-01

    Suitable for statisticians, mathematicians, actuaries, and students interested in the problems of insurance and analysis of lifetimes, Statistical Methods with Applications to Demography and Life Insurance presents contemporary statistical techniques for analyzing life distributions and life insurance problems. It not only contains traditional material but also incorporates new problems and techniques not discussed in existing actuarial literature. The book mainly focuses on the analysis of an individual life and describes statistical methods based on empirical and related processes. Coverage ranges from analyzing the tails of distributions of lifetimes to modeling population dynamics with migrations. To help readers understand the technical points, the text covers topics such as the Stieltjes, Wiener, and Itô integrals. It also introduces other themes of interest in demography, including mixtures of distributions, analysis of longevity and extreme value theory, and the age structure of a population. In addi...

  19. Landslide Susceptibility Statistical Methods: A Critical and Systematic Literature Review

    Mihir, Monika; Malamud, Bruce; Rossi, Mauro; Reichenbach, Paola; Ardizzone, Francesca

    2014-05-01

    Landslide susceptibility assessment, the subject of this systematic review, is aimed at understanding the spatial probability of slope failures under a set of geomorphological and environmental conditions. It is estimated that about 375 landslides that occur globally each year are fatal, with around 4600 people killed per year. Past studies have brought out the increasing cost of landslide damages, which primarily can be attributed to human occupation and increased human activities in vulnerable environments. Many scientists, to evaluate and reduce landslide risk, have made an effort to efficiently map landslide susceptibility using different statistical methods. In this paper, we do a critical and systematic landslide susceptibility literature review, in terms of the different statistical methods used. For each of a broad set of studies reviewed we note: (i) study geography region and areal extent, (ii) landslide types, (iii) inventory type and temporal period covered, (iv) mapping technique, (v) thematic variables used, (vi) statistical models, (vii) assessment of model skill, (viii) uncertainty assessment methods, (ix) validation methods. We then pulled out broad trends within our review of landslide susceptibility, particularly regarding the statistical methods. We found that the most common statistical methods used in the study of landslide susceptibility include logistic regression, artificial neural network, discriminant analysis and weight of evidence. Although most of the studies we reviewed assessed the model skill, very few assessed model uncertainty. In terms of geographic extent, the largest number of landslide susceptibility zonations were in Turkey, Korea, Spain, Italy and Malaysia. However, there are also many landslides and fatalities in other localities, particularly India, China, Philippines, Nepal and Indonesia, Guatemala, and Pakistan, where there are much fewer landslide susceptibility studies available in the peer-reviewed literature. This

  20. Experiential Approach to Teaching Statistics and Research Methods ...

    Statistics and research methods are among the more demanding topics for students of education to master at both the undergraduate and postgraduate levels. It is our conviction that teaching these topics should be combined with real practical experiences. We discuss an experiential teaching/ learning approach that ...

  1. Application of statistical methods at copper wire manufacturing

    Z. Hajduová

    2009-01-01

    Full Text Available Six Sigma is a method of management that strives for near perfection. The Six Sigma methodology uses data and rigorous statistical analysis to identify defects in a process or product, reduce variability and achieve as close to zero defects as possible. The paper presents the basic information on this methodology.
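
    As a minimal illustration of the arithmetic behind the "near perfection" claim (not from the paper; the defect counts and the conventional 1.5-sigma long-term shift are assumptions), a sigma level can be computed from defects per million opportunities:

```python
from scipy.stats import norm

# Hypothetical inspection data from a copper wire line
defects, units, opportunities = 38, 12_000, 4

dpmo = defects / (units * opportunities) * 1_000_000
sigma_level = norm.isf(dpmo / 1_000_000) + 1.5   # conventional 1.5-sigma shift
print(f"DPMO = {dpmo:.0f}, sigma level ~ {sigma_level:.2f}")
```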

  2. Statistical and Machine Learning forecasting methods: Concerns and ways forward.

    Makridakis, Spyros; Spiliotis, Evangelos; Assimakopoulos, Vassilios

    2018-01-01

    Machine Learning (ML) methods have been proposed in the academic literature as alternatives to statistical ones for time series forecasting. Yet, scant evidence is available about their relative performance in terms of accuracy and computational requirements. The purpose of this paper is to evaluate such performance across multiple forecasting horizons using a large subset of 1045 monthly time series used in the M3 Competition. After comparing the post-sample accuracy of popular ML methods with that of eight traditional statistical ones, we found that the former are dominated across both accuracy measures used and for all forecasting horizons examined. Moreover, we observed that their computational requirements are considerably greater than those of statistical methods. The paper discusses the results, explains why the accuracy of ML models is below that of statistical ones and proposes some possible ways forward. The empirical results found in our research stress the need for objective and unbiased ways to test the performance of forecasting methods that can be achieved through sizable and open competitions allowing meaningful comparisons and definite conclusions.
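
    One of the accuracy measures used in such M-competition comparisons is the symmetric MAPE (sMAPE); the sketch below shows its computation on toy numbers (the forecasts are invented and merely stand in for a statistical and an ML method respectively):

```python
import numpy as np

def smape(actual, forecast):
    """Symmetric mean absolute percentage error, in percent."""
    a = np.asarray(actual, dtype=float)
    f = np.asarray(forecast, dtype=float)
    return 100.0 * np.mean(2.0 * np.abs(f - a) / (np.abs(a) + np.abs(f)))

actual = [112, 118, 132, 129]
stat_forecast = [110, 120, 128, 131]   # e.g. an exponential smoothing output
ml_forecast = [105, 125, 140, 120]     # e.g. a neural network output
print(f"statistical sMAPE = {smape(actual, stat_forecast):.2f}%")
print(f"ML sMAPE          = {smape(actual, ml_forecast):.2f}%")
```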

  3. Computerized statistical analysis with bootstrap method in nuclear medicine

    Zoccarato, O.; Sardina, M.; Zatta, G.; De Agostini, A.; Barbesti, S.; Mana, O.; Tarolo, G.L.

    1988-01-01

    Statistical analysis of data samples involves some hypotheses about the features of the data themselves. The accuracy of these hypotheses can influence the results of statistical inference. Among the new methods of computer-aided statistical analysis, the bootstrap method appears to be one of the most powerful, thanks to its ability to reproduce many artificial samples starting from a single original sample and because it works without hypotheses about the data distribution. The authors applied the bootstrap method to two typical situations in a Nuclear Medicine Department. The determination of the normal range of serum ferritin, as assessed by radioimmunoassay and defined by the mean value ±2 standard deviations, starting from an experimental sample of small size, shows an unacceptable lower limit (ferritin plasma levels below zero). On the contrary, the results obtained by elaborating 5000 bootstrap samples give an interval of values (10.95 ng/ml - 72.87 ng/ml) corresponding to the normal ranges commonly reported. Moreover, the authors applied the bootstrap method in evaluating the possible error associated with the correlation coefficient determined between left ventricular ejection fraction (LVEF) values obtained by first-pass radionuclide angiocardiography with 99mTc and 195mAu. The results obtained indicate a high degree of statistical correlation and give the range of r2 values to be considered acceptable for this type of study
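
    The ferritin example can be mimicked in a few lines. This is a hedged re-creation with synthetic values, and the percentile-based range is one plausible reading of the bootstrap procedure described, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(7)
ferritin = rng.lognormal(mean=3.4, sigma=0.6, size=25)   # skewed small sample

# Conventional range (mean +/- 2 SD): the lower limit can fall below zero
m, s = ferritin.mean(), ferritin.std(ddof=1)
print(f"conventional range: {m - 2*s:.1f} to {m + 2*s:.1f} ng/ml")

# Bootstrap: 5000 resamples with replacement; take percentile limits of
# each resample and average them into a distribution-free normal range
lows, highs = [], []
for _ in range(5000):
    resample = rng.choice(ferritin, size=ferritin.size, replace=True)
    lows.append(np.percentile(resample, 2.5))
    highs.append(np.percentile(resample, 97.5))
print(f"bootstrap range: {np.mean(lows):.1f} to {np.mean(highs):.1f} ng/ml")
```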

  4. Statistical and Machine Learning forecasting methods: Concerns and ways forward

    Makridakis, Spyros; Assimakopoulos, Vassilios

    2018-01-01

    Machine Learning (ML) methods have been proposed in the academic literature as alternatives to statistical ones for time series forecasting. Yet, scant evidence is available about their relative performance in terms of accuracy and computational requirements. The purpose of this paper is to evaluate such performance across multiple forecasting horizons using a large subset of 1045 monthly time series used in the M3 Competition. After comparing the post-sample accuracy of popular ML methods with that of eight traditional statistical ones, we found that the former are dominated across both accuracy measures used and for all forecasting horizons examined. Moreover, we observed that their computational requirements are considerably greater than those of statistical methods. The paper discusses the results, explains why the accuracy of ML models is below that of statistical ones and proposes some possible ways forward. The empirical results found in our research stress the need for objective and unbiased ways to test the performance of forecasting methods that can be achieved through sizable and open competitions allowing meaningful comparisons and definite conclusions. PMID:29584784

  5. Illinois' Forests, 2005: Statistics, Methods, and Quality Assurance

    Susan J. Crocker; Charles J. Barnett; Mark A. Hatfield

    2013-01-01

    The first full annual inventory of Illinois' forests was completed in 2005. This report contains 1) descriptive information on methods, statistics, and quality assurance of data collection, 2) a glossary of terms, 3) tables that summarize quality assurance, and 4) a core set of tabular estimates for a variety of forest resources. A detailed analysis of inventory...

  6. Kansas's forests, 2005: statistics, methods, and quality assurance

    Patrick D. Miles; W. Keith Moser; Charles J. Barnett

    2011-01-01

    The first full annual inventory of Kansas's forests was completed in 2005 after 8,868 plots were selected and 468 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of Kansas inventory is presented...

  7. South Dakota's forests, 2005: statistics, methods, and quality assurance

    Patrick D. Miles; Ronald J. Piva; Charles J. Barnett

    2011-01-01

    The first full annual inventory of South Dakota's forests was completed in 2005 after 8,302 plots were selected and 325 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the South Dakota...

  8. Nebraska's forests, 2005: statistics, methods, and quality assurance

    Patrick D. Miles; Dacia M. Meneguzzo; Charles J. Barnett

    2011-01-01

    The first full annual inventory of Nebraska's forests was completed in 2005 after 8,335 plots were selected and 274 forested plots were visited and measured. This report includes detailed information on forest inventory methods, and data quality estimates. Tables of various important resource statistics are presented. Detailed analysis of the inventory data are...

  9. North Dakota's forests, 2005: statistics, methods, and quality assurance

    Patrick D. Miles; David E. Haugen; Charles J. Barnett

    2011-01-01

    The first full annual inventory of North Dakota's forests was completed in 2005 after 7,622 plots were selected and 164 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the North Dakota...

  10. Peer-Assisted Learning in Research Methods and Statistics

    Stone, Anna; Meade, Claire; Watling, Rosamond

    2012-01-01

    Feedback from students on a Level 1 Research Methods and Statistics module, studied as a core part of a BSc Psychology programme, highlighted demand for additional tutorials to help them to understand basic concepts. Students in their final year of study commonly request work experience to enhance their employability. All students on the Level 1…

  11. A statistical method for 2D facial landmarking

    Dibeklioğlu, H.; Salah, A.A.; Gevers, T.

    2012-01-01

    Many facial-analysis approaches rely on robust and accurate automatic facial landmarking to correctly function. In this paper, we describe a statistical method for automatic facial-landmark localization. Our landmarking relies on a parsimonious mixture model of Gabor wavelet features, computed in

  12. Investigating salt frost scaling by using statistical methods

    Hasholt, Marianne Tange; Clemmensen, Line Katrine Harder

    2010-01-01

    A large data set comprising data for 118 concrete mixes on mix design, air void structure, and the outcome of freeze/thaw testing according to SS 13 72 44 has been analysed by use of statistical methods. The results show that with regard to mix composition, the most important parameter...

  13. Olefination of carbonyl compounds: modern and classical methods

    Korotchenko, V N; Nenajdenko, Valentine G; Balenkova, Elizabeth S [Department of Chemistry, M.V. Lomonosov Moscow State University, Moscow (Russian Federation); Shastin, Aleksey V [Institute of Problems of Chemical Physics, Russian Academy of Sciences, Chernogolovka, Moscow Region (Russian Federation)

    2004-10-31

    The published data on the methods for alkene synthesis by olefination of carbonyl compounds are generalised and systematised. The main attention is given to the use of transition metals and organoelement compounds. The review covers the data on both classical and newly developed methods that are little known to chemists at large.

  14. Olefination of carbonyl compounds: modern and classical methods

    Korotchenko, V. N.; Nenajdenko, Valentine G.; Balenkova, Elizabeth S.; Shastin, Aleksey V.

    2004-10-01

    The published data on the methods for alkene synthesis by olefination of carbonyl compounds are generalised and systematised. The main attention is given to the use of transition metals and organoelement compounds. The review covers the data on both classical and newly developed methods that are little known to chemists at large.

  15. A Bayesian statistical method for particle identification in shower counters

    Takashimizu, N.; Kimura, A.; Shibata, A.; Sasaki, T.

    2004-01-01

    We report an attempt at identifying particles using a Bayesian statistical method. We have developed the mathematical model and software for this purpose. We tried to identify electrons and charged pions in shower counters using this method. We designed an ideal shower counter and studied the identification efficiency using Monte Carlo simulation based on Geant4. Without using any other information, e.g. particle charges given by tracking detectors, we achieved 95% identification of both particles
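
    The identification rule itself is just Bayes' theorem applied to the shower-counter response. The sketch below uses hypothetical Gaussian response densities for E/p (assumptions, as is the 50/50 prior), not the paper's Geant4-derived model:

```python
from scipy.stats import norm

def posterior_electron(x, prior_e=0.5):
    """P(electron | E/p = x) from Bayes' theorem."""
    like_e = norm.pdf(x, loc=1.0, scale=0.05)    # electrons deposit ~all energy
    like_pi = norm.pdf(x, loc=0.4, scale=0.15)   # pions deposit only a fraction
    return like_e * prior_e / (like_e * prior_e + like_pi * (1.0 - prior_e))

for x in (0.95, 0.6, 0.35):
    print(f"E/p = {x}: P(electron) = {posterior_electron(x):.3f}")
```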

  16. Quantum statistical Monte Carlo methods and applications to spin systems

    Suzuki, M.

    1986-01-01

    A short review is given concerning the quantum statistical Monte Carlo method based on the equivalence theorem that d-dimensional quantum systems are mapped onto (d+1)-dimensional classical systems. The convergence property of this approximate transformation is discussed in detail. Some applications of this general approach to quantum spin systems are reviewed. A new Monte Carlo method, ''thermo field Monte Carlo method,'' is presented, which is an extension of the projection Monte Carlo method at zero temperature to finite temperatures

  17. Thermodynamics, Gibbs Method and Statistical Physics of Electron Gases

    Askerov, Bahram M

    2010-01-01

    This book deals with theoretical thermodynamics and the statistical physics of electron and particle gases. While treating the laws of thermodynamics from both classical and quantum theoretical viewpoints, it posits that the basis of the statistical theory of macroscopic properties of a system is the microcanonical distribution of isolated systems, from which all canonical distributions stem. To calculate the free energy, the Gibbs method is applied to ideal and non-ideal gases, and also to a crystalline solid. Considerable attention is paid to the Fermi-Dirac and Bose-Einstein quantum statistics and its application to different quantum gases, and electron gas in both metals and semiconductors is considered in a nonequilibrium state. A separate chapter treats the statistical theory of thermodynamic properties of an electron gas in a quantizing magnetic field.

  18. ACARP Project C10059. ACARP manual of modern coal testing methods. Volume 1: The manual

    Sakurovs, R.; Creelman, R.; Pohl, J.; Juniper, L. [CSIRO Energy Technology, Sydney, NSW (Australia)

    2002-07-01

    The Manual summarises the purpose, applicability, and limitations of a range of standard and modern coal testing methods that have the potential to assist the coal company technologist to better evaluate coal performance. The first volume sets out the Modern Coal Testing Methods in summarised form, as a quick guide for practitioners to assist in selecting the best technique to solve their problems.

  19. Application of statistical method for FBR plant transient computation

    Kikuchi, Norihiro; Mochizuki, Hiroyasu

    2014-01-01

    Highlights: • A statistical method with a large trial number of up to 10,000 is applied to plant system analysis. • A turbine trip test conducted at the “Monju” reactor is selected as the plant transient. • A method for reducing the number of trials is discussed. • The result with a reduced trial number can express the base regions of the computed distribution. -- Abstract: It is obvious that design tolerances, errors included in operation, and statistical errors in empirical correlations affect the transient behavior. The purpose of the present study is to apply the above-mentioned statistical errors to a plant system computation in order to evaluate the statistical distribution contained in the transient evolution. The selected computation case is the turbine trip test conducted at 40% electric power of the prototype fast reactor “Monju”. All of the heat transport systems of “Monju” are modeled with the NETFLOW++ system code, which has been validated using the plant transient tests of the experimental fast reactor Joyo and of “Monju”. The effects of parameters on the upper plenum temperature are confirmed by sensitivity analyses, and dominant parameters are chosen. The statistical errors are applied to each computation deck by using pseudorandom numbers and the Monte-Carlo method. The dSFMT (Double precision SIMD-oriented Fast Mersenne Twister), a further development of the Mersenne Twister (MT), is adopted as the pseudorandom number generator. In the present study, uniform random numbers are generated by dSFMT, and these random numbers are transformed to the normal distribution by the Box–Muller method. Ten thousand different computations are performed at once. In every computation case, the steady calculation is performed for 12,000 s, and the transient calculation is performed for 4000 s. For the purpose of the present statistical computation, it is important that the base regions of the distribution functions be calculated precisely. A large number of
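
    The uniform-to-normal step mentioned above is the classic Box–Muller transform. Here is a minimal sketch (dSFMT is a C library, so Python's random() merely stands in for it, and the perturbed parameter at the end is a hypothetical example, not one of the "Monju" inputs):

```python
import math
import random

def box_muller(u1, u2):
    """Map two independent uniforms in (0, 1] to two independent N(0, 1)."""
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2.0 * math.pi * u2), r * math.sin(2.0 * math.pi * u2)

random.seed(0)
u1 = 1.0 - random.random()   # shift (0, 1) away from zero to keep log finite
u2 = random.random()
z1, z2 = box_muller(u1, u2)

# Apply one statistical error to a computation deck: nominal value 100
# with a (hypothetical) standard deviation of 2
perturbed = 100.0 + 2.0 * z1
print(z1, z2, perturbed)
```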

  20. The awareness and use of modern contraceptive methods among ...

    ... and are increasingly being exposed to unplanned pregnancies with the attendant ... Few studies have been conducted among young women probably due to the ... Methods: It was a prospective, descriptive, cross-sectional study carried out ...

  1. MOLECULAR GENETIC MARKERS AND METHODS OF THEIR IDENTIFICATION IN MODERN FISH-FARMING

    I. Hrytsyniak

    2014-03-01

    Full Text Available Purpose. Molecular genetic markers have become widely used in experimental fish farming in recent years. This methodology currently comprises a set of differentiated approaches with individual mechanisms and clearly defined capabilities. Numerous publications in the scientific literature devoted to molecular genetic markers for the most part offer purely practical data. Thus, the synthesis and analysis of existing information on the general principles and limits of applicability of the main methods of using molecular genetic markers is a pressing problem. In particular, such a description will make it possible to plan experiments more effectively and to obtain the desired results with high reliability. Findings. The main types of variable parts of DNA that can be used as molecular genetic markers in determining the level of stock hybridization, conducting genetic inventories of populations and solving other problems in modern fish farming are described in this paper. The article also provides an overview of the principal modern methods that can be used to identify molecular genetic markers. Originality. This work is a generalization of modern ideas about the mechanisms of experiments with molecular genetic markers in fish farming. Information is provided as a consistent presentation of the principles and purpose of each method, as well as significant advances in their practical application. Practical value. The proposed review of classic and modern literature data on molecular genetic markers can be used for planning, modernizing and correcting research activity in modern fish farming.

  2. Statistical methods for assessing agreement between continuous measurements

    Sokolowski, Ineta; Hansen, Rikke Pilegaard; Vedsted, Peter

    Background: Clinical research often involves the study of agreement amongst observers. Agreement can be measured in different ways, and one can obtain quite different values depending on which method one uses. Objective: We review the approaches that have been discussed to assess the agreement between continuous measures and discuss their strengths and weaknesses. Different methods are illustrated using actual data from the 'Delay in diagnosis of cancer in general practice' project in Aarhus, Denmark. Subjects and Methods: We use the weighted kappa-statistic, intraclass correlation coefficient (ICC), concordance coefficient, Bland-Altman limits of agreement and percentage of agreement to assess the agreement between patient-reported delay and doctor-reported delay in diagnosis of cancer in general practice. Key messages: The correct statistical approach is not obvious. Many studies give the product
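
    Of the listed approaches, the Bland-Altman limits of agreement are the quickest to sketch. The paired delays below are invented stand-ins for the patient- and doctor-reported data, not values from the Aarhus project:

```python
import numpy as np

patient = np.array([30, 14, 60, 7, 90, 21, 45, 10])   # delay in days
doctor = np.array([25, 14, 45, 10, 75, 30, 40, 14])

diff = patient - doctor
bias = diff.mean()                 # mean difference between the two raters
sd = diff.std(ddof=1)
low, high = bias - 1.96 * sd, bias + 1.96 * sd
print(f"bias = {bias:.1f} days, "
      f"95% limits of agreement: ({low:.1f}, {high:.1f})")
```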

  3. Statistical disclosure control for microdata methods and applications in R

    Templ, Matthias

    2017-01-01

    This book on statistical disclosure control presents the theory, applications and software implementation of the traditional approach to (micro)data anonymization, including data perturbation methods, disclosure risk, data utility, information loss and methods for simulating synthetic data. Introducing readers to the R packages sdcMicro and simPop, the book also features numerous examples and exercises with solutions, as well as case studies with real-world data, accompanied by the underlying R code to allow readers to reproduce all results. The demand for and volume of data from surveys, registers or other sources containing sensitive information on persons or enterprises have increased significantly over the last several years. At the same time, privacy protection principles and regulations have imposed restrictions on the access and use of individual data. Proper and secure microdata dissemination calls for the application of statistical disclosure control methods to the data before release. This book is in...

  4. Applied statistical methods in agriculture, health and life sciences

    Lawal, Bayo

    2014-01-01

    This textbook teaches crucial statistical methods to answer research questions using a unique range of statistical software programs, including MINITAB and R. This textbook is developed for undergraduate students in agriculture, nursing, biology and biomedical research. Graduate students will also find it to be a useful way to refresh their statistics skills and to reference software options. The unique combination of examples is approached using MINITAB and R for their individual strengths. Subjects covered include among others data description, probability distributions, experimental design, regression analysis, randomized design and biological assay. Unlike other biostatistics textbooks, this text also includes outliers, influential observations in regression and an introduction to survival analysis. Material is taken from the author's extensive teaching and research in Africa, USA and the UK. Sample problems, references and electronic supplementary material accompany each chapter.

  5. Identification of mine waters by statistical multivariate methods

    Mali, N [IGGG, Ljubljana (Slovenia)

    1992-01-01

    Three water-bearing aquifers are present in the Velenje lignite mine. The aquifer waters have differing chemical compositions; a geochemical water analysis can therefore determine the source of mine water influx. Mine water samples from different locations in the mine were analyzed, and the results for chemical content and electrical conductivity of the mine water were statistically processed by means of the MICROGAS, SPSS-X and IN STATPAC computer programs, which apply three multivariate statistical methods (discriminant, cluster and factor analysis). The reliability of the calculated values was determined with the Kolmogorov-Smirnov test. It is concluded that laboratory analysis of single water samples can produce measurement errors, but statistical processing of water sample data can identify the origin and movement of mine water. 15 refs.

  6. Nonequilibrium Statistical Operator Method and Generalized Kinetic Equations

    Kuzemsky, A. L.

    2018-01-01

    We consider some principal problems of nonequilibrium statistical thermodynamics in the framework of the Zubarev nonequilibrium statistical operator approach. We present a brief comparative analysis of some approaches to describing irreversible processes based on the concept of nonequilibrium Gibbs ensembles and their applicability to describing nonequilibrium processes. We discuss the derivation of generalized kinetic equations for a system in a heat bath. We obtain and analyze a damped Schrödinger-type equation for a dynamical system in a heat bath. We study the dynamical behavior of a particle in a medium taking the dissipation effects into account. We consider the scattering problem for neutrons in a nonequilibrium medium and derive a generalized Van Hove formula. We show that the nonequilibrium statistical operator method is an effective, convenient tool for describing irreversible processes in condensed matter.

  7. Modern methods for the synthesis of peptide-oligonucleotide conjugates

    Zubin, Evgenii M; Oretskaya, Tat'yana S; Romanova, Elena A

    2002-01-01

    The published data on the methods of chemical solution and solid-phase synthesis of peptide-oligonucleotide conjugates are reviewed. The known methods are systematised and their advantages and disadvantages are considered. The approaches to the solution synthesis of peptide-oligonucleotide conjugates are systematised according to the type of chemical bonds between the fragments, whereas those to the solid-phase synthesis are classified according to the procedure used for the preparation of conjugates, viz., stepwise elongation of oligonucleotide and peptide chains on the same polymeric support or solid-phase condensation of two presynthesised fragments. The bibliography includes 141 references.

  8. Modern methods of early diagnostics of juvenile arthritis

    Chernenkov Y.V.

    2013-06-01

    Full Text Available The problem of inflammatory diseases of the joints is one of the most important issues in pediatrics. Significant attention in this sphere is now paid to the search for new, accurate diagnostic criteria. These will help estimate the severity of disease, determine the prognosis, choose the method of treatment and monitoring, and evaluate the efficacy of the therapy.

  9. High accuracy mantle convection simulation through modern numerical methods

    Kronbichler, Martin

    2012-08-21

    Numerical simulation of the processes in the Earth's mantle is a key piece in understanding its dynamics, composition, history and interaction with the lithosphere and the Earth's core. However, doing so presents many practical difficulties related to the numerical methods that can accurately represent these processes at relevant scales. This paper presents an overview of the state of the art in algorithms for high-Rayleigh number flows such as those in the Earth's mantle, and discusses their implementation in the Open Source code Aspect (Advanced Solver for Problems in Earth's ConvecTion). Specifically, we show how an interconnected set of methods for adaptive mesh refinement (AMR), higher order spatial and temporal discretizations, advection stabilization and efficient linear solvers can provide high accuracy at a numerical cost unachievable with traditional methods, and how these methods can be designed in a way so that they scale to large numbers of processors on compute clusters. Aspect relies on the numerical software packages deal.II and Trilinos, enabling us to focus on high level code and keeping our implementation compact. We present results from validation tests using widely used benchmarks for our code, as well as scaling results from parallel runs. © 2012 The Authors Geophysical Journal International © 2012 RAS.

  10. Identifying Reflectors in Seismic Images via Statistic and Syntactic Methods

    Carlos A. Perez

    2010-04-01

    Full Text Available In the geologic interpretation of seismic reflection data, accurate identification of reflectors is the foremost step to ensure proper subsurface structural definition. Reflector information, along with other data sets, is a key factor in predicting the presence of hydrocarbons. In this work, mathematical and pattern recognition theory were adapted to design two statistical and two syntactic algorithms which constitute a tool for semiautomatic reflector identification. The interpretive power of these four schemes was evaluated in terms of prediction accuracy and computational speed. Among these, the semblance method was confirmed to render the greatest accuracy and speed. Syntactic methods offer an interesting alternative due to their inherently structural search method.
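
    The semblance coefficient singled out above has a compact definition: the energy of the stacked trace divided by N times the summed energy of the individual traces. The sketch below evaluates it on synthetic traces; it is an illustration, not the paper's implementation:

```python
import numpy as np

def semblance(window):
    """window: (n_traces, n_samples) array; returns 1.0 for identical traces."""
    stacked = window.sum(axis=0)                 # stack across traces
    num = np.sum(stacked ** 2)
    den = window.shape[0] * np.sum(window ** 2)
    return num / den

t = np.linspace(0.0, 1.0, 100)
wavelet = np.sin(2 * np.pi * 5 * t) * np.exp(-5 * t)   # synthetic reflector
rng = np.random.default_rng(3)
traces = np.stack([wavelet + rng.normal(0, 0.1, t.size) for _ in range(8)])
print(f"semblance = {semblance(traces):.2f}")   # close to 1 => coherent event
```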

  11. New Graphical Methods and Test Statistics for Testing Composite Normality

    Marc S. Paolella

    2015-07-01

    Full Text Available Several graphical methods for testing univariate composite normality from an i.i.d. sample are presented. They are endowed with correct simultaneous error bounds and yield size-correct tests. As all are based on the empirical CDF, they are also consistent for all alternatives. For one test, called the modified stabilized probability test, or MSP, a highly simplified computational method is derived, which delivers the test statistic and also a highly accurate p-value approximation, essentially instantaneously. The MSP test is demonstrated to have higher power against asymmetric alternatives than the well-known and powerful Jarque-Bera test. A further size-correct test, based on combining two test statistics, is shown to have yet higher power. The methodology employed is fully general and can be applied to any i.i.d. univariate continuous distribution setting.
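
    For reference, the Jarque-Bera statistic that the MSP test is benchmarked against has a simple closed form, JB = n/6 (S^2 + (K - 3)^2 / 4), asymptotically chi-squared with 2 degrees of freedom. The sketch below (with assumed exponential data, to show an asymmetric alternative) computes it directly:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
x = rng.exponential(size=200)          # asymmetric, so normality should fail

n = x.size
s = stats.skew(x)
k = stats.kurtosis(x, fisher=False)    # raw kurtosis; a normal has K = 3
jb = n / 6.0 * (s ** 2 + (k - 3.0) ** 2 / 4.0)
p = stats.chi2.sf(jb, df=2)
print(f"JB = {jb:.1f}, p = {p:.2e}")   # scipy.stats.jarque_bera agrees
```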

  12. Statistical learning modeling method for space debris photometric measurement

    Sun, Wenjing; Sun, Jinqiu; Zhang, Yanning; Li, Haisen

    2016-03-01

    Photometric measurement is an important way to identify space debris, but present photometric measurement methods place many constraints on the star image and require complex image processing. To address these problems, a statistical learning modeling method for space debris photometric measurement is proposed based on the global consistency of the star image, and the statistical information of star images is used to eliminate measurement noise. First, the known stars in the star image are divided into training stars and testing stars. Then, the training stars are used to fit the parameters of the photometric measurement model by least squares, and the testing stars are used to calculate the measurement accuracy of the model. Experimental results show that the accuracy of the proposed photometric measurement model is about 0.1 magnitudes.
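
    A hedged sketch of the train/test scheme follows: a one-parameter photometric model m = zp - 2.5 log10(flux) is fitted by least squares on training stars and checked on testing stars. The zero point, fluxes and noise level are synthetic assumptions, not the paper's data or model:

```python
import numpy as np

rng = np.random.default_rng(5)
true_zp = 21.3                                   # hypothetical zero point
flux = rng.uniform(1e3, 1e5, size=40)
mag = true_zp - 2.5 * np.log10(flux) + rng.normal(0, 0.05, size=40)

train, test = slice(0, 30), slice(30, 40)        # known stars split in two

# Least squares for the single parameter zp reduces to a mean
zp_hat = np.mean(mag[train] + 2.5 * np.log10(flux[train]))

resid = mag[test] - (zp_hat - 2.5 * np.log10(flux[test]))
print(f"zp = {zp_hat:.3f}, test RMS = {np.sqrt(np.mean(resid ** 2)):.3f} mag")
```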

  13. Statistic method of research reactors maximum permissible power calculation

    Grosheva, N.A.; Kirsanov, G.A.; Konoplev, K.A.; Chmshkyan, D.V.

    1998-01-01

    The technique for calculating the maximum permissible power of a research reactor, at which the probability of a thermal-process accident does not exceed a specified value, is presented. A statistical method is used for the calculations. The determining function related to reactor safety is regarded as a known function of the reactor power and of many statistically independent values, whose list includes the reactor process parameters, geometrical characteristics of the reactor core and fuel elements, as well as random factors connected with the reactor's specific features. Heat flux density or temperature is taken as the limiting factor. The program realization of the method discussed is briefly described. The results of calculating the PIK reactor margin coefficients for different probabilities of a thermal-process accident are considered as an example. It is shown that the probability of an accident with fuel element melting in the hot zone is lower than 10⁻⁸ per year for the reactor rated power
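
    The statistical idea can be sketched in a few lines: propagate independent parameter uncertainties through the limiting function by Monte Carlo and read off the probability of exceeding the limit at each power level. Everything below (the heat-flux model, tolerances and limit value) is a hypothetical stand-in, not the PIK calculation:

```python
import numpy as np

rng = np.random.default_rng(9)
N = 100_000   # trial number; only probabilities well above 1/N are resolved

def exceedance_prob(power):
    # Statistically independent inputs (all hypothetical): coolant flow,
    # power peaking factor and a geometry factor, each with its tolerance
    flow = rng.normal(1.00, 0.03, N)
    peaking = rng.normal(1.30, 0.05, N)
    geometry = rng.normal(1.00, 0.02, N)
    heat_flux = power * peaking * geometry / flow   # limiting function
    return np.mean(heat_flux > 160.0)               # hypothetical limit value

for power in (100, 105, 110, 115):
    print(f"{power} MW: P(limit exceeded) = {exceedance_prob(power):.2e}")
```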

  14. Applied systems ecology: models, data, and statistical methods

    Eberhardt, L L

    1976-01-01

    In this report, systems ecology is largely equated to mathematical or computer simulation modelling. The need for models in ecology stems from the necessity to have an integrative device for the diversity of ecological data, much of which is observational, rather than experimental, as well as from the present lack of a theoretical structure for ecology. Different objectives in applied studies require specialized methods. The best predictive devices may be regression equations, often non-linear in form, extracted from much more detailed models. A variety of statistical aspects of modelling, including sampling, are discussed. Several aspects of population dynamics and food-chain kinetics are described, and it is suggested that the two presently separated approaches should be combined into a single theoretical framework. It is concluded that future efforts in systems ecology should emphasize actual data and statistical methods, as well as modelling.

  15. Multivariate methods and forecasting with IBM SPSS statistics

    Aljandali, Abdulkader

    2017-01-01

    This is the second of a two-part guide to quantitative analysis using the IBM SPSS Statistics software package; this volume focuses on multivariate statistical methods and advanced forecasting techniques. More often than not, regression models involve more than one independent variable. For example, forecasting methods are commonly applied to aggregates such as inflation rates, unemployment, exchange rates, etc., that have complex relationships with determining variables. This book introduces multivariate regression models and provides examples to help understand theory underpinning the model. The book presents the fundamentals of multivariate regression and then moves on to examine several related techniques that have application in business-orientated fields such as logistic and multinomial regression. Forecasting tools such as the Box-Jenkins approach to time series modeling are introduced, as well as exponential smoothing and naïve techniques. This part also covers hot topics such as Factor Analysis, Dis...

  16. Mathematical and Statistical Methods for Actuarial Sciences and Finance

    Legros, Florence; Perna, Cira; Sibillo, Marilena

    2017-01-01

    This volume gathers selected peer-reviewed papers presented at the international conference "MAF 2016 – Mathematical and Statistical Methods for Actuarial Sciences and Finance”, held in Paris (France) at the Université Paris-Dauphine from March 30 to April 1, 2016. The contributions highlight new ideas on mathematical and statistical methods in actuarial sciences and finance. The cooperation between mathematicians and statisticians working in insurance and finance is a very fruitful field, one that yields unique  theoretical models and practical applications, as well as new insights in the discussion of problems of national and international interest. This volume is addressed to academicians, researchers, Ph.D. students and professionals.

  17. Modern analytic methods applied to the art and archaeology

    Tenorio C, M. D.; Longoria G, L. C.

    2010-01-01

    The interaction of diverse areas such as analytical chemistry, art history and archaeology has allowed the development of a variety of techniques used in archaeology, conservation and restoration. These methods have been used to date objects, to determine the origin of old materials, to reconstruct their use and to identify the degradation processes that affect the integrity of works of art. The objective of this chapter is to offer a general vision of the research that has been carried out at the Instituto Nacional de Investigaciones Nucleares (ININ) in the field of cultural goods. A series of investigations carried out in collaboration with national and foreign researchers is briefly described, work performed with the great support of bachelor's and master's students in archaeology of the National School of Anthropology and History, since one of the goals is to spread knowledge of the existence of these techniques among young archaeologists, so that they have a wider vision of what they could use in the immediate future and can test hypotheses with scientific methods. (Author)

  18. Selected methods of waste monitoring using modern analytical techniques

    Hlavacek, I.; Hlavackova, I.

    1993-11-01

    Issues of the inspection and control of bituminized and cemented waste are discussed, and some methods of their nondestructive testing are described. Attention is paid to the inspection techniques, non-nuclear spectral techniques in particular, as employed for quality control of the wastes, waste concentrates, spent waste leaching solutions, as well as for the examination of environmental samples (waters and soils) from the surroundings of nuclear power plants. Some leaching tests used abroad for this purpose and practical analyses by the ICP-AES technique are given by way of example. The ICP-MS technique, which is unavailable in the Czech Republic, is routinely employed abroad for alpha nuclide measurements; examples of such analyses are also given. The next topic discussed includes the monitoring of organic acids and complexants to determine the degree of their thermal decomposition during the bituminization of wastes on an industrial line. All of the methods and procedures highlighted can be used as technical support during the monitoring of radioactive waste properties in industrial conditions, in the chemical and radiochemical analyses of wastes and related matter, in the calibration of nondestructive testing instrumentation, in the monitoring of contamination of the surroundings of nuclear facilities, and in trace analysis. (author). 10 tabs., 1 fig., 14 refs

  19. Statistical methods for longitudinal data with agricultural applications

    Anantharama Ankinakatte, Smitha

    The PhD study focuses on modeling two kinds of longitudinal data arising in agricultural applications: continuous time series data and discrete longitudinal data. Firstly, two statistical methods, neural networks and generalized additive models, are applied to predict mastitis using multivariate ... algorithm. This was found to compare favourably with the algorithm implemented in the well-known Beagle software. Finally, an R package to apply APFA models, developed as part of the PhD project, is described...

  20. Statistical methods in nuclear material accountancy: Past, present and future

    Pike, D.J.; Woods, A.J.

    1983-01-01

    The analysis of nuclear material inventory data is motivated by the desire to detect any loss or diversion of nuclear material, insofar as such detection may be feasible by statistical analysis of repeated inventory and throughput measurements. The early regulations, which laid down the specifications for the analysis of inventory data, were framed without acknowledging the essentially sequential nature of the data. It is the broad aim of this paper to discuss the historical nature of statistical analysis of inventory data including an evaluation of why statistical methods should be required at all. If it is accepted that statistical techniques are required, then two main areas require extensive discussion. First, it is important to assess the extent to which stated safeguards aims can be met in practice. Second, there is a vital need for reassessment of the statistical techniques which have been proposed for use in nuclear material accountancy. Part of this reassessment must involve a reconciliation of the apparent differences in philosophy shown by statisticians; but, in addition, the techniques themselves need comparative study to see to what extent they are capable of meeting realistic safeguards aims. This paper contains a brief review of techniques with an attempt to compare and contrast the approaches. It will be suggested that much current research is following closely similar lines, and that national and international bodies should encourage collaborative research and practical in-plant implementations. The techniques proposed require credibility and power; but at this point in time statisticians require credibility and a greater level of unanimity in their approach. A way ahead is proposed based on a clear specification of realistic safeguards aims, and a development of a unified statistical approach with encouragement for the performance of joint research. (author)

  1. Modern spectrometric methods for the analysis of labelled compounds

    Kaspersen, F.M.; Funke, C.W.; Wagenaars, G.N.; Jacobs, P.L.

    1988-01-01

    A proper analysis of chemical compounds should give information about the chemical identity (not only the structure but also enantiomeric form), the chemical purity and chemical composition (e.g. giving information about counter-ions, solvents of crystallization). For labelled compounds information is also needed about isotopic purity (defined as the % of isotope present in the compound), the position/distribution of the isotope in the molecule and degree of labelling/specific activity. In the past ten years the possibilities for spectrometric analyses of labelled compounds have increased enormously and this chapter will give an overview of these methods with the exception of (radio)chromatography that will be dealt with in another chapter. (author)

  2. GLASS FIBERS – MODERN METHOD IN THE WOOD BEAMS REINFORCEMENT

    Cătălina IANĂŞI

    2017-05-01

    Full Text Available: One of the defining goals of this paper is to obtain a new resistant material which combines the qualities of the basic materials in its composition without inheriting their negative properties. Specifically, the use of GFRP composite materials as reinforcement for wood beams under bending loads requires paying attention to several aspects of the problem, such as the number of composite layers applied to the wood beams. The results obtained in this paper indicate that the behavior of reinforced beams is quite different from that of un-reinforced ones. The main conclusion of the tests is that the tensioning forces allow the beam to carry a maximum load for a while, something that is particularly useful when considering a real construction. The experiments have shown that the method of increasing the resistance of wood constructions with composite materials is effective and easy to implement.

  3. State analysis of BOP using statistical and heuristic methods

    Heo, Gyun Young; Chang, Soon Heung

    2003-01-01

    Under the deregulation environment, the performance enhancement of BOP in nuclear power plants is being highlighted. To analyze the performance level of BOP, the performance test procedures provided by an authorized institution such as ASME are used. However, plant investigation proved that the requirements of the performance test procedures regarding the reliability and quantity of sensors were difficult to satisfy. As a solution, a state analysis method, an expanded concept of signal validation, was proposed on the basis of statistical and heuristic approaches. The authors recommended a statistical linear regression model, obtained by analyzing correlations among BOP parameters, as a reference state analysis method. Its advantages are that its derivation is not heuristic, it is possible to calculate the model uncertainty, and it is easy to apply to an actual plant. The error of the statistical linear regression model is below 3% under normal as well as abnormal system states. Additionally, a neural network model was recommended, since the statistical model cannot be applied to the validation of all of the sensors and is sensitive to outliers, that is, signals located outside a statistical distribution. Because there are many sensors to be validated in BOP, wavelet analysis (WA) was applied as a pre-processor to reduce the input dimension and to enhance training accuracy. The outlier localization capability of WA enhanced the robustness of the neural network. The trained neural network restored the degraded signals to values within ±3% of the true signals.
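
    A minimal sketch of the regression-based state analysis idea, under assumed sensor names and synthetic data (the real method fits plant measurements and also reports model uncertainty):

```python
# Hedged sketch: predict a target sensor from correlated BOP sensors by
# linear least squares and flag readings whose residual exceeds a 3% band.
# All sensor names, units and data are invented for illustration.
import numpy as np

rng = np.random.default_rng(2)
n = 500
feedwater = rng.normal(1000, 30, n)                       # hypothetical sensor, kg/s
turbine_p = 0.005 * feedwater + rng.normal(0, 0.05, n)    # correlated sensor
target = 0.8 * feedwater + 50 * turbine_p + rng.normal(0, 5, n)

# Fit the reference linear regression model on historical data
A = np.column_stack([np.ones(n), feedwater, turbine_p])
coef, *_ = np.linalg.lstsq(A, target, rcond=None)

# Validate a new reading: is the residual within 3% of the predicted value?
new_row = np.array([1.0, 1005.0, 5.1])
predicted = new_row @ coef
measured = 1060.0   # suppose the target sensor reports this value
ok = abs(measured - predicted) / predicted <= 0.03
print(f"predicted={predicted:.1f}, measured={measured}, valid={ok}")
```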

  4. Statistical optics

    Goodman, Joseph W

    2015-01-01

    This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems This book covers a variety of statistical problems in optics, including both theory and applications.  The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i

  5. A Modern Method to Monitor Office Blood Pressure

    Emiliya Khazan

    2017-10-01

    Full Text Available The diagnosis and management of hypertension rely on accurate and precise blood pressure (BP) measurements and monitoring techniques. Variability in traditional office-based BP readings can contribute to misclassification and potential misdiagnosis of hypertension, leading to inappropriate treatment and possibly avoidable adverse drug events. Both home blood pressure monitoring (HBPM) and 24-hour ambulatory blood pressure monitoring (ABPM) can characterize BP status better than traditional office values and can predict cardiovascular morbidity and mortality risk; however, they are limited by availability and/or practicality in many situations. Available in-office blood pressure measuring methods include manual auscultation, automated oscillometric, and automated office blood pressure (AOBP) devices. A strong correlation exists between AOBP and awake ABPM measurements, and AOBP has been linked to better prediction of end-organ damage and white-coat response compared to standard office BP methods. While AOBP does not provide nocturnal BP readings, it can be utilized in several outpatient settings, can decrease the utilization of ABPM and the white-coat effect, and can improve cardiovascular assessment, evaluation, and therapeutic management in clinical practice. Hypertension affects over 80 million adults in the United States (US) and is a major risk factor for cardiovascular morbidity and mortality [1]. The condition's ubiquitous nature and broad impact make understanding the diagnosis and treatment of hypertension a key element of managing cardiovascular risk. Though much attention is paid to the treatment of hypertension, from 2009 to 2012, 45.9% of US patients with hypertension were uncontrolled [1]. Appreciating the aspects of proper assessment of blood pressure is crucial and creates the foundation for approaching hypertension management. Until recently, hypertension was defined as an appropriately

  6. Modern Methods for Modeling Change in Obesity Research in Nursing.

    Sereika, Susan M; Zheng, Yaguang; Hu, Lu; Burke, Lora E

    2017-08-01

    Persons receiving treatment for weight loss often demonstrate heterogeneity in lifestyle behaviors and health outcomes over time. Traditional repeated measures approaches focus on the estimation and testing of an average temporal pattern, ignoring the interindividual variability about the trajectory. An alternate person-centered approach, group-based trajectory modeling, can be used to identify distinct latent classes of individuals following similar trajectories of behavior or outcome change as a function of age or time and can be expanded to include time-invariant and time-dependent covariates and outcomes. Another latent class method, growth mixture modeling, builds on group-based trajectory modeling to investigate heterogeneity within the distinct trajectory classes. In this applied methodologic study, group-based trajectory modeling for analyzing changes in behaviors or outcomes is described and contrasted with growth mixture modeling. An illustration of group-based trajectory modeling is provided using calorie intake data from a single-group, single-center prospective study for weight loss in adults who are either overweight or obese.

  7. [Evidence-based medicine: modern scientific methods for determining usefulness].

    Schmidt, J G

    1999-01-01

    For quite some time, clinical epidemiology has introduced the art of critical appraisal of evidence as well as the methods of designing sound clinical studies and trials. Almost unnoticed by most medical institutions, a new hierarchy of evidence has emerged which puts well-thought-out trials, able to document unbiased treatment benefit in terms of patient suffering, above pathophysiological theory. Many controlled trials have shown, in the meantime, that the control of laboratory or other pathologies and the correction of anatomical abnormalities do not necessarily mean a benefit for the patient. Concepts relating to this dissection of evidence include: surrogate fallacy ("cosmetics" of laboratory results, or ligament or cartilage "cosmetics" in surgery), confounding (spurious causal relationships), selection bias (comparison with selected groups), lead-time bias (mistaking earlier diagnosis for increased survival), length bias (overlooking differences in the aggressiveness of diseases as determinants of disease stage distributions) and overdiagnosis bias (mistaking the increasing detection of clinically silent pathologies for improved prognosis). Moreover, absolute instead of relative risk reduction needs to be used to measure patient benefit. The incorporation of decision analysis and of the concepts of clinical epidemiology will improve the efficiency and quality of medicine much more effectively than a sole focus on technical medical performance. Evidence-based medicine is the systematic and critical appraisal of medical interventions, based on an understanding of how to avoid the fallacies and biases mentioned.
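
    A tiny worked example of the point about absolute versus relative risk reduction, with invented event rates:

```python
# Worked example (hypothetical numbers): why absolute risk reduction (ARR)
# measures patient benefit better than relative risk reduction (RRR).
control_risk = 0.02      # 2% event rate without treatment (assumed)
treated_risk = 0.01      # 1% event rate with treatment (assumed)

arr = control_risk - treated_risk   # 0.01 -> 1 percentage point
rrr = arr / control_risk            # 0.50 -> "50% risk reduction"
nnt = 1 / arr                       # patients treated per event avoided

print(f"ARR={arr:.3f}, RRR={rrr:.0%}, NNT={nnt:.0f}")
# A "50% relative reduction" sounds large, yet 100 patients must be treated
# to prevent one event; with a control risk of 20% the same RRR gives NNT=10.
```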

  8. Statistical benchmarking in utility regulation: Role, standards and methods

    Newton Lowry, Mark; Getachew, Lullit

    2009-01-01

    Statistical benchmarking is being used with increasing frequency around the world in utility rate regulation. We discuss how and where benchmarking is in use for this purpose and the pros and cons of regulatory benchmarking. We then discuss alternative performance standards and benchmarking methods in regulatory applications. We use these to propose guidelines for the appropriate use of benchmarking in the rate setting process. The standards, which we term the competitive market and frontier paradigms, have a bearing on method selection. These along with regulatory experience suggest that benchmarking can either be used for prudence review in regulation or to establish rates or rate setting mechanisms directly

  9. Statistical methods of spin assignment in compound nuclear reactions

    Mach, H.; Johns, M.W.

    1984-01-01

    Spin assignment to nuclear levels can be obtained from standard in-beam gamma-ray spectroscopy techniques and in the case of compound nuclear reactions can be complemented by statistical methods. These are based on a correlation pattern between level spin and gamma-ray intensities feeding low-lying levels. Three types of intensity and level spin correlations are found suitable for spin assignment: shapes of the excitation functions, ratio of intensity at two beam energies or populated in two different reactions, and feeding distributions. Various empirical attempts are examined and the range of applicability of these methods as well as the limitations associated with them are given. 12 references

  10. Statistical methods of spin assignment in compound nuclear reactions

    Mach, H.; Johns, M.W.

    1985-01-01

    Spin assignment to nuclear levels can be obtained from standard in-beam gamma-ray spectroscopy techniques and in the case of compound nuclear reactions can be complemented by statistical methods. These are based on a correlation pattern between level spin and gamma-ray intensities feeding low-lying levels. Three types of intensity and level spin correlations are found suitable for spin assignment: shapes of the excitation functions, ratio of intensity at two beam energies or populated in two different reactions, and feeding distributions. Various empirical attempts are examined and the range of applicability of these methods as well as the limitations associated with them are given

  11. On second quantization methods applied to classical statistical mechanics

    Matos Neto, A.; Vianna, J.D.M.

    1984-01-01

    A method of expressing classical statistical results in terms of mathematical entities usually associated with the quantum field theoretical treatment of many-particle systems (Fock space, commutators, field operators, state vectors) is discussed. A linear response theory is developed using the 'second quantized' Liouville equation introduced by Schonberg. The relationship of this method to that of Prigogine et al. is briefly analyzed. The chain of equations and the spectral representations for the new classical Green's functions are presented. Generalized operators defined on Fock space are discussed. It is shown that the correlation functions can be obtained from Green's functions defined with generalized operators. (Author)

  12. Modern diagnostic method of microelementosis of school age children

    Rasulov, S.K.

    2006-01-01

    Full text: Human and animal pathology caused by deficiency of vitally important ('essential') microelements, or by their excess, is known by the collective name microelementosis [1]. Given the high biological activity of microelements in the organism under different physiological and pathological conditions, the quantitative determination of several metals in the biomedia of the organism is of great importance in the study of microelement metabolism. However, objective and representative data on the provision of school children with microelements are practically absent. The objective of the study was to investigate the content of microelements connected with deficiency of the biometals participating in hemopoiesis (Cu, Zn, Co, Mn) in the biomedia of the organism of school children in the Zarafshan region of the Republic of Uzbekistan. We applied the method of neutron-activation analysis for the determination of microelements (Fe, Zn, Cu, Co, Mn) in hair, whole blood, blood serum, urine, saliva and food-stuff samples, and of more than 20 elements in other biomedia, following the method designed at the Nuclear Physics Institute, Republic of Uzbekistan [4]. The study was carried out on 245 practically healthy children aged 7-17, 131 boys and 33 girls, living in four different areas of Samarkand region. According to the designed method, Mn and Cu were determined as follows: samples together with standards were packed in a polyethylene container and irradiated in the vertical channel of the reactor by a neutron flux of 5×10¹³ neutron cm⁻² s⁻¹ for 15 seconds. Direct activity was measured after 2 hours to determine Cu and Mn. For the determination of iron, cobalt and zinc, the irradiated samples were measured for 15 hours one month after irradiation via the corresponding radionuclides. In all measurements of element contents, different standards were applied: intralaboratory data were obtained by fixing a certain number of elements on ashless filter paper and comparison

  13. Methods for estimating low-flow statistics for Massachusetts streams

    Ries, Kernell G.; Friesz, Paul J.

    2000-01-01

    Methods and computer software are described in this report for determining flow duration, low-flow frequency statistics, and August median flows. These low-flow statistics can be estimated for unregulated streams in Massachusetts using different methods depending on whether the location of interest is at a streamgaging station, a low-flow partial-record station, or an ungaged site where no data are available. Low-flow statistics for streamgaging stations can be estimated using standard U.S. Geological Survey methods described in the report. The MOVE.1 mathematical method and a graphical correlation method can be used to estimate low-flow statistics for low-flow partial-record stations. The MOVE.1 method is recommended when the relation between measured flows at a partial-record station and daily mean flows at a nearby, hydrologically similar streamgaging station is linear, and the graphical method is recommended when the relation is curved. Equations are presented for computing the variance and equivalent years of record for estimates of low-flow statistics for low-flow partial-record stations when either a single or multiple index stations are used to determine the estimates. The drainage-area ratio method or regression equations can be used to estimate low-flow statistics for ungaged sites where no data are available. The drainage-area ratio method is generally as accurate as or more accurate than regression estimates when the drainage-area ratio for an ungaged site is between 0.3 and 1.5 times the drainage area of the index data-collection site. Regression equations were developed to estimate the natural, long-term 99-, 98-, 95-, 90-, 85-, 80-, 75-, 70-, 60-, and 50-percent duration flows; the 7-day, 2-year and the 7-day, 10-year low flows; and the August median flow for ungaged sites in Massachusetts. Streamflow statistics and basin characteristics for 87 to 133 streamgaging stations and low-flow partial-record stations were used to develop the equations. The
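
    A minimal sketch of the drainage-area ratio method described above; the function name and example numbers are illustrative, not from the report:

```python
# Minimal sketch: transfer a low-flow statistic from a gaged index site to an
# ungaged site on the same stream by scaling with drainage area. The numbers
# below are hypothetical.
def drainage_area_ratio(q_index: float, a_index: float, a_ungaged: float) -> float:
    """Estimate a streamflow statistic at an ungaged site.

    q_index   -- low-flow statistic at the index (gaged) site, cfs
    a_index   -- drainage area of the index site, mi^2
    a_ungaged -- drainage area of the ungaged site, mi^2
    """
    ratio = a_ungaged / a_index
    # The report notes the method works best for ratios between 0.3 and 1.5
    if not 0.3 <= ratio <= 1.5:
        raise ValueError(f"area ratio {ratio:.2f} outside recommended 0.3-1.5 range")
    return q_index * ratio

# Example: a 7-day, 10-year low flow of 12 cfs at a 45 mi^2 index basin,
# transferred to a 30 mi^2 ungaged basin upstream.
print(drainage_area_ratio(12.0, 45.0, 30.0))  # -> 8.0 cfs
```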

  14. [Cognitive functions, their development and modern diagnostic methods].

    Klasik, Adam; Janas-Kozik, Małgorzata; Krupka-Matuszczyk, Irena; Augustyniak, Ewa

    2006-01-01

    provided a theory. The psychometric approach concentrates on studying the differences in intelligence. The aim of this approach is to test intelligence by means of standardized tests (e.g. WISC-R, WAIS-R) used to show the individual differences among humans. Human cognitive functions determine individuals' adaptation capabilities and disturbances in this area indicate a number of psychopathological changes and are a symptom enabling to differentiate or diagnose one with a disorder. That is why the psychological assessment of cognitive functions is an important part of patients' diagnosis. Contemporary neuropsychological studies are to a great extent based computer tests. The use of computer methods has a number of measurement-related advantages. It allows for standardized testing environment, increasing therefore its reliability and standardizes the patient assessment process. Special attention should be paid to the neuropsychological tests included in the Vienna Test System (Cognitron, SIGNAL, RT, VIGIL, DAUF), which are used to assess the operational memory span, learning processes, reaction time, attention selective function, attention continuity as well as attention interference resistance. It also seems justified to present the CPT id test (Continuous Performance Test) as well as Free Recall. CPT is a diagnostic tool used to assess the attention selective function, attention continuity of attention, attention interference resistance as well as attention alertness. The Free Recall test is used in the memory processes diagnostics to assess patients' operational memory as well as the information organization degree in operational memory. The above mentioned neuropsychological tests are tools used in clinical assessment of cognitive function disorders.

  15. Literature in Focus: Statistical Methods in Experimental Physics

    2007-01-01

    Frederick James was a high-energy physicist who became the CERN "expert" on statistics and is now well-known around the world, in part for this famous text. The first edition of Statistical Methods in Experimental Physics was originally co-written with four other authors and was published in 1971 by North Holland (now an imprint of Elsevier). It became such an important text that demand for it has continued for more than 30 years. Fred has updated it and it was released in a second edition by World Scientific in 2006. It is still a top seller and there is no exaggeration in calling it «the» reference on the subject. A full review of the title appeared in the October CERN Courier. Come and meet the author to hear more about how this book has flourished during its 35-year lifetime. Frederick James Statistical Methods in Experimental Physics Monday, 26th of November, 4 p.m. Council Chamber (Bldg. 503-1-001) The author will be introduced...

  16. Fuel rod design by statistical methods for MOX fuel

    Heins, L.; Landskron, H.

    2000-01-01

    Statistical methods in fuel rod design have received more and more attention in recent years. One possible way to use statistical methods in fuel rod design can be described as follows: Monte Carlo calculations are performed using the fuel rod code CARO. For each run with CARO, the set of input data is modified: parameters describing the design of the fuel rod (geometrical data, density etc.) and modeling parameters are randomly selected according to their individual distributions. Power histories are varied systematically in such a way that each power history of the relevant core management calculation is represented in the Monte Carlo calculations with equal frequency. The frequency distributions of results such as rod internal pressure and cladding strain generated by the Monte Carlo calculation are evaluated and compared with the design criteria. Up to now, this methodology has been applied to licensing calculations for PWRs and BWRs, UO₂ and MOX fuel, in 3 countries. Especially for the insertion of MOX fuel, which results in power histories with relatively high linear heat generation rates at higher burnup, the statistical methodology is an appropriate approach to demonstrate compliance with licensing requirements. (author)
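
    A hedged Monte Carlo sketch of this scheme; the response function below is an invented stand-in for the CARO fuel rod code, and all distributions and limits are assumptions:

```python
# Illustrative Monte Carlo sketch: sample input parameters from their
# distributions, evaluate a (placeholder) fuel rod model, and compare the
# output distribution against a design limit. A real analysis would run the
# fuel performance code for each sample instead of this response surface.
import numpy as np

rng = np.random.default_rng(42)
N = 10_000

# Randomly sample fabrication and model parameters (assumed distributions)
gap_um = rng.normal(170, 10, N)          # pellet-cladding gap, micrometres
density_pct = rng.normal(95.5, 0.5, N)   # fuel density, % of theoretical
model_bias = rng.normal(1.0, 0.03, N)    # fission-gas-release model factor

# Placeholder response surface for rod internal pressure (MPa)
pressure = model_bias * (4.0 + 0.01 * (density_pct - 95.0) - 0.002 * (gap_um - 170))

limit = 4.3  # hypothetical design limit, MPa
fraction_ok = np.mean(pressure <= limit)
print(f"fraction of sampled rods meeting the limit: {fraction_ok:.4f}")
```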

  17. Heterogeneous Rock Simulation Using DIP-Micromechanics-Statistical Methods

    H. Molladavoodi

    2018-01-01

    Full Text Available Rock as a natural material is heterogeneous. Rock material consists of minerals, crystals, cement, grains, and microcracks. Each component of rock has a different mechanical behavior under the applied loading condition. Therefore, the distribution of rock components has an important effect on rock mechanical behavior, especially in the postpeak region. In this paper, a rock sample was studied by digital image processing (DIP), micromechanics, and statistical methods. Using image processing, the volume fractions of the minerals composing the rock sample were evaluated precisely. The mechanical properties of the rock matrix were determined based on upscaling micromechanics. In order to consider the effect of rock heterogeneities on mechanical behavior, a heterogeneity index was calculated within a statistical framework. A Weibull distribution function was fitted to the Young's modulus distribution of the minerals. Finally, statistical and Mohr–Coulomb strain-softening models were used simultaneously as the constitutive model in a DEM code. The acoustic emission, strain energy release, and the effect of rock heterogeneities on the postpeak behavior were investigated. The numerical results are in good agreement with experimental data.
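
    A short sketch of the Weibull-fitting step named above, on synthetic Young's modulus data:

```python
# Sketch: fit a Weibull distribution to a sample of mineral Young's moduli
# and report the shape parameter, often used as a heterogeneity index.
# The data are synthetic stand-ins for DIP-derived mineral properties.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
young_gpa = rng.weibull(5.0, 500) * 60.0   # synthetic Young's moduli, GPa

# Fit a two-parameter Weibull (location fixed at zero)
shape, loc, scale = stats.weibull_min.fit(young_gpa, floc=0)
print(f"Weibull shape m={shape:.2f}, scale={scale:.1f} GPa")
# A larger shape parameter m means a narrower modulus distribution,
# i.e. a more homogeneous rock.
```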

  18. THE FLUORBOARD A STATISTICALLY BASED DASHBOARD METHOD FOR IMPROVING SAFETY

    PREVETTE, S.S.

    2005-01-01

    The FluorBoard is a statistically based dashboard method for improving safety. Fluor Hanford has achieved significant safety improvements, including more than an 80% reduction in OSHA cases per 200,000 hours, during its work at the US Department of Energy's Hanford Site in Washington state. The massive project on the former nuclear materials production site is considered one of the largest environmental cleanup projects in the world. Fluor Hanford's safety improvements were achieved by a committed partnering of workers, managers, and statistical methodology. Safety achievements at the site have been due to a systematic approach to safety. This includes excellent cooperation between the field workers, the safety professionals, and management through OSHA Voluntary Protection Program principles. Fluor corporate values are centered around safety, and safety excellence is important for every manager in every project. In addition, Fluor Hanford has used a rigorous approach to its safety statistics, based upon Dr. Shewhart's control charts and Dr. Deming's management and quality methods.
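
    For concreteness, a minimal Shewhart-style individuals control chart of the kind referenced above, on synthetic monthly case-rate data:

```python
# Minimal sketch of a Shewhart individuals control chart applied to monthly
# OSHA recordable case rates. The data are synthetic; real FluorBoard charts
# use the site's actual safety statistics.
import numpy as np

rng = np.random.default_rng(7)
rates = rng.normal(2.0, 0.4, 24)            # cases per 200,000 hours, 24 months

center = rates.mean()
# Moving-range estimate of short-term sigma, standard for individuals charts
mr = np.abs(np.diff(rates)).mean()
sigma = mr / 1.128                           # d2 constant for subgroups of 2

ucl, lcl = center + 3 * sigma, max(center - 3 * sigma, 0.0)
signals = np.where((rates > ucl) | (rates < lcl))[0]
print(f"CL={center:.2f}, UCL={ucl:.2f}, LCL={lcl:.2f}, signals at months {signals}")
```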

  19. Antisperm antibodies as a factor of male infertility. Relevance, modern methods of diagnosis and treatment

    O. A. Nikiforov

    2017-08-01

    Full Text Available According to WHO statistics, 40% of childless marriages are due to factors of male infertility. One of them is the presence of antisperm antibodies in the male organism, which may be found in blood serum, on the surface of spermatozoa and in seminal plasma. Aim. On the grounds of an analysis of the specialized literature, to show the relevance of this problem in reproductive medicine and to describe the basic modern methods of treatment and diagnosis of this pathology in infertile males. The most common methods of identifying antisperm antibodies are: the MAR test, the Shuvarskiy–Sims–Hyuner test, the Kurtsrok–Miller test, the latex agglutination method, and the solid-phase enzyme immunoassay of blood. Indications for determining antisperm antibodies are: modified indices, deviations in the post-coital test, a negative test of sperm and cervical mucus interaction in vitro, unexplained infertility in married couples, failure or low indices during IVF (in vitro fertilization) and, of course, the exclusion of other causes of infertility. When antisperm antibodies are detected, the treatment strategy may be aimed at reducing their titer to enable pregnancy. The following types of therapy can be used: contraceptive (long-term use of barrier contraception to reduce the antisperm antibody titer in women), plasmapheresis, artificial insemination with the husband's sperm pretreated to remove antisperm antibodies, and methods of assisted reproductive technologies. Conclusions. The formation of antisperm antibodies leads to infertility of immunological genesis (in 20% of couples with unexplained infertility). To confirm their presence in the male body it is necessary to perform the MAR test, the Shuvarsky test and other tests and, of course, to exclude other causes of infertility. Men of reproductive age with an immunological factor of infertility require comprehensive treatment, including elimination of all possible causative and contributing factors of infertility (infection of the male

  20. Statistical physics and computational methods for evolutionary game theory

    Javarone, Marco Alberto

    2018-01-01

    This book presents an introduction to Evolutionary Game Theory (EGT) which is an emerging field in the area of complex systems attracting the attention of researchers from disparate scientific communities. EGT allows one to represent and study several complex phenomena, such as the emergence of cooperation in social systems, the role of conformity in shaping the equilibrium of a population, and the dynamics in biological and ecological systems. Since EGT models belong to the area of complex systems, statistical physics constitutes a fundamental ingredient for investigating their behavior. At the same time, the complexity of some EGT models, such as those realized by means of agent-based methods, often require the implementation of numerical simulations. Therefore, beyond providing an introduction to EGT, this book gives a brief overview of the main statistical physics tools (such as phase transitions and the Ising model) and computational strategies for simulating evolutionary games (such as Monte Carlo algor...

  1. Huffman and linear scanning methods with statistical language models.

    Roark, Brian; Fried-Oken, Melanie; Gibbons, Chris

    2015-03-01

    Current scanning access methods for text generation in AAC devices are limited to relatively few options, most notably row/column variations within a matrix. We present Huffman scanning, a new method for applying statistical language models to binary-switch, static-grid typing AAC interfaces, and compare it to other scanning options under a variety of conditions. We present results for 16 adults without disabilities and one 36-year-old man with locked-in syndrome who presents with complex communication needs and uses AAC scanning devices for writing. Huffman scanning with a statistical language model yielded significant typing speedups for the 16 participants without disabilities versus any of the other methods tested, including two row/column scanning methods. A similar pattern of results was found with the individual with locked-in syndrome. Interestingly, faster typing speeds were obtained with Huffman scanning using a more leisurely scan rate than relatively fast individually calibrated scan rates. Overall, the results reported here demonstrate great promise for the usability of Huffman scanning as a faster alternative to row/column scanning.
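
    The core idea of Huffman scanning can be sketched as follows: build a Huffman code over the letters from (here invented) language-model probabilities, so that likely letters need fewer binary switch activations. This illustrates Huffman coding generally, not the authors' specific interface:

```python
# Build a Huffman code from letter probabilities; in a scanning interface,
# each bit corresponds to one binary switch decision, so frequent letters
# need fewer activations. Probabilities are invented; a real AAC system
# would take them from its statistical language model.
import heapq
from itertools import count

def huffman_codes(probs: dict[str, float]) -> dict[str, str]:
    tiebreak = count()  # avoids comparing dicts when probabilities tie
    heap = [(p, next(tiebreak), {ch: ""}) for ch, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {ch: "0" + code for ch, code in c1.items()}
        merged.update({ch: "1" + code for ch, code in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

codes = huffman_codes({"e": 0.40, "t": 0.25, "a": 0.20, "q": 0.10, "z": 0.05})
print(codes)  # frequent letters get short codes, i.e. fewer switch hits
```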

  2. Statistical Method to Overcome Overfitting Issue in Rational Function Models

    Alizadeh Moghaddam, S. H.; Mokhtarzade, M.; Alizadeh Naeini, A.; Alizadeh Moghaddam, S. A.

    2017-09-01

    Rational function models (RFMs) are known as one of the most appealing models and are extensively applied in the geometric correction of satellite images and in map production. Overfitting is a common issue in the case of terrain-dependent RFMs that degrades the accuracy of RFM-derived geospatial products. This issue, resulting from the high number of RFM parameters, leads to ill-posedness of the RFMs. To tackle this problem, in this study, a fast and robust statistical approach is proposed and compared to the Tikhonov regularization (TR) method, a frequently used solution to RFM overfitting. In the proposed method, a statistical test, namely a significance test, is applied to search for the RFM parameters that are resistant to the overfitting issue. The performance of the proposed method was evaluated on two real data sets of Cartosat-1 satellite images. The obtained results demonstrate the efficiency of the proposed method in terms of the achievable level of accuracy. This technique, indeed, shows an improvement of 50-80% over the TR.
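
    A hedged sketch of significance-test-based parameter screening in the same spirit (not necessarily the authors' exact procedure), on synthetic data:

```python
# Sketch: fit a least-squares model, compute a t-statistic for each
# coefficient, and drop insignificant terms to reduce overfitting.
# The data and model are synthetic stand-ins for RFM coefficients.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, p = 60, 8
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([2.0, 1.5, 0.0, 0.0, -1.0, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.normal(0, 0.5, n)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
dof = n - p
s2 = resid @ resid / dof
cov = s2 * np.linalg.inv(X.T @ X)
t = beta / np.sqrt(np.diag(cov))
pvals = 2 * stats.t.sf(np.abs(t), dof)

keep = pvals < 0.05   # retain only coefficients significant at the 5% level
print("kept terms:", np.where(keep)[0], "p-values:", np.round(pvals, 3))
```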

  3. Radiological decontamination, survey, and statistical release method for vehicles

    Goodwill, M.E.; Lively, J.W.; Morris, R.L.

    1996-06-01

    Earth-moving vehicles (e.g., dump trucks, belly dumps) commonly haul radiologically contaminated materials from a site being remediated to a disposal site. Traditionally, each vehicle must be surveyed before being released. The logistical difficulties of implementing the traditional approach on a large scale demand that an alternative be devised. A statistical method for assessing product quality from a continuous process was adapted to the vehicle decontamination process. This method produced a sampling scheme that automatically compensates and accommodates fluctuating batch sizes and changing conditions without the need to modify or rectify the sampling scheme in the field. Vehicles are randomly selected (sampled) upon completion of the decontamination process to be surveyed for residual radioactive surface contamination. The frequency of sampling is based on the expected number of vehicles passing through the decontamination process in a given period and the confidence level desired. This process has been successfully used for 1 year at the former uranium millsite in Monticello, Utah (a cleanup site regulated under the Comprehensive Environmental Response, Compensation, and Liability Act). The method forces improvement in the quality of the decontamination process and results in a lower likelihood that vehicles exceeding the surface contamination standards are offered for survey. Implementation of this statistical sampling method on Monticello projects has resulted in more efficient processing of vehicles through decontamination and radiological release, saved hundreds of hours of processing time, provided a high level of confidence that release limits are met, and improved the radiological cleanliness of vehicles leaving the controlled site

  4. Statistics

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  5. [Reasearch on evolution and transition of processing method of fuzi in ancient and modern times].

    Liu, Chan-Chan; Cheng, Ming-En; Duan, Hai-Yan; Peng, Hua-Sheng

    2014-04-01

    Fuzi is a medicine used for rescuing from collapse by restoring yang, as well as a famous toxic traditional Chinese medicine. In order to ensure efficacy and safe medication, Fuzi has mostly been applied after being processed. Different Fuzi processing methods have been recorded by doctors of previous generations, and there are also differences between the Fuzi processing methods recorded in modern pharmacopeia and those in ancient medical books. In this study, the authors traced back to medical books between the Han Dynasty and the period of the Republic of China, and summarized the Fuzi processing methods collected in ancient and modern literature. According to the results, Fuzi processing and usage methods have changed along with the succession of dynasties, with differences between ancient and modern processing methods. Before the Tang Dynasty, Fuzi had mostly been processed and soaked. From the Tang to the Ming Dynasties, Fuzi had mostly been processed, soaked and stir-fried. During the Qing Dynasty, Fuzi had mostly been soaked and boiled. In modern times, Fuzi is mostly processed by being boiled and soaked. Before the Tang Dynasty, whole pieces of Fuzi herbs or their fragments had been applied in medicines, whereas their fragments are primarily used in modern times. Because different processing methods have great impacts on the toxicity of Fuzi, it is suggested to study Fuzi processing methods.

  6. ACARP Project C10059. ACARP manual of modern coal testing methods. Volume 2: Appendices

    Sakurovs, R.; Creelman, R.; Pohl, J.; Juniper, L. [CSIRO Energy Technology, Sydney, NSW (Australia)

    2002-07-01

    The Manual summarises the purpose, applicability, and limitations of a range of standard and modern coal testing methods that have potential to assist the coal company technologist to better evaluate coal performance. It is presented in two volumes. This second volume provides more detailed information regarding the methods discussed in Volume 1.

  7. The Innovation Blaze-Method of Development Professional Thinking Designers in the Modern Higher Education

    Alekseeva, Irina V.; Barsukova, Natalia I.; Pallotta, Valentina I.; Skovorodnikova, Nadia A.

    2017-01-01

    This article demonstrates the urgency of the problem of developing the professional thinking of students studying design under the modern conditions of higher education. The authors substantiate the need for an innovative Blaise-method for the development of professional design thinking of students in higher education. "Blaise-method" named by us in…

  8. Mathematical and statistical methods for actuarial sciences and finance

    Sibillo, Marilena

    2014-01-01

    The interaction between mathematicians and statisticians working in the actuarial and financial fields is producing numerous meaningful scientific results. This volume, comprising a series of four-page papers, gathers new ideas relating to mathematical and statistical methods in the actuarial sciences and finance. The book covers a variety of topics of interest from both theoretical and applied perspectives, including: actuarial models; alternative testing approaches; behavioral finance; clustering techniques; coherent and non-coherent risk measures; credit-scoring approaches; data envelopment analysis; dynamic stochastic programming; financial contagion models; financial ratios; intelligent financial trading systems; mixture normality approaches; Monte Carlo-based methodologies; multicriteria methods; nonlinear parameter estimation techniques; nonlinear threshold models; particle swarm optimization; performance measures; portfolio optimization; pricing methods for structured and non-structured derivatives; r...

  9. Evolutionary Computation Methods and their applications in Statistics

    Francesco Battaglia

    2013-05-01

    Full Text Available A brief discussion of the genesis of evolutionary computation methods, their relationship to artificial intelligence, and the contribution of genetics and Darwin’s theory of natural evolution is provided. Then, the main evolutionary computation methods are illustrated: evolution strategies, genetic algorithms, estimation of distribution algorithms, differential evolution, and a brief description of some evolutionary behavior methods such as ant colony and particle swarm optimization. We also discuss the role of the genetic algorithm for multivariate probability distribution random generation, rather than as a function optimizer. Finally, some relevant applications of genetic algorithm to statistical problems are reviewed: selection of variables in regression, time series model building, outlier identification, cluster analysis, design of experiments.

  10. Hybrid perturbation methods based on statistical time series models

    San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario

    2016-04-01

    In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies, derived from the fact that, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered, not to mention that mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the missing dynamics in the previously integrated approximation. This combination results in a precision improvement of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators, formed by the combination of three different orders of approximation of an analytical theory with a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three analytical components considered are the integration of the Kepler problem, a first-order and a second-order analytical theory, whereas the prediction technique is the same in all three cases, namely an additive Holt-Winters method.
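
    A sketch of the prediction component on synthetic data, using an additive Holt-Winters model from statsmodels; the residual series below is invented:

```python
# Sketch: fit an additive Holt-Winters model to the residual time series
# between a precise propagation and an analytical approximation, then
# forecast the missing dynamics. The residual series here is synthetic.
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(5)
t = np.arange(300)
# Hypothetical periodic error left over by a low-order analytical theory
residual = 0.5 * np.sin(2 * np.pi * t / 50) + 0.01 * t + rng.normal(0, 0.02, 300)

fit = ExponentialSmoothing(
    residual[:250], trend="add", seasonal="add", seasonal_periods=50
).fit()
forecast = fit.forecast(50)

# The hybrid propagator would add this forecast to the analytical solution
rmse = np.sqrt(np.mean((forecast - residual[250:]) ** 2))
print(f"forecast RMSE over the hold-out span: {rmse:.3f}")
```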

  11. Classification of Specialized Farms Applying Multivariate Statistical Methods

    Zuzana Hloušková

    2017-01-01

    Full Text Available The paper is aimed at the application of advanced multivariate statistical methods to classifying cattle breeding farming enterprises by their economic size. An advantage of the model is its ability to use a few selected indicators, compared with the complex methodology of the current classification model, which requires knowledge of the detailed structure of the herd turnover and the structure of cultivated crops. The output of the paper is intended to be applied within farm structure research focused on the future development of Czech agriculture. As the data source, the farming enterprises database for 2014 from the FADN CZ system has been used. The predictive model proposed exploits knowledge of the actual size classes of the farms tested. The outcomes of the linear discriminant analysis multifactor classification method support the classification of farming enterprises into the group of Small farms (98% classified correctly) and of Large and Very Large enterprises (100% classified correctly). The Medium Size farms were correctly classified at only 58.11%. Partial shortcomings of the process presented were found when discriminating Medium and Small farms.
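
    A minimal sketch of linear discriminant classification of farms into size classes; the indicator names and data are invented stand-ins for the FADN CZ variables:

```python
# Classify farms into size classes from a few economic indicators using
# linear discriminant analysis. Everything here is synthetic for illustration.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(11)
# Synthetic indicators: standard output, utilized area, livestock units
X = rng.normal(size=(300, 3)) + np.repeat([[0, 0, 0], [2, 1, 2], [4, 3, 4]], 100, axis=0)
y = np.repeat(["small", "medium", "large"], 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
print(f"hold-out accuracy: {lda.score(X_te, y_te):.2f}")
```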

  12. Application of mathematical statistics methods to study fluorite deposits

    Chermeninov, V.B.

    1980-01-01

    The applicability of mathematical-statistical methods for increasing the reliability of sampling and for geological tasks (study of regularities of ore formation) is considered. The reliability of core sampling (with regard to the selective abrasion of fluorite) and of neutron activation logging for fluorine is compared. The core sampling data are characterized by higher dispersion than the neutron activation logging results (mean values of the variation coefficients are 75% and 56%, respectively). However, the hypothesis of the equality of the two sample means is confirmed; this fact testifies to the absence of considerable variability of the ore bodies.

  13. Algebraic methods in statistical mechanics and quantum field theory

    Emch, Dr Gérard G

    2009-01-01

    This systematic algebraic approach concerns problems involving a large number of degrees of freedom. It extends the traditional formalism of quantum mechanics, and it eliminates conceptual and mathematical difficulties common to the development of statistical mechanics and quantum field theory. Further, the approach is linked to research in applied and pure mathematics, offering a reflection of the interplay between formulation of physical motivations and self-contained descriptions of the mathematical methods.The four-part treatment begins with a survey of algebraic approaches to certain phys

  14. Statistical methods for determining the effect of mammography screening

    Lophaven, Søren

    2016-01-01

    In an overview of five randomised controlled trials from Sweden, a reduction of 29% was found in breast cancer mortality in women aged 50-69 at randomisation after a follow-up of 5-13 years. Organised, population-based mammography service screening was introduced on the basis of these results ... in 2007-2008. Women aged 50-69 were invited to screening every second year. Taking advantage of the registers of population and health, we present statistical methods for evaluating the effect of mammography screening on breast cancer mortality (Olsen et al. 2005, Njor et al. 2015 and Weedon-Fekjær et al...

  15. Integration of modern statistical tools for the analysis of climate extremes into the web-GIS “CLIMATE”

    Ryazanova, A. A.; Okladnikov, I. G.; Gordov, E. P.

    2017-11-01

    The frequency of occurrence and magnitude of precipitation and temperature extreme events show positive trends in several geographical regions. These events must be analyzed and studied in order to better understand their impact on the environment, predict their occurrence, and mitigate their effects. For this purpose, we augmented the web-GIS "CLIMATE" with a dedicated statistical package developed in the R language. The web-GIS "CLIMATE" is a software platform for cloud storage, processing and visualization of distributed archives of spatial datasets. It is based on a combined use of web and GIS technologies with reliable procedures for searching, extracting, processing, and visualizing spatial data archives. The system provides a set of thematic online tools for the complex analysis of current and future climate changes and their effects on the environment. The package includes new powerful methods of time-dependent statistics of extremes, quantile regression and the copula approach for the detailed analysis of various climate extreme events. Specifically, the very promising copula approach allows one to obtain the structural connections between the extremes and various environmental characteristics. The new statistical methods integrated into the web-GIS "CLIMATE" can significantly facilitate and accelerate the complex analysis of climate extremes using only a desktop PC connected to the Internet.
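
    As one hedged illustration of the quantile-regression tooling mentioned above (the actual package is implemented in R; this sketch uses Python's statsmodels on synthetic data):

```python
# Quantile regression for a high quantile of a climate variable as a
# function of time. Data are synthetic stand-ins for station records.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(13)
years = np.arange(1960, 2020)
# Synthetic summer maxima with a slowly rising upper tail
tmax = 30 + 0.03 * (years - 1960) + rng.gumbel(0, 1.5, years.size)

X = sm.add_constant(years)
res = sm.QuantReg(tmax, X).fit(q=0.95)   # trend in the 95th percentile
print(res.params)   # intercept and warming trend of the extreme quantile
```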

  16. A method for statistically comparing spatial distribution maps

    Reynolds Mary G

    2009-01-01

    Full Text Available Abstract Background Ecological niche modeling is a method for estimation of species distributions based on certain ecological parameters. Thus far, empirical determination of significant differences between independently generated distribution maps for a single species (maps which are created through equivalent processes, but with different ecological input parameters) has been challenging. Results We describe a method for comparing model outcomes, which allows a statistical evaluation of whether the strength of prediction and breadth of predicted areas is measurably different between projected distributions. To create ecological niche models for statistical comparison, we utilized GARP (Genetic Algorithm for Rule-Set Production) software to generate ecological niche models of human monkeypox in Africa. We created several models, keeping constant the case location input records for each model but varying the ecological input data. In order to assess the relative importance of each ecological parameter included in the development of the individual predicted distributions, we performed pixel-to-pixel comparisons between model outcomes and calculated the mean difference in pixel scores. We used a two-sample Student's t-test (assuming as null hypothesis that both maps were identical to each other regardless of which input parameters were used) to examine whether the mean difference in corresponding pixel scores from one map to another was greater than would be expected by chance alone. We also utilized weighted kappa statistics, frequency distributions, and percent difference to look at the disparities in pixel scores. Multiple independent statistical tests indicated precipitation as the single most important independent ecological parameter in the niche model for human monkeypox disease. Conclusion In addition to improving our understanding of the natural factors influencing the distribution of human monkeypox disease, such pixel-to-pixel comparison
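
    The pixel-to-pixel comparison reduces to a simple computation, sketched here on synthetic score maps standing in for GARP output:

```python
# Compare two model output maps by the mean difference in pixel scores and
# a two-sample t-test. The "maps" are synthetic arrays for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(17)
map_a = rng.integers(0, 11, size=(200, 200))          # pixel scores 0-10
map_b = np.clip(map_a + rng.integers(-2, 3, size=(200, 200)), 0, 10)

diff = map_a.astype(float) - map_b.astype(float)
print(f"mean pixel-score difference: {diff.mean():.3f}")

# Null hypothesis: both maps have the same distribution of pixel scores
t_stat, p_val = stats.ttest_ind(map_a.ravel(), map_b.ravel())
print(f"t={t_stat:.2f}, p={p_val:.3g}")
```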

  17. Statistical methods in the mechanical design of fuel assemblies

    Radsak, C.; Streit, D.; Muench, C.J. [AREVA NP GmbH, Erlangen (Germany)

    2013-07-01

    The mechanical design of a fuel assembly is still mainly performed in a deterministic way. This conservative approach is, however, not suitable to provide a realistic quantification of the design margins with respect to licensing criteria for more and more demanding operating conditions (power upgrades, burnup increase, ...). This quantification can be provided by statistical methods utilizing all available information (e.g. from manufacturing, experience feedback etc.) on the topic under consideration. During optimization, e.g. of the holddown system, certain objectives in the mechanical design of a fuel assembly (FA) can contradict each other, such as holddown forces sufficient to prevent fuel assembly lift-off versus reducing the holddown forces to minimize axial loads on the fuel assembly structure and ensure no negative effect on control rod movement. By using a statistical method the fuel assembly design can be optimized much better with respect to these objectives than would be possible with a deterministic approach. This leads to a more realistic assessment and a safer way of operating fuel assemblies. Statistical models are defined on the one hand by the quantile that has to be maintained concerning the design limit requirements (e.g. one FA quantile) and on the other hand by the confidence level which has to be met. Using the above example of the holddown force, a feasible quantile can be defined based on the requirement that less than one fuel assembly (quantile > 192/193 = 99.5%) in the core violates the holddown force limit with a confidence of 95%. (orig.)
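
    A worked example of the quantile/confidence arithmetic above, using the classical one-sided Wilks formula (an assumption; the paper does not state its exact sampling formula):

```python
# For a one-sided 99.5%/95% statement, the smallest Monte Carlo sample size
# N satisfies 1 - q**N >= beta, with q = 0.995 and beta = 0.95.
import math

q, beta = 0.995, 0.95
N = math.ceil(math.log(1 - beta) / math.log(q))
print(N)  # 598 runs: the largest observed value then bounds the
          # 99.5% quantile of the output with 95% confidence
```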

  18. Statistics

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  19. A novel statistical method for classifying habitat generalists and specialists

    Chazdon, Robin L; Chao, Anne; Colwell, Robert K

    2011-01-01

    in second-growth (SG) and old-growth (OG) rain forests in the Caribbean lowlands of northeastern Costa Rica. We evaluate the multinomial model in detail for the tree data set. Our results for birds were highly concordant with a previous nonstatistical classification, but our method classified a higher......: (1) generalist; (2) habitat A specialist; (3) habitat B specialist; and (4) too rare to classify with confidence. We illustrate our multinomial classification method using two contrasting data sets: (1) bird abundance in woodland and heath habitats in southeastern Australia and (2) tree abundance...... fraction (57.7%) of bird species with statistical confidence. Based on a conservative specialization threshold and adjustment for multiple comparisons, 64.4% of tree species in the full sample were too rare to classify with confidence. Among the species classified, OG specialists constituted the largest...

  20. Statistically Consistent k-mer Methods for Phylogenetic Tree Reconstruction.

    Allman, Elizabeth S; Rhodes, John A; Sullivant, Seth

    2017-02-01

    Frequencies of k-mers in sequences are sometimes used as a basis for inferring phylogenetic trees without first obtaining a multiple sequence alignment. We show that a standard approach of using the squared Euclidean distance between k-mer vectors to approximate a tree metric can be statistically inconsistent. To remedy this, we derive model-based distance corrections for orthologous sequences without gaps, which lead to consistent tree inference. The identifiability of model parameters from k-mer frequencies is also studied. Finally, we report simulations showing that the corrected distance outperforms many other k-mer methods, even when sequences are generated with an insertion and deletion process. These results have implications for multiple sequence alignment as well since k-mer methods are usually the first step in constructing a guide tree for such algorithms.
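
    A minimal sketch of the uncorrected k-mer distance the paper starts from:

```python
# Represent each sequence by its k-mer frequency vector and compare
# sequences by squared Euclidean distance (the uncorrected distance the
# paper shows can be statistically inconsistent). Sequences are invented.
from itertools import product
import numpy as np

def kmer_vector(seq: str, k: int = 3) -> np.ndarray:
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    index = {km: i for i, km in enumerate(kmers)}
    v = np.zeros(len(kmers))
    for i in range(len(seq) - k + 1):
        v[index[seq[i : i + k]]] += 1
    return v / v.sum()   # normalize counts to frequencies

s1 = "ACGTACGTGGTACCATGCAA"
s2 = "ACGTTCGTGGAACCATGCTA"
d = np.sum((kmer_vector(s1) - kmer_vector(s2)) ** 2)
print(f"squared Euclidean k-mer distance: {d:.4f}")
```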

  1. Safety by statistics? A critical view on statistical methods applied in health physics

    Kraut, W.

    2016-01-01

    The only proper way to describe uncertainties in health physics is by statistical means. But statistics can never replace your personal evaluation of an effect, nor can statistics transmute randomness into certainty like an 'uncertainty laundry'. The paper discusses these problems in routine practical work.

  2. Trends in citations to books on epidemiological and statistical methods in the biomedical literature.

    Miquel Porta

    Full Text Available BACKGROUND: There are no analyses of citations to books on epidemiological and statistical methods in the biomedical literature. Such analyses may shed light on how concepts and methods changed while biomedical research evolved. Our aim was to analyze the number and time trends of citations received from biomedical articles by books on epidemiological and statistical methods, and related disciplines. METHODS AND FINDINGS: The data source was the Web of Science. The study books were published between 1957 and 2010. The first year of publication of the citing articles was 1945. We identified 125 books that received at least 25 citations. Books first published in 1980-1989 had the highest total and median number of citations per year. Nine of the 10 most cited texts focused on statistical methods. Hosmer & Lemeshow's Applied logistic regression received the highest number of citations and highest average annual rate. It was followed by books by Fleiss, Armitage, et al., Rothman, et al., and Kalbfleisch and Prentice. Fifth in citations per year was Sackett, et al., Evidence-based medicine. The rise of multivariate methods, clinical epidemiology, or nutritional epidemiology was reflected in the citation trends. Educational textbooks, practice-oriented books, books on epidemiological substantive knowledge, and on theory and health policies were much less cited. None of the 25 top-cited books had the theoretical or sociopolitical scope of works by Cochrane, McKeown, Rose, or Morris. CONCLUSIONS: Books were mainly cited to reference methods. Books first published in the 1980s continue to be most influential. Older books on theory and policies were rooted in societal and general medical concerns, while the most modern books are almost purely on methods.

  3. A Statistic-Based Calibration Method for TIADC System

    Kuojun Yang

    2015-01-01

    Full Text Available Time-interleaved technique is widely used to increase the sampling rate of an analog-to-digital converter (ADC). However, channel mismatches degrade the performance of a time-interleaved ADC (TIADC). Therefore, a statistic-based calibration method for TIADC is proposed in this paper. The average value of sampling points is utilized to calculate the offset error, and the summation of sampling points is used to calculate the gain error. After the offset and gain errors are obtained, they are calibrated by offset and gain adjustment elements in the ADC. Timing skew is calibrated by an iterative method. The product of sampling points of two adjacent subchannels is used as a metric for calibration. The proposed method is employed to calibrate mismatches in a four-channel 5 GS/s TIADC system. Simulation results show that the proposed method can estimate mismatches accurately in a wide frequency range. It is also proved that an accurate estimation can be obtained even if the signal-to-noise ratio (SNR) of the input signal is 20 dB. Furthermore, the results obtained from a real four-channel 5 GS/s TIADC system demonstrate the effectiveness of the proposed method. We can see that the spectral spurs due to mismatches have been effectively eliminated after calibration.
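
    A minimal sketch of the offset and gain part of such a statistic-based calibration is given below (the iterative timing-skew step is omitted). Per the abstract, the offset is estimated from the average of each sub-channel's samples; the gain estimate here uses an RMS amplitude as a stand-in for the abstract's "summation of sampling points". Array shapes and names are illustrative assumptions.

        import numpy as np

        def estimate_offset_gain(x: np.ndarray):
            """x has shape (n_channels, n_samples): de-interleaved sub-ADC outputs,
            assuming a zero-mean input that excites all channels identically."""
            offsets = x.mean(axis=1)                      # statistic: per-channel average
            centered = x - offsets[:, None]
            rms = np.sqrt((centered ** 2).mean(axis=1))   # statistic: per-channel amplitude
            gains = rms / rms.mean()
            return offsets, gains

        def calibrate(x: np.ndarray) -> np.ndarray:
            offsets, gains = estimate_offset_gain(x)
            return (x - offsets[:, None]) / gains[:, None]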

  4. Modern methods of surveyor observations in opencast mining under complex hydrogeological conditions.

    Usoltseva, L. A.; Lushpei, V. P.; Mursin, VA

    2017-10-01

    The article considers the possibility of applying modern methods of surveyor observation to the safety of opencast mining operations, in order to improve industrial safety in the Primorsky Territory, as well as their use in the educational process. Industrial safety in the management of surface mining depends largely on the methods applied for assessing the stability of pit walls and dump slopes under complex mining and hydrogeological conditions.

  5. A Literature Review Fuzzy Pay-Off-Method – A Modern Approach in Valuation

    Daniel Manaţe

    2015-01-01

    Full Text Available This article presents a modern approach to the analysis of discounted cash flows. The approach is based on the Fuzzy Pay-Off Method (FPOM) for Real Option Valuation (ROV). The article describes a few types of models for the valuation of real options currently in use. In support of the chosen FPOM, we include the mathematical model that underlies the method and a case study.
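
    Since the record does not reproduce the mathematics, here is a rough numerical sketch of the pay-off idea for a triangular fuzzy NPV (a, b, c): the real option value is the share of the pay-off distribution lying above zero times a mean of that positive side. The membership-weighted mean used below only approximates the possibilistic mean of the closed-form method, and all numbers are invented.

        import numpy as np

        def _trapz(y: np.ndarray, x: np.ndarray) -> float:
            return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

        def fuzzy_payoff_value(a: float, b: float, c: float, n: int = 10001) -> float:
            """Option value from a triangular fuzzy NPV with a <= b <= c:
            (positive area / total area) * weighted mean of the positive side."""
            x = np.linspace(a, c, n)
            mu = np.where(x <= b,
                          (x - a) / (b - a) if b > a else np.ones_like(x),
                          (c - x) / (c - b) if c > b else np.ones_like(x))
            total_area = _trapz(mu, x)
            pos = x > 0
            if not pos.any() or total_area == 0.0:
                return 0.0
            pos_area = _trapz(mu[pos], x[pos])
            if pos_area == 0.0:
                return 0.0
            mean_pos = _trapz(x[pos] * mu[pos], x[pos]) / pos_area
            return (pos_area / total_area) * mean_pos

        print(fuzzy_payoff_value(-10.0, 15.0, 40.0))   # NPV scenarios, e.g. in $m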

  6. The perturbed angular correlation method - a modern technique in studying solids

    Unterricker, S.; Hunger, H.J.

    1979-01-01

    Starting from theoretical fundamentals, the differential perturbed angular correlation method is explained. Using the probe nucleus 111Cd, the magnetic dipole interaction in FexAl1-x alloys and the electric quadrupole interaction in Cd have been measured. The perturbed angular correlation method is a modern nuclear measurement technique and can be applied to the study of ordering processes, phase transformations and radiation damage in metals, semiconductors and insulators

  7. Are Statistics Labs Worth the Effort?--Comparison of Introductory Statistics Courses Using Different Teaching Methods

    Jose H. Guardiola

    2010-01-01

    Full Text Available This paper compares the academic performance of students in three similar elementary statistics courses taught by the same instructor, but with the lab component differing among the three. One course is traditionally taught without a lab component; the second with a lab component using scenarios and an extensive use of technology, but without explicit coordination between lab and lecture; and the third using a lab component with an extensive use of technology that carefully coordinates the lab with the lecture. Extensive use of technology means, in this context, using Minitab software in the lab section, doing homework and quizzes using MyMathLab©, and emphasizing interpretation of computer output during lectures. Initially, an online instrument based on Gardner's multiple intelligences theory is given to students to try to identify students' learning styles and intelligence types as covariates. An analysis of covariance is performed in order to compare differences in achievement. In this study there is no attempt to measure differences in student performance across the different treatments. The purpose of this study is to find indications of associations among variables that support the claim that statistics labs could be associated with superior academic achievement in one of these three instructional environments. Also, this study tries to identify individual student characteristics that could be associated with superior academic performance. This study did not find evidence of any individual student characteristics that could be associated with superior achievement. The response variable was computed as the percentage of correct answers for the three exams during the semester added together. The results of this study indicate a significant difference across these three different instructional methods, showing significantly higher mean scores for the response variable for students taking the lab component that was carefully coordinated with

  8. Trends in citations to books on epidemiological and statistical methods in the biomedical literature.

    Porta, Miquel; Vandenbroucke, Jan P; Ioannidis, John P A; Sanz, Sergio; Fernandez, Esteve; Bhopal, Raj; Morabia, Alfredo; Victora, Cesar; Lopez, Tomàs

    2013-01-01

    There are no analyses of citations to books on epidemiological and statistical methods in the biomedical literature. Such analyses may shed light on how concepts and methods changed while biomedical research evolved. Our aim was to analyze the number and time trends of citations received from biomedical articles by books on epidemiological and statistical methods, and related disciplines. The data source was the Web of Science. The study books were published between 1957 and 2010. The first year of publication of the citing articles was 1945. We identified 125 books that received at least 25 citations. Books first published in 1980-1989 had the highest total and median number of citations per year. Nine of the 10 most cited texts focused on statistical methods. Hosmer & Lemeshow's Applied logistic regression received the highest number of citations and highest average annual rate. It was followed by books by Fleiss, Armitage, et al., Rothman, et al., and Kalbfleisch and Prentice. Fifth in citations per year was Sackett, et al., Evidence-based medicine. The rise of multivariate methods, clinical epidemiology, or nutritional epidemiology was reflected in the citation trends. Educational textbooks, practice-oriented books, books on epidemiological substantive knowledge, and on theory and health policies were much less cited. None of the 25 top-cited books had the theoretical or sociopolitical scope of works by Cochrane, McKeown, Rose, or Morris. Books were mainly cited to reference methods. Books first published in the 1980s continue to be most influential. Older books on theory and policies were rooted in societal and general medical concerns, while the most modern books are almost purely on methods.

  9. Teaching Biology through Statistics: Application of Statistical Methods in Genetics and Zoology Courses

    Colon-Berlingeri, Migdalisel; Burrowes, Patricia A.

    2011-01-01

    Incorporation of mathematics into biology curricula is critical to underscore for undergraduate students the relevance of mathematics to most fields of biology and the usefulness of developing quantitative process skills demanded in modern biology. At our institution, we have made significant changes to better integrate mathematics into the…

  10. Some aspects of Trim-algorithm modernization for Monte-Carlo method

    Dovnar, S.V.; Grigor'ev, V.V.; Kamyshan, M.A.; Leont'ev, A.V.; Yanusko, S.V.

    2001-01-01

    Some aspects of Trim-algorithm modernization in the Monte-Carlo method are discussed. This modification makes it possible to increase the universality of the program's work with various ion-atom interaction potentials and to improve the calculation precision for the scattering angle θc.

  11. Statistics

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  12. Statistical methods for mechanistic model validation: Salt Repository Project

    Eggett, D.L.

    1988-07-01

    As part of the Department of Energy's Salt Repository Program, Pacific Northwest Laboratory (PNL) is studying the emplacement of nuclear waste containers in a salt repository. One objective of the SRP program is to develop an overall waste package component model which adequately describes such phenomena as container corrosion, waste form leaching, spent fuel degradation, etc., which are possible in the salt repository environment. The form of this model will be proposed, based on scientific principles and relevant salt repository conditions with supporting data. The model will be used to predict the future characteristics of the near field environment. This involves several different submodels such as the amount of time it takes a brine solution to contact a canister in the repository, how long it takes a canister to corrode and expose its contents to the brine, the leach rate of the contents of the canister, etc. These submodels are often tested in a laboratory and should be statistically validated (in this context, validate means to demonstrate that the model adequately describes the data) before they can be incorporated into the waste package component model. This report describes statistical methods for validating these models. 13 refs., 1 fig., 3 tabs

  13. Statistical methods of evaluating and comparing imaging techniques

    Freedman, L.S.

    1987-01-01

    Over the past 20 years several new methods of generating images of internal organs and the anatomy of the body have been developed and used to enhance the accuracy of diagnosis and treatment. These include ultrasonic scanning, radioisotope scanning, computerised X-ray tomography (CT) and magnetic resonance imaging (MRI). The new techniques have made a considerable impact on radiological practice in hospital departments, not least on the investigational process for patients suspected or known to have malignant disease. As a consequence of the increased range of imaging techniques now available, there has developed a need to evaluate and compare their usefulness. Over the past 10 years formal studies of the application of imaging technology have been conducted and many reports have appeared in the literature. These studies cover a range of clinical situations. Likewise, the methodologies employed for evaluating and comparing the techniques in question have differed widely. While not attempting an exhaustive review of the clinical studies which have been reported, this paper aims to examine the statistical designs and analyses which have been used. First a brief review of the different types of study is given. Examples of each type are then chosen to illustrate statistical issues related to their design and analysis. In the final sections it is argued that a form of classification for these different types of study might be helpful in clarifying relationships between them and bringing a perspective to the field. A classification based upon a limited analogy with clinical trials is suggested

  14. Development and testing of improved statistical wind power forecasting methods.

    Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J. (Decision and Information Sciences); (INESC Porto)

    2011-12-06

    (with spatial and/or temporal dependence). Statistical approaches to uncertainty forecasting basically consist of estimating the uncertainty based on observed forecasting errors. Quantile regression (QR) is currently a commonly used approach in uncertainty forecasting. In Chapter 3, we propose new statistical approaches to the uncertainty estimation problem by employing kernel density forecast (KDF) methods. We use two estimators in both offline and time-adaptive modes, namely, the Nadaraya-Watson (NW) and Quantile-copula (QC) estimators. We conduct detailed tests of the new approaches using QR as a benchmark. One of the major issues in wind power generation is sudden and large changes of wind power output over a short period of time, namely ramping events. In Chapter 4, we perform a comparative study of existing definitions and methodologies for ramp forecasting. We also introduce a new probabilistic method for ramp event detection. The method starts with a stochastic algorithm that generates wind power scenarios, which are passed through a high-pass filter for ramp detection and estimation of the likelihood of ramp events to happen. The report is organized as follows: Chapter 2 presents the results of the application of ITL training criteria to deterministic WPF; Chapter 3 reports the study on probabilistic WPF, including new contributions to wind power uncertainty forecasting; Chapter 4 presents a new method to predict and visualize ramp events, comparing it with state-of-the-art methodologies; Chapter 5 briefly summarizes the main findings and contributions of this report.
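
    Quantile regression, the benchmark of Chapter 3, fits each quantile level by minimizing the pinball (check) loss; the generic evaluation below is a sketch under assumed names and data, not the report's code.

        import numpy as np

        def pinball_loss(y_true: np.ndarray, y_pred: np.ndarray, tau: float) -> float:
            """Average check loss for quantile level tau in (0, 1):
            tau * (y - q) when under-forecasting, (1 - tau) * (q - y) otherwise."""
            diff = y_true - y_pred
            return float(np.mean(np.maximum(tau * diff, (tau - 1.0) * diff)))

        # lower loss -> better calibrated 90% quantile forecast (values invented)
        obs = np.array([10.0, 12.5, 8.0, 11.0])
        q90 = np.array([13.0, 13.5, 11.0, 12.0])
        print(pinball_loss(obs, q90, tau=0.9))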

  15. Data and statistical methods for analysis of trends and patterns

    Atwood, C.L.; Gentillon, C.D.; Wilson, G.E.

    1992-11-01

    This report summarizes topics considered at a working meeting on data and statistical methods for analysis of trends and patterns in US commercial nuclear power plants. This meeting was sponsored by the Office of Analysis and Evaluation of Operational Data (AEOD) of the Nuclear Regulatory Commission (NRC). Three data sets are briefly described: Nuclear Plant Reliability Data System (NPRDS), Licensee Event Report (LER) data, and Performance Indicator data. Two types of study are emphasized: screening studies, to see if any trends or patterns appear to be present; and detailed studies, which are more concerned with checking the analysis assumptions, modeling any patterns that are present, and searching for causes. A prescription is given for a screening study, and ideas are suggested for a detailed study, when the data take any of three forms: counts of events per time, counts of events per demand, and non-event data

  16. Statistics

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  17. Statistics

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  18. Statistics

    1999-01-01

    For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  19. Implementation of statistical analysis methods for medical physics data

    Teixeira, Marilia S.; Pinto, Nivia G.P.; Barroso, Regina C.; Oliveira, Luis F.

    2009-01-01

    The objective of biomedical research with radiation of different natures is to contribute to the understanding of the basic physics and biochemistry of biological systems, disease diagnostics and the development of therapeutic techniques. The main benefits are: the cure of tumors through therapy, the early detection of diseases through diagnostics, the use as a prophylactic means for blood transfusion, etc. Therefore, for a better understanding of the biological interactions occurring after exposure to radiation, the optimization of therapeutic procedures and of strategies for the reduction of radioinduced effects is necessary. The applied physics group of the Physics Institute of UERJ has been working on the characterization of biological samples (human tissues, teeth, saliva, soil, plants, sediments, air, water, organic matrices, ceramics, fossil material, among others) using X-ray diffraction and X-ray fluorescence. The application of these techniques for the measurement, analysis and interpretation of the characteristics of biological tissues is attracting considerable interest in Medical and Environmental Physics. All quantitative data analysis must begin with the calculation of descriptive statistics (means and standard deviations) in order to obtain a previous notion of what the analysis will reveal. It is well known that the high values of standard deviation found in experimental measurements of biological samples can be attributed to biological factors, due to the specific characteristics of each individual (age, gender, environment, alimentary habits, etc). The main objective of this work is the development of a program for the use of specific statistical methods in the optimization of experimental data analysis. Since the specialized programs for this analysis are proprietary, another objective of this work is the implementation of a code which is free and can be shared by other research groups. As the program developed since the

  20. Statistically qualified neuro-analytic failure detection method and system

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    2002-03-02

    An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaption and stochastic model modification of the deterministic model adaptation. Deterministic model adaption involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.

  1. Improved Statistical Method For Hydrographic Climatic Records Quality Control

    Gourrion, J.; Szekely, T.

    2016-02-01

    Climate research benefits from the continuous development of global in-situ hydrographic networks in the last decades. Apart from the increasing volume of observations available on a large range of temporal and spatial scales, a critical aspect concerns the ability to constantly improve the quality of the datasets. In the context of the Coriolis Dataset for ReAnalysis (CORA) version 4.2, a new quality control method based on a local comparison to historical extreme values ever observed is developed, implemented and validated. Temperature, salinity and potential density validity intervals are directly estimated from minimum and maximum values from an historical reference dataset, rather than from traditional mean and standard deviation estimates. Such an approach avoids strong statistical assumptions on the data distributions such as unimodality, absence of skewness and spatially homogeneous kurtosis. As a new feature, it also allows addressing simultaneously the two main objectives of a quality control strategy, i.e. maximizing the number of good detections while minimizing the number of false alarms. The reference dataset is presently built from the fusion of 1) all ARGO profiles up to early 2014, 2) 3 historical CTD datasets and 3) the Sea Mammals CTD profiles from the MEOP database. All datasets are extensively and manually quality controlled. In this communication, the latest method validation results are also presented. The method has been implemented in the latest version of the CORA dataset and will benefit the next version of the Copernicus CMEMS dataset.
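
    The core idea is compact enough to sketch: validity intervals come from historical minima and maxima rather than from mean and standard deviation, and an observation is flagged when it falls outside everything ever observed in its region. The regional binning and names below are illustrative assumptions, not the CORA implementation.

        import numpy as np

        def build_validity_intervals(ref_values: np.ndarray, region_ids: np.ndarray,
                                     n_regions: int):
            """Historical min/max per region from a manually QC'ed reference set."""
            lo = np.full(n_regions, np.inf)
            hi = np.full(n_regions, -np.inf)
            for r in range(n_regions):
                vals = ref_values[region_ids == r]
                if vals.size:
                    lo[r], hi[r] = vals.min(), vals.max()
            return lo, hi

        def is_suspect(value: float, region: int, lo: np.ndarray, hi: np.ndarray) -> bool:
            """True when the observation lies outside the local historical extremes."""
            return not (lo[region] <= value <= hi[region])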

  2. Modern nuclear medicine methods as a topic of biophysics in veterinary training at UVM in Kosice

    Stanicova, J.; Lohajova, L.

    2004-01-01

    Diagnostic and therapeutic application of ionising radiation is very important in all branches of medicine, including veterinary medicine. In veterinary training at the University of Veterinary Medicine in Kosice (UVM), biophysics is a basic subject that provides the physical basis necessary for understanding subsequent subjects such as veterinary surgery, roentgenology and orthopedics. In view of this, traditional methods of radiology such as fluoroscopy, skiagraphy and tomography are explained. The emergence of image reconstruction theory, together with computers, led to qualitatively new solutions through the development of modern methods in radiology. Explaining the physical principles, advantages and disadvantages of these new methods is also important in veterinary training, although some of them are not yet used in veterinary practice. Two modern diagnostic methods of nuclear medicine (SPECT and PET) are discussed below. (authors)

  3. Place of modern imaging methods and their influence on the diagnostic process

    Petkov, D.; Lazarova, I.

    1991-01-01

    The main trends in the development of modern imaging diagnostic methods are presented: increasing the specificity of CT, nuclear magnetic resonance imaging, positron emission tomography, digital subtraction angiography, echography, etc., based on modern technical improvements; objective representation of the physiological and biochemical divergences in particular diseases; interventional radiology; integral application of different methods; improving the sensitivity and specificity of the methods based on developments in pharmacology (new contrast media, pharmaceuticals influencing the function of the examined organs, etc.); and the possibilities for data compilation and further computerized processing of primary data. Personal experience with the use of these methods in Bulgaria is reported. Attention is also called to the unfavourable impact of excessive technicization of the diagnostic and therapeutic process in health, deontological, economic and social respects. 15 refs

  4. A new method to determine the number of experimental data using statistical modeling methods

    Jung, Jung-Ho; Kang, Young-Jin; Lim, O-Kaung; Noh, Yoojeong [Pusan National University, Busan (Korea, Republic of)

    2017-06-15

    For analyzing the statistical performance of physical systems, statistical characteristics of physical parameters such as material properties need to be estimated by collecting experimental data. For accurate statistical modeling, many such experiments may be required, but data are usually quite limited owing to the cost and time constraints of experiments. In this study, a new method for determining a reasonable number of experimental data is proposed using an area metric, after obtaining statistical models using the information on the underlying distribution, the Sequential statistical modeling (SSM) approach, and the Kernel density estimation (KDE) approach. The area metric is used as a convergence criterion to determine the necessary and sufficient number of experimental data to be acquired. The proposed method is validated in simulations, using different statistical modeling methods, different true models, and different convergence criteria. An example data set with 29 data points describing the fatigue strength coefficient of SAE 950X is used for demonstrating the performance of the obtained statistical models that use a pre-determined number of experimental data in predicting the probability of failure for a target fatigue life.
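
    The area metric referred to above is, in essence, the area between two cumulative distribution functions. The sketch below computes it for two empirical samples; in the paper the comparison is between successively updated statistical models, so this is only an illustration of the metric itself.

        import numpy as np

        def ecdf_area_metric(sample_a: np.ndarray, sample_b: np.ndarray) -> float:
            """Area between the empirical CDFs of two samples over the pooled support."""
            grid = np.sort(np.concatenate([sample_a, sample_b]))
            fa = np.searchsorted(np.sort(sample_a), grid, side="right") / sample_a.size
            fb = np.searchsorted(np.sort(sample_b), grid, side="right") / sample_b.size
            return float(np.sum(np.abs(fa - fb)[:-1] * np.diff(grid)))

        # e.g. stop collecting data once the metric between successive fits stagnates
        rng = np.random.default_rng(0)
        print(ecdf_area_metric(rng.normal(size=20), rng.normal(size=25)))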

  5. Assessment Methods in Statistical Education An International Perspective

    Bidgood, Penelope; Jolliffe, Flavia

    2010-01-01

    This book is a collaboration from leading figures in statistical education and is designed primarily for academic audiences involved in teaching statistics and mathematics. The book is divided into four sections: (1) Assessment using real-world problems, (2) Assessing statistical thinking, (3) Individual assessment, and (4) Successful assessment strategies.

  6. Statistical analysis and interpretation of prenatal diagnostic imaging studies, Part 2: descriptive and inferential statistical methods.

    Tuuli, Methodius G; Odibo, Anthony O

    2011-08-01

    The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.

  7. Quality in statistics education : Determinants of course outcomes in methods & statistics education at universities and colleges

    Verhoeven, P.S.

    2009-01-01

    Although Statistics is not a very popular course according to most students, a majority of students still take it, as it is mandatory at most Social Science departments. It therefore takes special teaching skills to teach statistics. In order to do so it is essential for teachers to know what

  8. Statistics

    2003-01-01

    For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products

  9. Statistics

    2004-01-01

    For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees

  10. Statistics

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g., Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  11. Statistical method to compare massive parallel sequencing pipelines.

    Elsensohn, M H; Leblay, N; Dimassi, S; Campan-Fournier, A; Labalme, A; Roucher-Boulez, F; Sanlaville, D; Lesca, G; Bardel, C; Roy, P

    2017-03-01

    Today, sequencing is frequently carried out by Massive Parallel Sequencing (MPS), which drastically cuts sequencing time and expenses. Nevertheless, Sanger sequencing remains the main validation method to confirm the presence of variants. The analysis of MPS data involves the development of several bioinformatic tools, academic or commercial. We present here a statistical method to compare MPS pipelines and test it in a comparison between an academic (BWA-GATK) and a commercial pipeline (TMAP-NextGENe®), with and without reference to a gold standard (here, Sanger sequencing), on a panel of 41 genes in 43 epileptic patients. This method used the number of variants to fit log-linear models for pairwise agreements between pipelines. To assess the heterogeneity of the margins and the odds ratios of agreement, four log-linear models were used: a full model, a homogeneous-margin model, a model with a single odds ratio for all patients, and a model with a single intercept. Then a log-linear mixed model was fitted considering the biological variability as a random effect. Among the 390,339 base-pairs sequenced, TMAP-NextGENe® and BWA-GATK found, on average, 2253.49 and 1857.14 variants (single nucleotide variants and indels), respectively. Against the gold standard, the pipelines had similar sensitivities (63.47% vs. 63.42%) and close but significantly different specificities (99.57% vs. 99.65%; p < 0.001). Same-trend results were obtained when only single nucleotide variants were considered (99.98% specificity and 76.81% sensitivity for both pipelines). The method thus allows pipeline comparison and selection. It is generalizable to all types of MPS data and all pipelines.

  12. Hydrologic extremes - an intercomparison of multiple gridded statistical downscaling methods

    Werner, Arelia T.; Cannon, Alex J.

    2016-04-01

    Gridded statistical downscaling methods are the main means of preparing climate model data to drive distributed hydrological models. Past work on the validation of climate downscaling methods has focused on temperature and precipitation, with less attention paid to the ultimate outputs from hydrological models. Also, as attention shifts towards projections of extreme events, downscaling comparisons now commonly assess methods in terms of climate extremes, but hydrologic extremes are less well explored. Here, we test the ability of gridded downscaling models to replicate historical properties of climate and hydrologic extremes, as measured in terms of temporal sequencing (i.e. correlation tests) and distributional properties (i.e. tests for equality of probability distributions). Outputs from seven downscaling methods - bias correction constructed analogues (BCCA), double BCCA (DBCCA), BCCA with quantile mapping reordering (BCCAQ), bias correction spatial disaggregation (BCSD), BCSD using minimum/maximum temperature (BCSDX), the climate imprint delta method (CI), and bias corrected CI (BCCI) - are used to drive the Variable Infiltration Capacity (VIC) model over the snow-dominated Peace River basin, British Columbia. Outputs are tested using split-sample validation on 26 climate extremes indices (ClimDEX) and two hydrologic extremes indices (3-day peak flow and 7-day peak flow). To characterize observational uncertainty, four atmospheric reanalyses are used as climate model surrogates and two gridded observational data sets are used as downscaling target data. The skill of the downscaling methods generally depended on reanalysis and gridded observational data set. However, CI failed to reproduce the distribution and BCSD and BCSDX the timing of winter 7-day low-flow events, regardless of reanalysis or observational data set. Overall, DBCCA passed the greatest number of tests for the ClimDEX indices, while BCCAQ, which is designed to more accurately resolve event

  13. Multiresolution, Geometric, and Learning Methods in Statistical Image Processing, Object Recognition, and Sensor Fusion

    Willsky, Alan

    2004-01-01

    .... Our research blends methods from several fields: statistics and probability, signal and image processing, mathematical physics, scientific computing, statistical learning theory, and differential...

  14. Improved statistical method for temperature and salinity quality control

    Gourrion, Jérôme; Szekely, Tanguy

    2017-04-01

    Climate research and Ocean monitoring benefit from the continuous development of global in-situ hydrographic networks in the last decades. Apart from the increasing volume of observations available on a large range of temporal and spatial scales, a critical aspect concerns the ability to constantly improve the quality of the datasets. In the context of the Coriolis Dataset for ReAnalysis (CORA) version 4.2, a new quality control method based on a local comparison to historical extreme values ever observed is developed, implemented and validated. Temperature, salinity and potential density validity intervals are directly estimated from minimum and maximum values from an historical reference dataset, rather than from traditional mean and standard deviation estimates. Such an approach avoids strong statistical assumptions on the data distributions such as unimodality, absence of skewness and spatially homogeneous kurtosis. As a new feature, it also allows addressing simultaneously the two main objectives of an automatic quality control strategy, i.e. maximizing the number of good detections while minimizing the number of false alarms. The reference dataset is presently built from the fusion of 1) all ARGO profiles up to late 2015, 2) 3 historical CTD datasets and 3) the Sea Mammals CTD profiles from the MEOP database. All datasets are extensively and manually quality controlled. In this communication, the latest method validation results are also presented. The method has already been implemented in the latest version of the delayed-time CMEMS in-situ dataset and will be deployed soon in the equivalent near-real time products.

  15. DEVELOPMENT OF A METHOD FOR STATISTICAL ANALYSIS OF ACCURACY AND PROCESS STABILITY IN THE PRODUCTION OF EPOXY RESIN ED-20

    N. V. Zhelninskaya

    2015-01-01

    Full Text Available Statistical methods play an important role in the objective evaluation of the quantitative and qualitative characteristics of a process and are one of the most important elements of the quality assurance system in production and of total quality management. To produce a quality product, one must know the real accuracy of the existing equipment, determine whether the accuracy of a selected technological process complies with the specified accuracy of the products, and assess process stability. Most random events in life, particularly in manufacturing and scientific research, are characterized by the presence of a large number of random factors and are described by a normal distribution, which is the main one in many practical studies. Modern statistical methods are quite difficult to grasp and use widely in practice without in-depth mathematical training of all participants in the process. When the distribution of a random variable is known, one can obtain all the features of a batch of products and determine the mean value and the variance. Statistical control and quality control methods are used here in the analysis of the accuracy and stability of the technological process of production of epoxy resin ED-20. Numerical characteristics of the distribution law of the controlled parameters are estimated, and the percentage of defects of the investigated products is determined. For the stability assessment of the manufacturing process of epoxy resin ED-20, Shewhart control charts were selected, using quantitative data: charts of individual values X and moving range R. Pareto charts are used to identify the causes that affect low dynamic viscosity to the largest extent. To analyze the causes of low values of dynamic viscosity, Ishikawa diagrams were used, which show the most typical factors behind the variability of the results of the process. To resolve the problem, it is recommended to modify the polymer composition with carbon fullerenes and to use the developed method for the production of
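
    For reference, the control limits of the individuals (X) chart mentioned above are conventionally set from the average moving range with the constant 2.66 (that is, 3/d2 for subgroups of two). The sketch and the viscosity values are illustrative, not the study's data.

        import numpy as np

        def individuals_chart_limits(x: np.ndarray):
            """Shewhart individuals chart: center line at the mean, control
            limits at mean +/- 2.66 * average moving range."""
            mr_bar = np.abs(np.diff(x)).mean()     # average moving range
            center = x.mean()
            lcl, ucl = center - 2.66 * mr_bar, center + 2.66 * mr_bar
            out_of_control = np.where((x < lcl) | (x > ucl))[0]
            return center, lcl, ucl, out_of_control

        viscosity = np.array([18.1, 18.4, 17.9, 18.2, 16.9, 18.3, 18.0])  # made-up
        print(individuals_chart_limits(viscosity))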

  16. Discounted Cash Flow and Modern Asset Pricing Methods - Project Selection and Policy Implications

    Emhjellen, Magne; Alaouze, Chris M.

    2002-07-01

    We examine the differences in the net present values (NPV's) of North Sea oil projects obtained using the Weighted Average Cost of Capital (WACC) and a Modern Asset Pricing (MAP) method which involves the separate discounting of project cash flow components. NPV differences of more than $10m were found for some oil projects. Thus, the choice of valuation method will affect the development decisions of oil companies. The results of the MAP method are very sensitive to the choice of parameter values for the stochastic process used to model oil prices. Further research is recommended before the MAP method is used as the sole valuation model. (author)

  17. Discounted Cash Flow and Modern Asset Pricing Methods - Project Selection and Policy Implications

    Emhjellen, Magne; Alaouze, Chris M.

    2002-01-01

    We examine the differences in the net present values (NPV's) of North Sea oil projects obtained using the Weighted Average Cost of Capital (WACC) and a Modern Asset Pricing (MAP) method which involves the separate discounting of project cash flow components. NPV differences of more than $10m were found for some oil projects. Thus, the choice of valuation method will affect the development decisions of oil companies. The results of the MAP method are very sensitive to the choice of parameter values for the stochastic process used to model oil prices. Further research is recommended before the MAP method is used as the sole valuation model. (author)

  18. A comparison of discounted cashflow and modern asset pricing methods - project selection and policy implications

    Emhjellen, Magne; Alaouze, Chris M.

    2003-01-01

    We examine the differences in the net present values (NPVs) of North Sea oil projects obtained using the weighted average cost of capital and a modern asset pricing (MAP) method which involves the separate discounting of project cashflow components. NPV differences of more than $10 million were found for some oil projects. Thus, the choice of valuation method will affect the development decisions of oil companies and could influence tax policy. The results of the MAP method are very sensitive to the choice of parameter values for the stochastic process used to model oil prices. Further research is recommended before the MAP method is used as the sole valuation model
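
    To make the contrast concrete, a toy sketch of the two valuations follows: the WACC approach discounts the net cash flow at a single rate, while a MAP-style calculation discounts each cash flow component at its own risk-adjusted rate. All rates and cash flows are invented.

        def npv_wacc(net_cashflows, wacc):
            """Single-rate DCF: discount the net cash flow at the WACC."""
            return sum(cf / (1 + wacc) ** t
                       for t, cf in enumerate(net_cashflows, start=1))

        def npv_map(revenues, costs, r_revenue, r_cost):
            """MAP-style DCF: discount each component at its own rate."""
            pv_rev = sum(r / (1 + r_revenue) ** t for t, r in enumerate(revenues, start=1))
            pv_cost = sum(c / (1 + r_cost) ** t for t, c in enumerate(costs, start=1))
            return pv_rev - pv_cost

        rev = [100.0, 110.0, 120.0]          # $m per year, hypothetical
        cost = [60.0, 60.0, 60.0]
        net = [r - c for r, c in zip(rev, cost)]
        print(npv_wacc(net, wacc=0.10))
        print(npv_map(rev, cost, r_revenue=0.12, r_cost=0.05))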

  19. Modern nonparametric, robust and multivariate methods: festschrift in honour of Hannu Oja

    Taskinen, Sara

    2015-01-01

    Written by leading experts in the field, this edited volume brings together the latest findings in the area of nonparametric, robust and multivariate statistical methods. The individual contributions cover a wide variety of topics ranging from univariate nonparametric methods to robust methods for complex data structures. Some examples from statistical signal processing are also given. The volume is dedicated to Hannu Oja on the occasion of his 65th birthday and is intended for researchers as well as PhD students with a good knowledge of statistics.

  1. Statistical error estimation of the Feynman-α method using the bootstrap method

    Endo, Tomohiro; Yamamoto, Akio; Yagi, Takahiro; Pyeon, Cheol Ho

    2016-01-01

    The applicability of the bootstrap method to estimating the statistical error of the Feynman-α method, one of the subcritical measurement techniques based on reactor noise analysis, is investigated. In the Feynman-α method, the statistical error can be estimated simply from multiple measurements of reactor noise; however, this requires additional measurement time to repeat the measurements. Using a resampling technique called the 'bootstrap method', the standard deviation and confidence interval of measurement results obtained by the Feynman-α method can be estimated as the statistical error using only a single measurement of reactor noise. In order to validate the proposed technique, we carried out a passive measurement of reactor noise without any external source, i.e. with only the inherent neutron source from spontaneous fission and (α,n) reactions in the nuclear fuel, at the Kyoto University Criticality Assembly. Through this measurement, it is confirmed that the bootstrap method is applicable for approximately estimating the statistical error of measurement results obtained by the Feynman-α method. (author)
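
    The generic bootstrap step the abstract relies on can be sketched as follows; the Feynman-α estimation itself is not reproduced, and the data below are synthetic stand-ins for values derived from a single reactor-noise measurement.

        import numpy as np

        def bootstrap_error(samples: np.ndarray, stat=np.mean,
                            n_boot: int = 2000, seed: int = 0):
            """Standard deviation and 95% CI of a statistic, obtained by
            resampling one measurement's data with replacement."""
            rng = np.random.default_rng(seed)
            replicas = np.array([
                stat(rng.choice(samples, size=samples.size, replace=True))
                for _ in range(n_boot)
            ])
            return replicas.std(ddof=1), np.percentile(replicas, [2.5, 97.5])

        y = np.random.default_rng(1).normal(0.8, 0.1, size=200)  # synthetic data
        sd, ci = bootstrap_error(y)
        print(sd, ci)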

  2. A statistical method for draft tube pressure pulsation analysis

    Doerfler, P K; Ruchonnet, N

    2012-01-01

    Draft tube pressure pulsation (DTPP) in Francis turbines is composed of various components originating from different physical phenomena. These components may be separated because they differ by their spatial relationships and by their propagation mechanism. The first step of such an analysis is to distinguish between so-called synchronous and asynchronous pulsations; only approximately periodic phenomena can be described in this manner. However, less regular pulsations are always present, and these become important when turbines have to operate in the far off-design range, in particular at very low load. The statistical method described here makes it possible to separate the stochastic (random) component from the two traditional 'regular' components. It works in connection with the standard technique of model testing with several pressure signals measured in the draft tube cone. The difference between the individual signals and the averaged pressure signal, together with the coherence between the individual pressure signals, is used for the analysis. An example reveals that a generalized, non-periodic version of the asynchronous pulsation is important at low load.
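
    The first step of the decomposition is simple enough to sketch: the synchronous component is the instantaneous average over the cone sensors, and each sensor's residual contains the asynchronous and stochastic parts, which the paper then separates using the coherence between sensor pairs (omitted here). Shapes and names are assumptions.

        import numpy as np

        def split_pressure(p: np.ndarray):
            """p has shape (n_sensors, n_samples): draft tube cone pressure signals.
            Returns the synchronous part and the per-sensor residuals."""
            synchronous = p.mean(axis=0)
            residual = p - synchronous        # broadcasts over the sensor axis
            return synchronous, residual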

  3. Information Geometry, Inference Methods and Chaotic Energy Levels Statistics

    Cafaro, Carlo

    2008-01-01

    In this Letter, we propose a novel information-geometric characterization of chaotic (integrable) energy level statistics of a quantum antiferromagnetic Ising spin chain in a tilted (transverse) external magnetic field. Finally, we conjecture our results might find some potential physical applications in quantum energy level statistics.

  4. Statistical methods for decision making in mine action

    Larsen, Jan

    The lecture discusses the basics of statistical decision making in connection with humanitarian mine action. There is special focus on: 1) requirements for mine detection; 2) design and evaluation of mine equipment; 3) performance improvement by statistical learning and information fusion; 4...

  5. Statistics a guide to the use of statistical methods in the physical sciences

    Barlow, Roger J

    1989-01-01

    The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A.C. Phillips Computing for Scienti

  6. Methods of geometric function theory in classical and modern problems for polynomials

    Dubinin, Vladimir N

    2012-01-01

    This paper gives a survey of classical and modern theorems on polynomials, proved using methods of geometric function theory. Most of the paper is devoted to results of the author and his students, established by applying majorization principles for holomorphic functions, the theory of univalent functions, the theory of capacities, and symmetrization. Auxiliary results and the proofs of some of the theorems are presented. Bibliography: 124 titles.

  7. Review of Polynomial Chaos-Based Methods for Uncertainty Quantification in Modern Integrated Circuits

    Arun Kaintura; Tom Dhaene; Domenico Spina

    2018-01-01

    Advances in manufacturing process technology are key ensembles for the production of integrated circuits in the sub-micrometer region. It is of paramount importance to assess the effects of tolerances in the manufacturing process on the performance of modern integrated circuits. The polynomial chaos expansion has emerged as a suitable alternative to standard Monte Carlo-based methods that are accurate, but computationally cumbersome. This paper provides an overview of the most recent developm...
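
    As a pointer to what a polynomial chaos expansion does, the sketch below builds a one-dimensional Hermite expansion of a response of a standard normal input and reads the output mean and variance off the coefficients; this generic textbook construction is not taken from the paper.

        import numpy as np
        from numpy.polynomial.hermite_e import hermegauss, hermeval
        from math import factorial, sqrt, pi

        def pce_coeffs(f, order: int, nquad: int = 40) -> np.ndarray:
            """Coefficients c_k of f(X) = sum_k c_k He_k(X), X ~ N(0, 1),
            via Gauss-HermiteE quadrature (weight exp(-x^2 / 2))."""
            xq, wq = hermegauss(nquad)
            norm = sqrt(2.0 * pi)                      # integral of the weight
            return np.array([
                np.sum(wq * f(xq) * hermeval(xq, [0.0] * k + [1.0]))
                / (norm * factorial(k))
                for k in range(order + 1)
            ])

        f = lambda x: np.exp(0.3 * x)                  # toy response
        c = pce_coeffs(f, order=6)
        mean = c[0]                                    # E[f(X)]
        var = sum(factorial(k) * c[k] ** 2 for k in range(1, c.size))
        print(mean, var)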

  8. The role of classical and modern teaching methods in business education

    Conțu Eleonora Gabriela

    2017-07-01

    Full Text Available Nowadays the training-educational process is a dynamic and complex process which uses both classical and modern teaching methods in order to achieve performance in education. Even though traditional teaching methods have a formal character, the interaction between teacher and students is face-to-face, and therefore students can give immediate feedback. From this point of view, classical teaching methods remain important from time to time. In Romania, in the European context, effective learning strategies represent the key point of the education process. The role of teachers in developing the creativity of students who want to learn in an interactive way is very important, because they should involve students directly in the training-educational process. In this context the educational process must be student-centered, because only in this way are students' critical thinking and creativity developed. We can say that when non-formal and informal learning are combined with formal learning, the aim of pedagogy is accomplished. In the contemporary context, education is regarded as an innovative concept which is used to produce performance at the individual level; also, at the institutional level, education provides support for building strategies that respond to the challenges of the labour market. The paper is based on a qualitative research study, conducted on a sample of 100 people aged between 19 and 23 years old (students at a Business School). The key question raised at this point is: What is the role of classical and modern teaching methods in the training-educational process? The objectives of this study are the following: 1. highlighting the context of higher education in Romania; 2. presenting the role of university strategy in the contemporary context; 3. highlighting the importance of using classical/modern teaching methods in business education; 4. presenting the role of innovation and creativity in business education; 5. presenting the analysis

  9. The application of modern nodal methods to PWR reactor physics analysis

    Knight, M.P.

    1988-06-01

    The objective of this research is to develop efficient computational procedures for PWR reactor calculations, based on modern nodal methods. The analytic nodal method, which is characterised by the use of exact exponential expansions in transverse-integrated equations, is implemented within an existing finite-difference code. This shows considerable accuracy and efficiency on standard benchmark problems, very much in line with existing experience with nodal methods. Assembly powers can be calculated to within 2.0% with just one mesh per assembly. (author)

  10. Robust Control Methods for On-Line Statistical Learning

    Capobianco Enrico

    2001-01-01

    Ensuring that the results of data processing in an experiment are not affected by the presence of outliers is relevant for statistical control and learning studies. Learning schemes should therefore be tested for their capacity to handle outliers in the observed training set, so as to achieve reliable estimates with respect to the crucial bias and variance aspects. We describe possible ways of endowing neural networks with statistically robust properties by defining feasible error criteria. It is convenient to cast neural nets in state-space representations and apply both Kalman filter and stochastic approximation procedures in order to suggest statistically robustified solutions for on-line learning.
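
    A minimal sketch of the robustness idea, not the authors' Kalman-filter scheme: an on-line stochastic-gradient learner whose error criterion is the Huber loss, which bounds the influence of outlying observations, compared against the ordinary squared-error criterion. All data and parameter values below are synthetic.

      import numpy as np

      rng = np.random.default_rng(0)

      def huber_grad(r, delta=1.0):
          # Gradient of the Huber loss with respect to the residual r:
          # quadratic near zero, linear (bounded) in the tails.
          return np.where(np.abs(r) <= delta, r, delta * np.sign(r))

      w_sq, w_hu = np.zeros(2), np.zeros(2)
      lr = 0.02
      for _ in range(5000):
          x = np.array([1.0, rng.normal()])           # bias term + one feature
          y = 2.0 - 3.0 * x[1] + rng.normal(0, 0.1)   # true model: w = (2, -3)
          if rng.random() < 0.05:                     # 5% gross outliers
              y += rng.normal(0, 20)
          w_sq -= lr * (w_sq @ x - y) * x             # squared-error update
          w_hu -= lr * huber_grad(w_hu @ x - y) * x   # robust Huber update

      print("squared-error estimate:", w_sq)          # degraded by outliers
      print("Huber estimate:        ", w_hu)          # close to (2, -3)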

  11. Statistics and scientific method: an introduction for students and researchers

    Diggle, Peter; Chetwynd, Amanda

    2011-01-01

    "Most introductory statistics text-books are written either in a highly mathematical style for an intended readership of mathematics undergraduate students, or in a recipe-book style for an intended...

  12. Using Statistical Process Control Methods to Classify Pilot Mental Workloads

    Kudo, Terence

    2001-01-01

    .... These include cardiac, ocular, respiratory, and brain activity measures. The focus of this effort is to apply statistical process control methodology on different psychophysiological features in an attempt to classify pilot mental workload...

  13. Sensory evaluation of food: statistical methods and procedures

    O'Mahony, Michael

    1986-01-01

    The aim of this book is to provide basic knowledge of the logic and computation of statistics for the sensory evaluation of food, or for other forms of sensory measurement encountered in, say, psychophysics...

  14. Highly Robust Statistical Methods in Medical Image Analysis

    Kalina, Jan

    2012-01-01

    Roč. 32, č. 2 (2012), s. 3-16 ISSN 0208-5216 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : robust statistics * classification * faces * robust image analysis * forensic science Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.208, year: 2012 http://www.ibib.waw.pl/bbe/bbefulltext/BBE_32_2_003_FT.pdf

  15. Predictors of modern contraceptive methods use among married women of reproductive age groups in Western Ethiopia: a community based cross-sectional study.

    Tekelab, Tesfalidet; Melka, Alemu Sufa; Wirtu, Desalegn

    2015-07-17

    In Ethiopia, the prevalence of modern contraceptive use is very low (27%) and the percentage of women with unmet need for family planning is 25%. The current study identified factors associated with the utilization of modern contraceptive methods among married women in Western Ethiopia. A community based, cross-sectional study was employed from April 10 to April 25, 2014, among married women of reproductive age in Nekemte Town. A multi-stage sampling procedure was used to select 1003 study participants. A pretested structured questionnaire was used to collect data, and data collectors who had completed high school were involved in the data collection process. Bivariate and multivariable logistic regression models were fitted, and statistical significance was determined at the 95% confidence level. The overall utilization rate of modern contraceptives in this study was 71.9%. The most commonly used form of modern contraception was injectables (60.3%). Age (AOR = 2.00, 95% CI = 1.35-2.98), women's educational level (AOR = 2.50, 95% CI = 1.62-3.84), monthly income (AOR = 2.26, 95% CI = 1.24-4.10), respondent's fertility (AOR = 2.60, 95% CI = 1.48-4.56), fertility-related decision-making (AOR = 3.70, 95% CI = 2.45-5.58), and owning a radio (AOR = 1.93, 95% CI = 1.37-2.71) showed significant positive associations with the utilization of modern contraceptive methods. The findings showed that women's empowerment, fertility-related discussions among couples, and the availability of media were important factors that influenced the use of modern contraceptives. Thus, policymakers and implementers should work on those factors to increase the utilization of modern contraceptive methods.
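
    The analysis pattern described here, multivariable logistic regression reported as adjusted odds ratios (AOR) with 95% confidence intervals, can be sketched as follows; the data are synthetic and the variable names (educated, has_radio) are hypothetical stand-ins for the study's predictors.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 1003                                   # sample size used in the study
      df = pd.DataFrame({
          "educated":  rng.integers(0, 2, n),    # hypothetical binary predictors
          "has_radio": rng.integers(0, 2, n),
      })
      logit = -0.5 + 0.9 * df["educated"] + 0.7 * df["has_radio"]
      df["uses_modern"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

      X = sm.add_constant(df[["educated", "has_radio"]].astype(float))
      fit = sm.Logit(df["uses_modern"], X).fit(disp=0)

      aor = np.exp(fit.params)                   # adjusted odds ratios
      ci = np.exp(fit.conf_int())                # 95% confidence limits
      print(pd.concat([aor.rename("AOR"),
                       ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))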

  16. Development of new methods in modern selective organic synthesis: preparation of functionalized molecules with atomic precision

    Ananikov, V P; Khemchyan, L L; Ivanova, Yu V; Dilman, A D; Levin, V V; Bukhtiyarov, V I; Sorokin, A M; Prosvirin, I P; Romanenko, A V; Simonov, P A; Vatsadze, S Z; Medved'ko, A V; Nuriev, V N; Nenajdenko, V G; Shmatova, O I; Muzalevskiy, V M; Koptyug, I V; Kovtunov, K V; Zhivonitko, V V; Likholobov, V A

    2014-01-01

    The challenges of modern society and the growing demand of high-technology sectors of industrial production bring about a new phase in the development of organic synthesis. A cutting edge of modern synthetic methods is the introduction of functional groups and more complex structural units into organic molecules with unprecedented control over the course of the chemical transformation. Analysis of the state-of-the-art achievements in selective organic synthesis indicates the appearance of a new trend: the synthesis of organic molecules, biologically active compounds, pharmaceutical substances and smart materials with absolute selectivity. The most advanced approaches to organic synthesis anticipated in the near future can be defined as 'atomic precision' in chemical reactions. The present review considers selective methods of organic synthesis suitable for the transformation of complex functionalized molecules under mild conditions. Selected key trends in modern organic synthesis are considered, including the preparation of organofluorine compounds, catalytic cross-coupling and oxidative cross-coupling reactions, atom-economic addition reactions, metathesis processes, oxidation and reduction reactions, synthesis of heterocyclic compounds, design of new homogeneous and heterogeneous catalytic systems, application of photocatalysis, scaling up of synthetic procedures to the industrial level and development of new approaches to the investigation of mechanisms of catalytic reactions. The bibliography includes 840 references.

  17. Modern design methods of hydraulic machines according to a quality system; Moderne tecniche di progettazione delle macchine idrauliche a garanzia della qualita`

    Rossi, G.; Romolotti, F.; Lazzaro, B. [Voith Riva Hydr s.p.a., Milan (Italy)

    1998-03-01

    The article underlines how the adoption of a Quality System in line with the ISO 9001 standard is an essential instrument for safely managing the complexity of the calculation, control and organization methods connected with the unavoidable sharing out of the competences relevant to the design, construction and installation of modern hydraulic machines.

  18. Modern concepts of cost accounting: A review of the ABC method specific features

    Trklja Radmila

    2014-01-01

    New business conditions, in which turbulent changes in the environment are extremely obvious, demand, much more than before, relevant and reliable information as essential support for management at all stages of decision-making processes. In countries with developed, competitive market economies, new approaches, philosophies, concepts and techniques in the field of cost accounting appear. The development of high-technology businesses and the globalisation of business raise the question of the quality of accounting information obtained using traditional methods of cost accounting, and make it necessary to change the concept of establishing product costs. Accordingly, management accounting should ensure informational support for managing businesses based on customers' demands, internal processes, continuous business improvement, etc. This is only possible with the application of modern concepts of cost accounting, which will ensure efficient cost management and business management in modern business conditions.

  19. Improving Quality in Teaching Statistics Concepts Using Modern Visualization: The Design and Use of the Flash Application on Pocket PCs

    Vaughn, Brandon K.; Wang, Pei-Yu

    2009-01-01

    The emergence of technology has led to numerous changes in mathematical and statistical teaching and learning which has improved the quality of instruction and teacher/student interactions. The teaching of statistics, for example, has shifted from mathematical calculations to higher level cognitive abilities such as reasoning, interpretation, and…

  20. Some considerations about literary analysis modern methods from a didactic perspective

    Marialina Ana García Escobio; Moraima Pérez Barrera; María del Carmen Miló Anillo

    2016-01-01

    This article offers a close look at modern methods of literary analysis, taking as its starting point what the teaching of literature, in an enjoyable context, should accomplish in the study of the literary work and in the processes of reception and aesthetic judgement, as well as the application of the aforementioned methods in the attempt to bring the student to a rational position while, at the same time, feeling like the creator and co-author of an event to be lived through a special kind of reading; herein lies the importance of a system of methodological work that prepares professors to contribute, from each of their classes, to the development of the students' skills in literary analysis.

  1. Cluster size statistic and cluster mass statistic: two novel methods for identifying changes in functional connectivity between groups or conditions.

    Ing, Alex; Schwarzbauer, Christian

    2014-01-01

    Functional connectivity has become an increasingly important area of research in recent years. At a typical spatial resolution, approximately 300 million connections link each voxel in the brain with every other. This pattern of connectivity is known as the functional connectome. Connectivity is often compared between experimental groups and conditions. Standard methods used to control the type 1 error rate are likely to be insensitive when comparisons are carried out across the whole connectome, due to the huge number of statistical tests involved. To address this problem, two new cluster based methods, the cluster size statistic (CSS) and the cluster mass statistic (CMS), are introduced to control the family wise error rate across all connectivity values. These methods operate within a statistical framework similar to the cluster based methods used in conventional task based fMRI. Both methods are data driven, permutation based and require minimal statistical assumptions. Here, the performance of each procedure is evaluated in a receiver operator characteristic (ROC) analysis, utilising a simulated dataset. The relative sensitivity of each method is also tested on real data: BOLD (blood oxygen level dependent) fMRI scans were carried out on twelve subjects under normal conditions and during the hypercapnic state (induced through the inhalation of 6% CO2 in 21% O2 and 73% N2). Both CSS and CMS detected significant changes in connectivity between normal and hypercapnic states. A family wise error correction carried out at the individual connection level exhibited no significant changes in connectivity.
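
    A toy version of the permutation logic behind CSS/CMS, reduced to a one-dimensional statistic map rather than a connectome; the threshold, planted effect and sample size are invented for illustration. The family-wise error correction comes from the permutation distribution of the maximum cluster mass.

      import numpy as np
      from scipy import ndimage, stats

      rng = np.random.default_rng(2)
      n_sub, n_pos = 12, 200
      data = rng.normal(0, 1, (n_sub, n_pos))
      data[:, 80:100] += 0.9                     # planted group effect

      def max_cluster_mass(d, thresh=2.0):
          # Cluster mass: sum of supra-threshold t values within each cluster.
          t = stats.ttest_1samp(d, 0, axis=0).statistic
          labels, n = ndimage.label(t > thresh)
          if n == 0:
              return 0.0
          return ndimage.sum(t, labels, index=np.arange(1, n + 1)).max()

      obs = max_cluster_mass(data)
      null = np.array([max_cluster_mass(data * rng.choice([-1.0, 1.0], (n_sub, 1)))
                       for _ in range(1000)])    # sign-flip permutations
      print(f"max cluster mass = {obs:.1f}, FWE-corrected p = {(null >= obs).mean():.3f}")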

  2. Statistical methods and applications from a historical perspective selected issues

    Mignani, Stefania

    2014-01-01

    The book showcases a selection of peer-reviewed papers, the preliminary versions of which were presented at a conference held 11-13 June 2011 in Bologna and organized jointly by the Italian Statistical Society (SIS), the National Institute of Statistics (ISTAT) and the Bank of Italy. The theme of the conference was "Statistics in the 150 years of the Unification of Italy." The celebration of the anniversary of Italian unification provided the opportunity to examine and discuss the methodological aspects and applications from a historical perspective and both from a national and international point of view. The critical discussion on the issues of the past has made it possible to focus on recent advances, considering the studies of socio-economic and demographic changes in European countries.

  3. Statistical methods to evaluate thermoluminescence ionizing radiation dosimetry data

    Segre, Nadia; Matoso, Erika; Fagundes, Rosane Correa

    2011-01-01

    Ionizing radiation levels, evaluated through the exposure of CaF2:Dy thermoluminescence dosimeters (TLD-200), have been monitored at Centro Experimental Aramar (CEA), located at Ipero in Sao Paulo state, Brazil, since 1991, resulting in a large number of measurements up to 2009 (more than 2,000). The volume of data, together with the dispersion of the measurements (since every process has deviation), reinforces the need for statistical tools to evaluate the results, a procedure also imposed by the Brazilian Standard CNEN-NN-3.01/PR-3.01-008, which regulates radiometric environmental monitoring. Thermoluminescence ionizing radiation dosimetry data are statistically compared in order to evaluate the potential environmental impact of CEA's activities. The statistical tools discussed in this work are box plots, control charts and analysis of variance. (author)
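
    One of the tools mentioned, the control chart, reduces to simple arithmetic; the sketch below computes individuals-chart limits (centre line plus/minus 3 sigma, with sigma estimated from the average moving range) for synthetic quarterly dose readings. The numbers are invented, not CEA data.

      import numpy as np

      rng = np.random.default_rng(3)
      doses = rng.normal(0.25, 0.03, 72)     # synthetic quarterly doses (mSv)

      # Individuals (I) chart: sigma from the average moving range, d2 = 1.128.
      sigma_hat = np.abs(np.diff(doses)).mean() / 1.128
      center = doses.mean()
      ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

      flagged = np.where((doses > ucl) | (doses < lcl))[0]
      print(f"CL={center:.3f}  LCL={lcl:.3f}  UCL={ucl:.3f}  flagged: {flagged}")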

  4. Statistical methods for quantitative mass spectrometry proteomic experiments with labeling

    Oberg Ann L

    2012-11-01

    Mass Spectrometry utilizing labeling allows multiple specimens to be subjected to mass spectrometry simultaneously. As a result, between-experiment variability is reduced. Here we describe use of fundamental concepts of statistical experimental design in the labeling framework in order to minimize variability and avoid biases. We demonstrate how to export data in the format that is most efficient for statistical analysis. We demonstrate how to assess the need for normalization, perform normalization, and check whether it worked. We describe how to build a model explaining the observed values and test for differential protein abundance along with descriptive statistics and measures of reliability of the findings. Concepts are illustrated through the use of three case studies utilizing the iTRAQ 4-plex labeling protocol.
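
    A compressed sketch of the workflow described (assess the need for normalization, normalize, then test per protein for differential abundance), on simulated log2 reporter-ion intensities; the channel layout and effect sizes are made up, and a plain two-sample t test stands in for the article's full linear model.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      n_prot = 500
      # 4-plex: channels 0,1 = control, 2,3 = treated; add channel-specific bias.
      y = rng.normal(20, 2, (n_prot, 4)) + np.array([0.0, 0.4, -0.3, 0.2])
      y[:25, 2:] += 1.0                          # 25 truly changed proteins

      # Assess the need for normalization (channel medians differ), then align them.
      print("channel medians before:", np.median(y, axis=0).round(2))
      y_norm = y - np.median(y, axis=0) + np.median(y)

      # Per-protein test for differential abundance on normalized values.
      t, p = stats.ttest_ind(y_norm[:, 2:], y_norm[:, :2], axis=1)
      print("proteins with p < 0.01:", int((p < 0.01).sum()))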

  6. Adaptive Maneuvering Frequency Method of Current Statistical Model

    Wei Sun; Yongjian Yang

    2017-01-01

    The current statistical model (CSM) has a good performance in maneuvering target tracking. However, a fixed maneuvering frequency will deteriorate the tracking results, with serious dynamic delay, slow convergence and limited precision when the Kalman filter (KF) algorithm is used. In this study, a new current statistical model and a new Kalman filter are proposed to improve the performance of maneuvering target tracking. The new model, which employs an innovation-dominated subjection function to adaptively adjust the maneuvering frequency, performs better in step-maneuvering target tracking, although a fluctuation phenomenon appears. To address this problem, a new adaptive fading Kalman filter is proposed as well. In the new Kalman filter, the prediction values are amended in time by setting judgment and amendment rules, so that the tracking precision and the fluctuation phenomenon of the new current statistical model are improved. Simulation results indicate the effectiveness of the new algorithm and its practical guiding significance.
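
    The fading idea can be sketched generically: a constant-velocity Kalman filter whose prediction covariance is inflated by a factor lambda >= 1 whenever the innovation is larger than its nominal covariance predicts. This is a common textbook heuristic, not the paper's exact subjection function or amendment rules; all model values are invented.

      import numpy as np

      rng = np.random.default_rng(5)
      dt = 1.0
      F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity model
      H = np.array([[1.0, 0.0]])
      Q, R = 0.01 * np.eye(2), np.array([[1.0]])

      x, P, truth = np.zeros(2), np.eye(2), np.zeros(2)
      for k in range(60):
          if k == 30:
              truth[1] += 5.0                    # sudden step manoeuvre
          truth = F @ truth
          z = H @ truth + rng.normal(0.0, 1.0, 1)

          x_pred = F @ x
          S = H @ (F @ P @ F.T + Q) @ H.T + R
          nu = z - H @ x_pred                    # innovation

          # Fading factor: inflate the predicted covariance when the
          # normalized innovation exceeds its expected size.
          lam = max(1.0, float(nu @ np.linalg.solve(S, nu)))
          P_pred = lam * (F @ P @ F.T) + Q

          K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
          x = x_pred + K @ nu
          P = (np.eye(2) - K @ H) @ P_pred

      print("estimate:", x.round(2), " truth:", truth.round(2))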

  7. Assessment of modern spectral analysis methods to improve wavenumber resolution of F-K spectra

    Shirley, T.E.; Laster, S.J.; Meek, R.A.

    1987-01-01

    The improvement in wavenumber spectra obtained by using high resolution spectral estimators is examined. Three modern spectral estimators were tested, namely the Autoregressive/Maximum Entropy (AR/ME) method, the Extended Prony method, and an eigenstructure method. They were combined with the conventional Fourier method by first transforming each trace with a Fast Fourier Transform (FFT). A high resolution spectral estimator was applied to the resulting complex spatial sequence for each frequency. The collection of wavenumber spectra thus computed comprises a hybrid f-k spectrum with high wavenumber resolution and less spectral ringing. Synthetic and real data records containing 25 traces were analyzed by using the hybrid f-k method. The results show an FFT-AR/ME f-k spectrum has noticeably better wavenumber resolution and more spectral dynamic range than conventional spectra when the number of channels is small. The observed improvement suggests the hybrid technique is potentially valuable in seismic data analysis
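
    The hybrid recipe, FFT over time followed by a high-resolution estimate over the spatial sequence at each frequency, can be sketched with a hand-rolled complex Burg (AR/ME) estimator. The geometry, noise level and AR order below are invented, and the Burg routine is a textbook implementation rather than the authors' code.

      import numpy as np

      def burg_complex(x, order):
          # Burg AR coefficients and residual power for a complex sequence.
          f = np.asarray(x, complex).copy(); b = f.copy()
          a = np.array([1.0 + 0j]); E = np.mean(np.abs(f) ** 2)
          for _ in range(order):
              ff, bb = f[1:], b[:-1]
              k = -2 * np.sum(ff * bb.conj()) / (np.sum(np.abs(ff)**2) + np.sum(np.abs(bb)**2))
              a_pad = np.r_[a, 0.0]
              a = a_pad + k * a_pad[::-1].conj()
              f, b = ff + k * bb, bb + k.conj() * ff
              E *= 1.0 - np.abs(k) ** 2
          return a, E

      rng = np.random.default_rng(6)
      n_tr, n_t, dx, dt = 25, 256, 10.0, 0.004       # 25 traces, as in the paper
      t, xpos = np.arange(n_t) * dt, np.arange(n_tr) * dx
      f0, slow = 25.0, 0.001                         # 25 Hz event, 1 ms/m moveout
      data = np.cos(2 * np.pi * f0 * (t[None, :] - slow * xpos[:, None]))
      data += 0.5 * rng.normal(size=(n_tr, n_t))

      spec = np.fft.rfft(data, axis=1)               # step 1: FFT each trace
      i_f = np.argmin(np.abs(np.fft.rfftfreq(n_t, dt) - f0))
      a, E = burg_complex(spec[:, i_f], order=6)     # step 2: AR/ME across channels

      kx = np.linspace(-0.5 / dx, 0.5 / dx, 1024)    # cycles per metre
      A = np.exp(-2j * np.pi * np.outer(kx * dx, np.arange(a.size))) @ a
      k_peak = kx[np.argmax(E / np.abs(A) ** 2)]
      print(f"|k| at spectral peak {abs(k_peak):.4f} c/m; expected {f0 * slow:.4f}")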

  8. Appropriate statistical methods are required to assess diagnostic tests for replacement, add-on, and triage

    Hayen, Andrew; Macaskill, Petra; Irwig, Les; Bossuyt, Patrick

    2010-01-01

    To explain which measures of accuracy and which statistical methods should be used in studies to assess the value of a new binary test as a replacement test, an add-on test, or a triage test. Selection and explanation of statistical methods, illustrated with examples. Statistical methods for…

  9. Debating Curricular Strategies for Teaching Statistics and Research Methods: What Does the Current Evidence Suggest?

    Barron, Kenneth E.; Apple, Kevin J.

    2014-01-01

    Coursework in statistics and research methods is a core requirement in most undergraduate psychology programs. However, is there an optimal way to structure and sequence methodology courses to facilitate student learning? For example, should statistics be required before research methods, should research methods be required before statistics, or…

  10. Fuzzy comprehensive evaluation method of F statistics weighting in ...

    In order to rapidly identify the source of water inrush in coal mine, and provide the theoretical basis for mine water damage prevention and control, fuzzy comprehensive evaluation model was established. The F statistics of water samples was normalized as the weight of fuzzy comprehensive evaluation for determining the ...

  11. Statistical methods for decision making in mine action

    Larsen, Jan

    The design and evaluation of mine clearance equipment – the problem of reliability; detection probability – tossing a coin; requirements in mine action; detection probability and confidence in MA; using statistics in area reduction; improving performance by information fusion and combination…

  12. Statistical methods of combining information: Applications to sensor data fusion

    Burr, T.

    1996-12-31

    This paper reviews some statistical approaches to combining information from multiple sources. Promising new approaches will be described, and potential applications to combining not-so-different data sources such as sensor data will be discussed. Experiences with one real data set are described.

  13. Effective viscosity of dispersions approached by a statistical continuum method

    Mellema, J.; Willemse, M.W.M.

    1983-01-01

    The problem of the determination of the effective viscosity of disperse systems (emulsions, suspensions) is considered. On the basis of the formal solution of the equations governing creeping flow in a statistically homogeneous dispersion, the effective viscosity is expressed in a series expansion

  14. Grassmann methods in lattice field theory and statistical mechanics

    Bilgici, E.; Gattringer, C.; Huber, P.

    2006-01-01

    In two dimensions, models of loops can be represented as simple Grassmann integrals. In our work we explore the generalization of these techniques to lattice field theories and statistical mechanics systems in three and four dimensions. We discuss possible strategies and applications for representations of loop and surface models as Grassmann integrals. (author)

  15. Critical Realism and Statistical Methods--A Response to Nash

    Scott, David

    2007-01-01

    This article offers a defence of critical realism in the face of objections Nash (2005) makes to it in a recent edition of this journal. It is argued that critical and scientific realisms are closely related and that both are opposed to statistical positivism. However, the suggestion is made that scientific realism retains (from statistical…

  16. Obtaining the lattice energy of the anthracene crystal by modern yet affordable first-principles methods

    Sancho-García, J. C.; Aragó, J.; Ortí, E.; Olivier, Y.

    2013-05-01

    The non-covalent interactions in organic molecules are known to drive their self-assembly into molecular crystals. We compare, in the case of anthracene and against the experimental (electronic-only) sublimation energy, how modern quantum-chemical methods are able to calculate this cohesive energy, taking into account all the interactions between the dimers occurring in both the first and second shells. These include both O(N^6)- and O(N^5)-scaling methods, Local Pair Natural Orbital-parameterized Coupled-Cluster Singles and Doubles and Spin-Component-Scaled Møller-Plesset perturbation theory at second order, respectively, as well as the most modern family of density functionals: double-hybrid expressions in several variants (B2-PLYP, mPW2-PLYP, PWPB95) with customized dispersion corrections (-D3 and -NL). All in all, it is shown that these methods behave very accurately, producing errors in the 1-2 kJ/mol range with respect to the experimental value, taking into account the experimental uncertainty. These methods are thus confirmed as excellent tools for studying all kinds of interactions in chemical systems.

  17. Statistical methods for data analysis in particle physics

    Lista, Luca

    2017-01-01

    This concise set of course-based notes provides the reader with the main concepts and tools needed to perform statistical analyses of experimental data, in particular in the field of high-energy physics (HEP). First, the book provides an introduction to probability theory and basic statistics, mainly intended as a refresher from readers’ advanced undergraduate studies, but also to help them clearly distinguish between the Frequentist and Bayesian approaches and interpretations in subsequent applications. More advanced concepts and applications are gradually introduced, culminating in the chapter on both discoveries and upper limits, as many applications in HEP concern hypothesis testing, where the main goal is often to provide better and better limits so as to eventually be able to distinguish between competing hypotheses, or to rule out some of them altogether. Many worked-out examples will help newcomers to the field and graduate students alike understand the pitfalls involved in applying theoretical co...

  18. Reactor noise analysis by statistical pattern recognition methods

    Howington, L.C.; Gonzalez, R.C.

    1976-01-01

    A multivariate statistical pattern recognition system for reactor noise analysis is presented. The basis of the system is a transformation for decoupling correlated variables and algorithms for inferring probability density functions. The system is adaptable to a variety of statistical properties of the data, and it has learning, tracking, updating, and data compacting capabilities. System design emphasizes control of the false-alarm rate. Its abilities to learn normal patterns, to recognize deviations from these patterns, and to reduce the dimensionality of data with minimum error were evaluated by experiments at the Oak Ridge National Laboratory (ORNL) High-Flux Isotope Reactor. Power perturbations of less than 0.1 percent of the mean value in selected frequency ranges were detected by the pattern recognition system

  19. Statistical methods for data analysis in particle physics

    Lista, Luca

    2015-01-01

    This concise set of course-based notes provides the reader with the main concepts and tools to perform statistical analysis of experimental data, in particular in the field of high-energy physics (HEP). First, an introduction to probability theory and basic statistics is given, mainly as a reminder from advanced undergraduate studies, but also with a view to clearly distinguishing the Frequentist and Bayesian approaches and interpretations in subsequent applications. More advanced concepts and applications are gradually introduced, culminating in the chapter on upper limits, as many applications in HEP concern hypothesis testing, where often the main goal is to provide better and better limits so as eventually to be able to distinguish between competing hypotheses or to rule out some of them altogether. Many worked examples will help newcomers to the field and graduate students to understand the pitfalls in applying theoretical concepts to actual data.

  20. METHODOLOGICAL PRINCIPLES AND METHODS OF TERMS OF TRADE STATISTICAL EVALUATION

    N. Kovtun

    2014-09-01

    The paper studies the methodological principles and guidance for the statistical evaluation of terms of trade under the United Nations classification model, the Harmonized Commodity Description and Coding System (HS). The practical implementation of the proposed three-stage model of index analysis and estimation of terms of trade is carried out for Ukraine's traded commodity groups for the period 2011-2012.

  1. Improving the speed of AFM by mechatronic design and modern control methods

    Schitter, Georg

    2009-01-01

    In Atomic Force Microscopy (AFM) high-performance and high-precision control of the AFM scanner and of the imaging forces is crucial. Particularly at high imaging speeds the dynamic behaviour of the scanner may cause imaging artifacts and limit the maximum imaging rate. This contribution discusses and presents recent improvements in AFM instrumentation for faster imaging by means of mechatronic design and modern control engineering methods. Combining these improvements enables AFM imaging more than two orders of magnitude faster than conventional AFMs. (orig.)

  3. A review of modern instrumental methods of elemental analysis of petroleum related material. Part 2

    Nadkarni, R.A.

    1991-01-01

    In this paper a review is presented of the state of the art in elemental analysis of petroleum-related materials (crude oil, gasoline, additives, and lubricants) using modern instrumental analysis techniques. The major instrumental techniques used for elemental analysis of petroleum products include atomic absorption spectrometry (both with flame and with graphite furnace atomizer), inductively coupled plasma atomic emission spectrometry, ion chromatography, microelemental methods, neutron activation, spark source mass spectrometry, and x-ray fluorescence. Each of these techniques is compared for its advantages, disadvantages, and typical applications in the petroleum field

  4. When the Ostrich-Algorithm Fails: Blanking Method Affects Spike Train Statistics

    Kevin Joseph

    2018-04-01

    Modern electroceuticals are bound to employ electrical high-frequency (130–180 Hz) stimulation carried out under closed-loop control, most prominently in the case of movement disorders. However, particular challenges are faced when electrical recordings of neuronal tissue are carried out during high-frequency electrical stimulation, both in-vivo and in-vitro. This stimulation produces undesired artifacts and can render the recorded signal only partially useful. The extent of these artifacts is often reduced by temporarily grounding the recording input during stimulation pulses. In the following study, we quantify the effects of this method, "blanking," on the spike count and spike train statistics. Starting from a theoretical standpoint, we calculate a loss in the absolute number of action potentials, depending on: width of the blanking window, frequency of stimulation, and intrinsic neuronal activity. These calculations were then corroborated by actual high signal-to-noise ratio (SNR) single cell recordings. We state that, for clinically relevant frequencies of 130 Hz (used for movement disorders) and realistic blanking windows of 2 ms, up to 27% of actually existing spikes are lost. We strongly advise cautioned use of the blanking method when spike rate quantification is attempted. Impact statement: Blanking (artifact removal by temporarily grounding the input), depending on recording parameters, can lead to significant spike loss. Very careful use of blanking circuits is advised.
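
    The headline figure is easy to reproduce from first principles: with stimulation frequency f and blanking window w, a fraction of roughly f × w of the recording is blind, so at 130 Hz and 2 ms about 26% of spikes should vanish, consistent with the reported "up to 27%". A small Monte Carlo check under a Poisson spiking assumption (all parameter values chosen for illustration):

      import numpy as np

      rng = np.random.default_rng(7)
      f_stim, blank = 130.0, 2e-3            # Hz, s (clinically relevant values)
      dur, rate = 600.0, 20.0                # recording length (s), spike rate (Hz)

      spikes = np.sort(rng.uniform(0, dur, rng.poisson(rate * dur)))

      # A spike is lost if it falls inside [t_pulse, t_pulse + blank) of any pulse.
      phase = spikes % (1.0 / f_stim)        # time since the preceding pulse
      lost = phase < blank
      print(f"simulated loss {lost.mean():.1%} vs analytic f*w = {f_stim*blank:.1%}")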

  6. HPLC method validation for modernization of the tetracycline hydrochloride capsule USP monograph

    Emad M. Hussien

    2014-12-01

    This paper is a continuation of our previous work aimed at the development and validation of a reversed-phase HPLC method for the modernization of tetracycline-related USP monographs and the USP general chapter. Previous results showed that the method is accurate and precise for the assay of tetracycline hydrochloride and the limit of the 4-epianhydrotetracycline impurity in the drug substance and oral suspension monographs. The aim of the current paper is to examine the feasibility of the method for modernization of the USP tetracycline hydrochloride capsule monograph. Specificity, linearity, accuracy and precision were examined for the tetracycline hydrochloride assay and the 4-epianhydrotetracycline limit. The method was linear over the concentration range from 80% to 160% (r > 0.9998) of the assay concentration (0.1 mg/mL) for tetracycline hydrochloride, and from 50% to 150% (r > 0.997) of the acceptance criterion specified in the tetracycline hydrochloride capsule monograph for 4-epianhydrotetracycline (NMT 3.0%). The recovery at three concentration levels for the tetracycline hydrochloride assay was between 99% and 101%, and the RSD from six preparations at the concentration 0.1 mg/mL was less than 0.6%. The recovery for the 4-epianhydrotetracycline limit procedure over the concentration range from 50% to 150% was between 96% and 102%, with RSD less than 5%. The results met the specified acceptance criteria.
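
    The validation arithmetic itself (correlation coefficient for linearity, percent recovery for accuracy, RSD for precision) is compact; the sketch below runs it on invented calibration data, with acceptance limits mirroring those quoted in the abstract.

      import numpy as np

      conc = np.array([0.08, 0.10, 0.12, 0.14, 0.16])           # mg/mL, 80-160%
      area = np.array([802.0, 1001.0, 1195.0, 1404.0, 1602.0])  # hypothetical areas

      slope, intercept = np.polyfit(conc, area, 1)
      r = np.corrcoef(conc, area)[0, 1]                         # want r > 0.9998

      recovery = 100 * ((area - intercept) / slope) / conc      # accuracy, %
      reps = np.array([1003.0, 998.0, 1001.0, 1005.0, 999.0, 1002.0])
      rsd = 100 * reps.std(ddof=1) / reps.mean()                # want RSD < 0.6%

      print(f"r = {r:.5f}")
      print(f"recovery {recovery.min():.1f}-{recovery.max():.1f}%  RSD {rsd:.2f}%")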

  7. The demographic impact and development benefits of meeting demand for family planning with modern contraceptive methods.

    Goodkind, Daniel; Lollock, Lisa; Choi, Yoonjoung; McDevitt, Thomas; West, Loraine

    2018-01-01

    Meeting demand for family planning can facilitate progress towards all major themes of the United Nations Sustainable Development Goals (SDGs): people, planet, prosperity, peace, and partnership. Many policymakers have embraced a benchmark goal that at least 75% of the demand for family planning in all countries be satisfied with modern contraceptive methods by the year 2030. This study examines the demographic impact (and development implications) of achieving the 75% benchmark in 13 developing countries that are expected to be the furthest from achieving that benchmark. Estimation of the demographic impact of achieving the 75% benchmark requires three steps in each country: 1) translate contraceptive prevalence assumptions (with and without intervention) into future fertility levels based on biometric models, 2) incorporate each pair of fertility assumptions into separate population projections, and 3) compare the demographic differences between the two population projections. Data are drawn from the United Nations, the US Census Bureau, and Demographic and Health Surveys. The demographic impact of meeting the 75% benchmark is examined via projected differences in fertility rates (average expected births per woman's reproductive lifetime), total population, growth rates, age structure, and youth dependency. On average, meeting the benchmark would imply a 16 percentage point increase in modern contraceptive prevalence by 2030 and a 20% decline in youth dependency, which portends a potential demographic dividend to spur economic growth. Improvements in meeting the demand for family planning with modern contraceptive methods can bring substantial benefits to developing countries. To our knowledge, this is the first study to show formally how such improvements can alter population size and age structure. Declines in youth dependency portend a demographic dividend, an added bonus to the already well-known benefits of meeting existing demands for family planning.

  8. Modern methods of cost saving of the production activity in construction

    Silka, Dmitriy

    2017-10-01

    Every time the economy faces recession, cost-saving questions acquire increased urgency. This article shows how companies in the construction industry have switched to a new kind of economic relations over recent years. It is noted that the dominant type of economic relations does not allow companies to reorient quickly to the necessary tools in accordance with new requirements of economic activity, so successful experience gained in the new environment is in demand. Cost-saving methods that have been proven in other industries are offered for achieving efficiency and competitiveness. The analysis is performed on the example of the retail sector which, according to authoritative analytical reviews, is extremely innovative at both the local and the world economic level. Among the methods offered, a special place is taken by those based on the unprecedented modern opportunities for communication and information exchange.

  9. Spatial Analysis Along Networks Statistical and Computational Methods

    Okabe, Atsuyuki

    2012-01-01

    In the real world, there are numerous and various events that occur on and alongside networks, including the occurrence of traffic accidents on highways, the location of stores alongside roads, the incidence of crime on streets and the contamination along rivers. In order to carry out analyses of those events, the researcher needs to be familiar with a range of specific techniques. Spatial Analysis Along Networks provides a practical guide to the necessary statistical techniques and their computational implementation. Each chapter illustrates a specific technique, from Stochastic Point Process

  10. Statistical methods for segmentation and classification of images

    Rosholm, Anders

    1997-01-01

    The central matter of the present thesis is Bayesian statistical inference applied to classification of images. An initial review of Markov Random Fields relates to the modeling aspect of the indicated main subject. In that connection, emphasis is put on the relatively unknown sub-class of Pickard... with a Pickard Random Field modeling of a considered (categorical) image phenomenon. An extension of the fast PRF-based classification technique is presented. The modification introduces auto-correlation into the model of the involved noise process, which had previously been assumed independent. The suitability... of the extended model is documented by tests on controlled image data containing auto-correlated noise....

  11. Method of statistical estimation of temperature minimums in binary systems

    Mireev, V.A.; Safonov, V.V.

    1985-01-01

    On the basis of statistical processing of literature data, a technique is developed for evaluating temperature minima on liquidus curves in binary systems, with common-ion chloride systems taken as an example. The systems are formed by 48 chlorides of 45 chemical elements, including alkali, alkaline earth, rare earth and transition metals as well as Cd, In and Th. It is shown that the calculation error in determining minimum melting points depends on the topology of the phase diagram. A comparison of calculated and experimental data for several previously unstudied systems is given.

  12. Trends in statistical methods in articles published in Archives of Plastic Surgery between 2012 and 2017.

    Han, Kyunghwa; Jung, Inkyung

    2018-05-01

    This review article presents an assessment of trends in statistical methods and an evaluation of their appropriateness in articles published in the Archives of Plastic Surgery (APS) from 2012 to 2017. We reviewed 388 original articles published in APS between 2012 and 2017. We categorized the articles that used statistical methods according to the type of statistical method, the number of statistical methods, and the type of statistical software used. We checked whether there were errors in the description of statistical methods and results. A total of 230 articles (59.3%) published in APS between 2012 and 2017 used one or more statistical methods. Within these articles, there were 261 applications of statistical methods with continuous or ordinal outcomes, and 139 applications of statistical methods with categorical outcomes. The Pearson chi-square test (17.4%) and the Mann-Whitney U test (14.4%) were the most frequently used methods. Errors in describing statistical methods and results were found in 133 of the 230 articles (57.8%). Inadequate description of P-values was the most common error (39.1%). Among the 230 articles that used statistical methods, 71.7% provided details about the statistical software programs used for the analyses. SPSS was predominantly used in the articles that presented statistical analyses. We found that the use of statistical methods in APS has increased over the last 6 years. It seems that researchers have been paying more attention to the proper use of statistics in recent years. It is expected that these positive trends will continue in APS.
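
    For readers unfamiliar with the two methods the review found most often, a minimal illustration with invented data follows; note that it prints exact P-values, avoiding the inadequate P-value reporting the review flags as the most common error.

      import numpy as np
      from scipy import stats

      # Pearson chi-square on a hypothetical 2x2 table (complication x technique).
      table = np.array([[12, 30],
                        [25, 40]])
      chi2, p_chi, dof, _ = stats.chi2_contingency(table)

      # Mann-Whitney U on hypothetical ordinal scar scores from two groups.
      a = [2, 3, 3, 4, 5, 2, 3]
      b = [4, 5, 5, 6, 4, 5, 6]
      u, p_mw = stats.mannwhitneyu(a, b, alternative="two-sided")

      print(f"chi2 = {chi2:.2f} (df = {dof}), P = {p_chi:.3f}")
      print(f"U = {u:.1f}, P = {p_mw:.4f}")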

  13. SOME ASPECTS OF THE USE OF MATHEMATICAL-STATISTICAL METHODS IN THE ANALYSIS OF SOCIO-HUMANISTIC TEXTS (keywords: humanities and social text, mathematics, method, statistics, probability)

    Zaira M Alieva

    2016-01-01

    The article analyzes the application of mathematical and statistical methods in the analysis of socio-humanistic texts. It outlines the essence of mathematical and statistical methods and presents examples of their use in the study of humanities and social phenomena. It considers the key issues faced by the expert in applying mathematical-statistical methods in the socio-humanitarian sphere, including the persistent contrast between the socio-humanitarian sciences and mathematics, the complexity of identifying the object that is the bearer of the problem, and the use of a probabilistic approach. Conclusions are drawn from the results of the study.

  14. A statistical comparison of accelerated concrete testing methods

    Denny Meyer

    1997-01-01

    Accelerated curing results, obtained after only 24 hours, are used to predict the 28 day strength of concrete. Various accelerated curing methods are available. Two of these methods are compared in relation to the accuracy of their predictions and the stability of the relationship between their 24 hour and 28 day concrete strengths. The results suggest that Warm Water accelerated curing is preferable to Hot Water accelerated curing of concrete. In addition, some other methods for improving the accuracy of predictions of 28 day strengths are suggested. In particular, the frequency at which it is necessary to recalibrate the prediction equation is considered.
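
    The prediction step amounts to a calibration regression of 28-day strength on the 24-hour accelerated result; a sketch on synthetic strengths follows, with a rough 95% band from the residual scatter (the recalibration question is then about how often this line must be refitted).

      import numpy as np

      rng = np.random.default_rng(8)
      s24 = rng.uniform(8.0, 20.0, 40)                   # MPa, accelerated (24 h)
      s28 = 1.9 * s24 + 5.0 + rng.normal(0, 1.2, 40)     # MPa at 28 days

      b1, b0 = np.polyfit(s24, s28, 1)                   # calibration line
      se = (s28 - (b0 + b1 * s24)).std(ddof=2)           # residual std. error

      new = 15.0                                         # today's 24 h result
      print(f"predicted 28-day strength: {b0 + b1*new:.1f} +/- {2*se:.1f} MPa")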

  15. Evaluation of local corrosion life by statistical method

    Kato, Shunji; Kurosawa, Tatsuo; Takaku, Hiroshi; Kusanagi, Hideo; Hirano, Hideo; Kimura, Hideo; Hide, Koichiro; Kawasaki, Masayuki

    1987-01-01

    In this paper, for the purpose of achieving life extension of light water reactors, we examined the evaluation of local corrosion by statistical methods and its application to nuclear power plant components. There are many examples of evaluating the maximum cracking depth of local corrosion by the doubly exponential distribution, and this evaluation method is well established. However, the evaluation of the service lives of construction materials by statistical methods has not been established. In order to establish service life evaluation by statistical methods, we must strive to collect local corrosion data and to pursue the associated analytical research. (author)

  16. Monitoring the Error Rate of Modern Methods of Construction Based on Wood

    Švajlenka, Jozef; Kozlovská, Mária

    2017-06-01

    A range of new and innovative construction systems currently being developed represent modern methods of construction (MMC), which have the ambition to improve the performance parameters of buildings throughout their life cycle. Regarding the implementation of modern methods of construction in Slovakia, assembled buildings based on wood seem to be the most preferred construction system. The study presented in the paper surveyed already built and occupied wood-based family houses. The residents' attitudes to this type of building are reviewed in the context of the declared design and quality parameters of efficiency and sustainability. The methodology of the research study is based on a socio-economic survey carried out during the years 2015-2017 within the Slovak Republic. Due to the large extent of data collected through the questionnaire, only selected parts of the survey results are evaluated and discussed in the paper. The paper is aimed at evaluating the quality of the buildings as expressed by the users of existing wooden buildings. The research indicates that some defects occur and can be eliminated in the subsequent production process, so production process quality should be improved in future development.

  17. REVEALING OF DEFECTS OF BEARINGS WITH THE HELP OF MODERN METHODS OF CONTROL OF TECHNOLOGICAL EQUIPMENT OF HARDWARE PRODUCTION

    S. M. Piskun

    2010-01-01

    It is shown that the use of modern methods and means of technical diagnostics makes it possible to provide reliable, accident-free operation of equipment and to decrease considerably the labour-intensiveness and duration of repairs and, accordingly, production expenses.

  18. Statistical methods for analysing responses of wildlife to human disturbance.

    Haiganoush K. Preisler; Alan A. Ager; Michael J. Wisdom

    2006-01-01

    1. Off-road recreation is increasing rapidly in many areas of the world, and effects on wildlife can be highly detrimental. Consequently, we have developed methods for studying wildlife responses to off-road recreation with the use of new technologies that allow frequent and accurate monitoring of human-wildlife interactions. To illustrate these methods, we studied the...

  19. Introducing Students to the Application of Statistics and Investigative Methods in Political Science

    Wells, Dominic D.; Nemire, Nathan A.

    2017-01-01

    This exercise introduces students to the application of statistics and its investigative methods in political science. It helps students gain a better understanding and a greater appreciation of statistics through a real world application.

  20. Use of Mathematical Methods of Statistics for Analyzing Engine Characteristics

    Aivaras Jasilionis

    2012-11-01

    For the development of new models, automobile manufacturers are trying to come up with optimal software for engine control in all movement modes. However, in this case, a vehicle cannot reach outstanding characteristics in any of them. This is the main reason why modifications of engine control software, used to adapt the vehicle to the driver's needs, are becoming more and more popular. The article presents a short analysis of development trends in engine control software. Also, mathematical-statistical models for engine power and torque growth are created. The introduced models give an opportunity to predict the probabilities of engine power or torque growth after individual reprogramming of the engine control software.

  1. Statistical Methods and Tools for Hanford Staged Feed Tank Sampling

    Fountain, Matthew S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Brigantic, Robert T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Peterson, Reid A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-10-01

    This report summarizes work conducted by Pacific Northwest National Laboratory to technically evaluate the current approach to staged feed sampling of high-level waste (HLW) sludge to meet waste acceptance criteria (WAC) for transfer from tank farms to the Hanford Waste Treatment and Immobilization Plant (WTP). The current sampling and analysis approach is detailed in the document titled Initial Data Quality Objectives for WTP Feed Acceptance Criteria, 24590-WTP-RPT-MGT-11-014, Revision 0 (Arakali et al. 2011). The goal of this current work is to evaluate and provide recommendations to support a defensible, technical and statistical basis for the staged feed sampling approach that meets WAC data quality objectives (DQOs).

  2. A new quantum statistical evaluation method for time correlation functions

    Loss, D.; Schoeller, H.

    1989-01-01

    Considering a system of N identical interacting particles, which obey Fermi-Dirac or Bose-Einstein statistics, the authors derive new formulas for correlation functions of the type $C(t) = \langle \sum_{i=1}^{N} A_i(t) \, \sum_{j=1}^{N} B_j \rangle$ (where $B_j$ is diagonal in the free-particle states) in the thermodynamic limit. Thereby they apply and extend a superoperator formalism recently developed for the derivation of long-time tails in semiclassical systems. As an illustrative application, the Boltzmann-equation value of the time-integrated correlation function C(t) is derived in a straightforward manner. Due to exchange effects, the obtained t-matrix and the resulting scattering cross section, which occurs in the Boltzmann collision operator, are now functionals of the Fermi-Dirac or Bose-Einstein distribution.

  3. Statistical Bayesian method for reliability evaluation based on ADT data

    Lu, Dawei; Wang, Lizhi; Sun, Yusheng; Wang, Xiaohong

    2018-05-01

    Accelerated degradation testing (ADT) is frequently conducted in the laboratory to predict a product's reliability under normal operating conditions. Two kinds of methods, degradation path models and stochastic process models, are utilized to analyze degradation data, and the latter is the more popular. However, limitations remain, such as an imprecise solution process and imprecise estimation of the degradation ratio, which may affect the accuracy of the acceleration model and the extrapolated value. Moreover, the existing solution to this problem, the Bayesian method, loses key information when unifying the degradation data. In this paper, a new data processing and parameter inference method based on the Bayesian method is proposed to handle degradation data and solve the problems above. First, a Wiener process and an acceleration model are chosen; second, the initial values of the degradation model and the parameters of the prior and posterior distributions at each stress level are calculated, with updating and iteration of the estimates; third, the lifetime and reliability values are estimated on the basis of the estimated parameters; finally, a case study is provided to demonstrate the validity of the proposed method. The results illustrate that the proposed method is effective and accurate in estimating the lifetime and reliability of a product.
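
    The core of the Wiener-process approach is estimating a drift and a diffusion from degradation increments and extrapolating to a failure threshold (the first-passage time is inverse Gaussian). A stripped-down, single-stress sketch on simulated data, without the acceleration model or the Bayesian updating:

      import numpy as np

      rng = np.random.default_rng(9)
      dt, n = 10.0, 200                       # hours between inspections, count
      mu0, sig0, thresh = 0.05, 0.2, 50.0     # true drift, diffusion, threshold
      incr = rng.normal(mu0 * dt, sig0 * np.sqrt(dt), n)   # degradation increments

      # Maximum-likelihood estimates from increments of a Wiener process.
      mu_hat = incr.sum() / (n * dt)
      sig2_hat = np.mean((incr - mu_hat * dt) ** 2 / dt)

      # Mean first-passage time to the threshold (inverse Gaussian mean).
      print(f"mu = {mu_hat:.4f}, sigma^2 = {sig2_hat:.4f}, "
            f"mean lifetime = {thresh / mu_hat:.0f} h")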

  4. A simple statistical method for catch comparison studies

    Holst, René; Revill, Andrew

    2009-01-01

    For analysing catch comparison data, we propose a simple method based on Generalised Linear Mixed Models (GLMM) and use polynomial approximations to fit the proportions caught in the test codend. The method provides comparisons of fish catch at length by the two gears through a continuous curve with a realistic confidence band. We demonstrate the versatility of this method on field data obtained from the first known testing in European waters of the Rhode Island (USA) 'Eliminator' trawl. These data are interesting as they include a range of species with different selectivity patterns.
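
    A simplified relative of the proposed model, dropping the random effects (an ordinary binomial GLM rather than a GLMM) but keeping the polynomial-in-length linear predictor for the proportion caught in the test codend; counts are simulated and statsmodels stands in for the authors' software.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(10)
      length = np.arange(10, 41, dtype=float)       # length classes (cm)
      n_test = rng.poisson(30, length.size)         # counts in test codend
      n_ctrl = rng.poisson(30, length.size)         # counts in control codend

      Ls = (length - length.mean()) / length.std()  # standardized length
      X = sm.add_constant(np.column_stack([Ls, Ls ** 2]))   # quadratic polynomial
      y = np.column_stack([n_test, n_ctrl])         # (successes, failures)

      fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()
      print(fit.params)                             # logit-scale coefficients
      print(fit.predict(X)[:5])                     # fitted proportions by length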

  6. Application of modern diagnostic methods to environmental improvement. Annual progress report, October 1994--September 1995

    Shepard, W.S.

    1995-12-01

    The Diagnostic Instrumentation and Analysis Laboratory (DIAL), an interdisciplinary research department in the College of Engineering at Mississippi State University (MSU), is under contract with the US Department of Energy (DOE) to develop and apply advanced diagnostic instrumentation and analysis techniques to aid in solving DOE's nuclear waste problem. The program is a comprehensive effort which includes five focus areas: advanced diagnostic systems development/application; torch operation and test facilities; process development; on-site field measurement and analysis; and technology transfer/commercialization. As part of this program, diagnostic methods will be developed and evaluated for characterization, monitoring and process control. Also, the measured parameters will be employed to improve, optimize and control the operation of the plasma torch and the overall plasma treatment process. Moreover, on-site field measurements at various DOE facilities are carried out to aid in the rapid demonstration and implementation of modern fieldable diagnostic methods. Such efforts also provide a basis for technology transfer.

  9. Rationalizing method of replacement intervals by using Bayesian statistics

    Kasai, Masao; Notoya, Junichi; Kusakari, Yoshiyuki

    2007-01-01

    This study presents formulations for rationalizing the replacement intervals of equipment and/or parts, taking into account the probability density functions (PDFs) of the parameters of the failure distribution functions (FDFs), and compares the intervals optimized by our formulations with those obtained from conventional formulations, which use only representative values of the FDF parameters instead of their PDFs. The failure data are generated by Monte Carlo simulation, since real failure data are not available to us. The PDFs of the FDF parameters are obtained by the Bayesian method, and the representative values are obtained by maximum likelihood estimation and the Bayesian method. We found that the method using PDFs obtained by the Bayesian method yields longer replacement intervals than the one using representative values of the parameters. (author)
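
    The contrast the study draws, optimizing over the full posterior of the failure-distribution parameters rather than a single representative value, can be sketched for age replacement under a Weibull model with known shape; the costs, grids and flat prior below are all invented.

      import numpy as np

      rng = np.random.default_rng(11)
      k, lam_true = 2.0, 100.0
      fails = lam_true * rng.weibull(k, 20)          # simulated failure ages

      # Grid posterior over the Weibull scale (shape k known, flat prior).
      lam = np.linspace(50.0, 200.0, 60)
      ll = np.sum(np.log(k / lam) + (k - 1) * np.log(fails[:, None] / lam)
                  - (fails[:, None] / lam) ** k, axis=0)
      post = np.exp(ll - ll.max()); post /= post.sum()

      cp, cf = 1.0, 10.0                             # planned vs failure cost

      def cost_rate(T, scale):
          # Expected cost per unit time for age replacement at interval T
          # (trapezoidal integration of the survival function).
          t = np.linspace(1e-6, T, 400)
          R = np.exp(-(t / scale) ** k)
          mean_cycle = ((R[:-1] + R[1:]) / 2).sum() * (t[1] - t[0])
          return (cp * R[-1] + cf * (1.0 - R[-1])) / mean_cycle

      T = np.linspace(20.0, 150.0, 80)
      c_bayes = [np.dot(post, [cost_rate(Ti, s) for s in lam]) for Ti in T]
      c_plug = [cost_rate(Ti, lam[np.argmax(post)]) for Ti in T]
      print("posterior-averaged optimum:", T[int(np.argmin(c_bayes))])
      print("plug-in (mode) optimum:    ", T[int(np.argmin(c_plug))])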

  10. Comparative Analysis of Kernel Methods for Statistical Shape Learning

    Rathi, Yogesh; Dambreville, Samuel; Tannenbaum, Allen

    2006-01-01

    .... In this work, we perform a comparative analysis of shape learning techniques such as linear PCA, kernel PCA, locally linear embedding and propose a new method, kernelized locally linear embedding...
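
    The abstract is truncated, but it names linear PCA and kernel PCA among the shape-learning techniques compared. As a minimal illustration of the difference, assuming nothing about the authors' data, the toy sketch below applies both to two concentric point "shapes" that linear PCA cannot separate:

```python
# Minimal contrast between linear PCA and kernel PCA, two of the
# techniques named in the abstract; data and parameters are toy choices.
import numpy as np
from sklearn.datasets import make_circles
from sklearn.decomposition import PCA, KernelPCA

X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)

lin = PCA(n_components=2).fit_transform(X)
rbf = KernelPCA(n_components=2, kernel="rbf", gamma=10.0).fit_transform(X)

# The two concentric shapes stay entangled under linear PCA but become
# separable along the first kernel principal component.
for name, Z in [("linear PCA", lin), ("kernel PCA", rbf)]:
    sep = abs(Z[y == 0, 0].mean() - Z[y == 1, 0].mean()) / Z[:, 0].std()
    print(f"{name}: class separation along PC1 = {sep:.2f}")
```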

  11. Statistical Genetics Methods for Localizing Multiple Breast Cancer Genes

    Ott, Jurg

    1998-01-01

    .... For a number of variables measured on a trait, a method, principal components of heritability, was developed that combines these variables in such a way that the resulting linear combination has highest heritability...

  12. THE MODERN TRENDS AND EXPERIMENTAL METHODS OF TRAINING FUTURE SPECIALISTS IN THE FIELD OF HISTRIONIC ART

    Julia Sergeevna Skvortsova

    2013-11-01

    Full Text Available This article reflects the specific ways of training specialists in the sphere of histrionic art, contemporary trends and special scientific experiments in the education of actors. We consider the traditional aspects of the occupational fitness of histrionic art specialists and modern requirements for an actor's psycho-physical training. Historical parallels are drawn in the study of the actor's energy in the research of K. Stanislavsky, M. Chekhov, T. Reebo, W. James and A. Maneggetty, as well as in the modern research of L. Gracheva. A rationale is given for including some Yoga elements and Academician M. Norbecov's system of exercises in actor training practice. In the article, Yoga for actors is considered as a system of emotional and physical preparation for artistic creative work, and as a method of self-control that lets an actor enter deeply into his creative condition. We also describe the beneficial results of applying, in our experimental laboratory of actors' Yoga classes, exercises that reveal the connection between a person's carriage and his condition while working on the outward demonstration of a given emotion in order to create the appropriate inner state. Object: studying the influence of Yoga elements included in the psycho-technique practice training on the productivity of actors' professional education. Methods: theoretical, experimental and observational methods of research. Results: after using these methods during the classes, the performance of the students' psycho-physical apparatus increased markedly. The area of application: the educational process at artistic creative high schools. DOI: http://dx.doi.org/10.12731/2218-7405-2013-7-52

  13. Statistics in science the foundations of statistical methods in biology, physics and economics

    Costantini, Domenico

    1990-01-01

    An inference may be defined as a passage of thought according to some method. In the theory of knowledge it is customary to distinguish deductive and non-deductive inferences. Deductive inferences are truth preserving, that is, the truth of the premises is preserved in the conclusion. As a result, the conclusion of a deductive inference is already 'contained' in the premises, although we may not know this fact until the inference is performed. Standard examples of deductive inferences are taken from logic and mathematics. Non-deductive inferences need not preserve truth, that is, 'thought may pass' from true premises to false conclusions. Such inferences can be expansive, or ampliative, in the sense that performing them actually increases our putative knowledge. Standard non-deductive inferences do not really exist, but one may think of elementary inductive inferences in which conclusions regarding the future are drawn from knowledge of the past. Since the body of scientific knowledge i...

  14. Statistical methods to assess and control processes and products during nuclear fuel fabrication

    Weidinger, H.

    1999-01-01

    Very good statistical tools and techniques are available today to assess the quality and the reliability of the fabrication process, the original source of a good and reliable quality of the fabricated products. Quality control charts of different types play a key role, and the high capability of modern electronic data acquisition technologies has proved, at least potentially, highly efficient for the more or less online application of these methods. These techniques focus mainly on the stability and reliability of the fabrication process. In addition, relatively simple statistical tools are available to assess the capability of a fabrication process, assuming it is stable, to fulfill the product specifications. All these techniques can only result in as good a product as the product design is able to describe the product requirements necessary for good performance. Therefore it is essential that product design is strictly and closely performance oriented. However, performance orientation is only successful through an open and effective cooperation with the customer who uses or applies those products. During the last one to two decades in the west, a multi-vendor strategy has been developed by the utilities, sometimes leading to three different fuel vendors for one reactor core. This development resulted in better economic conditions for the user but did not necessarily foster an open attitude of the vendor toward the using utility. The responsibility of the utility to ensure an adequate quality of the fuel it received increased considerably. As a matter of fact, the utilities sometimes had to pay a high price because of unexpected performance problems. Thus the utilities are now learning that they need to increase their knowledge and experience in the area of nuclear fuel quality management and technology. This process started some time ago in the west; however, it is now also reaching the utilities in the eastern countries. (author)
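
    As a minimal illustration of the control charts the abstract refers to, the sketch below computes Shewhart X-bar chart limits from subgroup ranges; the subgroup size, the data and the textbook constant A2 for n = 5 are our assumptions, not values from the paper.

```python
# Illustrative Shewhart X-bar/R control chart limits; all numbers
# (subgroups, target, spread, A2 constant for n = 5) are assumptions.
import numpy as np

rng = np.random.default_rng(1)
subgroups = rng.normal(10.0, 0.2, size=(25, 5))   # 25 samples of 5 items

xbar = subgroups.mean(axis=1)                     # subgroup means
R = subgroups.max(axis=1) - subgroups.min(axis=1) # subgroup ranges
A2 = 0.577                                        # chart constant for n = 5

center = xbar.mean()
ucl = center + A2 * R.mean()
lcl = center - A2 * R.mean()
out = np.where((xbar > ucl) | (xbar < lcl))[0]
print(f"CL={center:.3f}  UCL={ucl:.3f}  LCL={lcl:.3f}  out-of-control: {out}")
```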

  15. Groundwater vulnerability assessment: from overlay methods to statistical methods in the Lombardy Plain area

    Stefania Stevenazzi

    2017-06-01

    Full Text Available Groundwater is among the most important freshwater resources. Worldwide, aquifers are experiencing an increasing threat of pollution from urbanization, industrial development, agricultural activities and mining enterprises. Thus, practical actions, strategies and solutions to protect groundwater from these anthropogenic sources are widely required. The most efficient tool, which helps support land use planning while protecting groundwater from contamination, is groundwater vulnerability assessment. Over the years, several methods for assessing groundwater vulnerability have been developed: overlay and index methods, and statistical and process-based methods. All methods are means to synthesize complex hydrogeological information into a unique document, a groundwater vulnerability map, usable by planners, decision and policy makers, geoscientists and the public. Although it is not possible to identify an approach which would be the best one for all situations, the final product should always be scientifically defensible, meaningful and reliable. Nevertheless, various methods may produce very different results at any given site, so the reasons for similarities and differences need to be investigated in depth. This study demonstrates the reliability and flexibility of a spatial statistical method to assess groundwater vulnerability to contamination at a regional scale. The Lombardy Plain case study is particularly interesting for its long history of groundwater monitoring (quality and quantity), availability of hydrogeological data, and combined presence of various anthropogenic sources of contamination. Recent updates of the regional water protection plan have raised the necessity of producing more flexible, reliable and accurate groundwater vulnerability maps. A comparison of groundwater vulnerability maps obtained through different approaches and developed over a time span of several years has demonstrated the relevance of the

  16. Instrumental and statistical methods for the comparison of class evidence

    Liszewski, Elisa Anne

    Trace evidence is a major field within forensic science. Association of trace evidence samples can be problematic due to sample heterogeneity and a lack of quantitative criteria for comparing spectra or chromatograms. The aim of this study is to evaluate different types of instrumentation for their ability to discriminate among samples of various types of trace evidence. Chemometric analysis, including techniques such as Agglomerative Hierarchical Clustering, Principal Components Analysis, and Discriminant Analysis, was employed to evaluate instrumental data. First, automotive clear coats were analyzed by using microspectrophotometry to collect UV absorption data. In total, 71 samples were analyzed with classification accuracy of 91.61%. An external validation was performed, resulting in a prediction accuracy of 81.11%. Next, fiber dyes were analyzed using UV-Visible microspectrophotometry. While several physical characteristics of cotton fiber can be identified and compared, fiber color is considered to be an excellent source of variation, and thus was examined in this study. Twelve dyes were employed, some being visually indistinguishable. Several different analyses and comparisons were done, including an inter-laboratory comparison and external validations. Lastly, common plastic samples and other polymers were analyzed using pyrolysis-gas chromatography/mass spectrometry, and their pyrolysis products were then analyzed using multivariate statistics. The classification accuracy varied dependent upon the number of classes chosen, but the plastics were grouped based on composition. The polymers were used as an external validation and misclassifications occurred with chlorinated samples all being placed into the category containing PVC.
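
    A minimal sketch of the chemometric workflow described above (dimension reduction followed by discriminant analysis, with accuracy estimated by cross-validation) is given below; the synthetic "spectra" and class structure are stand-ins for the study's evidence data.

```python
# Toy PCA + discriminant-analysis pipeline in the spirit of the study's
# chemometric analysis; the Gaussian-peak "spectra" are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(12)
wavelengths = np.linspace(200, 400, 120)
spectra, labels = [], []
for c in range(4):                         # 4 "dye" classes, 20 samples each
    peak = 240 + 30 * c
    for _ in range(20):
        s = np.exp(-((wavelengths - peak) ** 2) / 500)
        spectra.append(s + rng.normal(0, 0.05, 120))
        labels.append(c)
X, y = np.array(spectra), np.array(labels)

model = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
acc = cross_val_score(model, X, y, cv=5).mean()
print(f"cross-validated classification accuracy = {acc:.1%}")
```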

  17. Non-Statistical Methods of Analysing of Bankruptcy Risk

    Pisula Tomasz

    2015-06-01

    Full Text Available The article focuses on assessing the effectiveness of a non-statistical approach to bankruptcy modelling in enterprises operating in the logistics sector. In order to describe the issue more comprehensively, the aforementioned prediction of possible negative results of business operations was carried out for companies functioning in the Polish region of Podkarpacie and in Slovakia. The bankruptcy predictors selected for the assessment of companies operating in the logistics sector included 28 financial indicators characterizing these enterprises in terms of their financial standing and management effectiveness. The purpose of the study was to identify factors (models) describing the bankruptcy risk in enterprises in the context of their forecasting effectiveness in one-year and two-year time horizons. In order to assess their practical applicability, the models were carefully analysed and validated. The usefulness of the models was assessed in terms of their classification properties, their capacity to accurately identify enterprises at risk of bankruptcy as well as healthy companies, and the proper calibration of the models to the data from the training sample sets.

  18. Comparison of Statistical Methods for Detector Testing Programs

    Rennie, John Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Abhold, Mark [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-10-14

    A typical goal for any detector testing program is to ascertain not only the performance of the detector systems under test, but also the confidence that systems accepted using that testing program’s acceptance criteria will exceed a minimum acceptable performance (which is usually expressed as the minimum acceptable success probability, p). A similar problem often arises in statistics, where we would like to ascertain the fraction, p, of a population of items that possess a property that may take one of two possible values. Typically, the problem is approached by drawing a fixed sample of size n, with the number of items out of n that possess the desired property, x, being termed successes. The sample mean gives an estimate of the population mean p ≈ x/n, although usually it is desirable to accompany such an estimate with a statement concerning the range within which p may fall and the confidence associated with that range. Procedures for establishing such ranges and confidence limits are described in detail by Clopper, Brown, and Agresti for two-sided symmetric confidence intervals.
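
    The exact (Clopper-Pearson) two-sided interval discussed in the abstract can be computed directly from the beta distribution; the detector counts below are made-up numbers.

```python
# Exact (Clopper-Pearson) two-sided confidence interval for a binomial
# success probability p, as discussed in the abstract.
from scipy.stats import beta

def clopper_pearson(x, n, conf=0.95):
    """Exact two-sided interval for p given x successes in n trials."""
    a = (1 - conf) / 2
    lo = beta.ppf(a, x, n - x + 1) if x > 0 else 0.0
    hi = beta.ppf(1 - a, x + 1, n - x) if x < n else 1.0
    return lo, hi

# e.g. 48 of 50 detectors pass the acceptance test (invented counts)
lo, hi = clopper_pearson(48, 50)
print(f"p is in [{lo:.3f}, {hi:.3f}] with 95% confidence")
```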

  19. Data Analysis & Statistical Methods for Command File Errors

    Meshkat, Leila; Waggoner, Bruce; Bryant, Larry

    2014-01-01

    This paper explains current work on modeling for managing the risk of command file errors. It is focused on analyzing actual data from a JPL spaceflight mission to build models for evaluating and predicting error rates as a function of several key variables. We constructed a rich dataset by considering the number of errors and the number of files radiated, including the number of commands and blocks in each file, as well as subjective estimates of workload and operational novelty. We have assessed these data using different curve fitting and distribution fitting techniques, such as multiple regression analysis and maximum likelihood estimation, to see how much of the variability in the error rates can be explained by them. We have also used goodness-of-fit testing strategies and principal component analysis to further assess our data. Finally, we constructed a model of expected error rates based on what these statistics bore out as critical drivers of the error rate. This model allows project management to evaluate the error rate against a theoretically expected rate as well as anticipate future error rates.

  20. Statistically Efficient Methods for Pitch and DOA Estimation

    Jensen, Jesper Rindom; Christensen, Mads Græsbøll; Jensen, Søren Holdt

    2013-01-01

    Traditionally, direction-of-arrival (DOA) and pitch estimation of multichannel, periodic sources have been considered as two separate problems. Separate estimation may render the task of resolving sources with similar DOA or pitch impossible, and it may decrease the estimation accuracy. Therefore, it was recently considered to estimate the DOA and pitch jointly. In this paper, we propose two novel methods for DOA and pitch estimation. They both yield maximum-likelihood estimates in white Gaussian noise scenarios, where the SNR may be different across channels, as opposed to state-of-the-art methods.

  1. The emergence of modern statistics in agricultural science: analysis of variance, experimental design and the reshaping of research at Rothamsted Experimental Station, 1919-1933.

    Parolini, Giuditta

    2015-01-01

    During the twentieth century statistical methods have transformed research in the experimental and social sciences. Qualitative evidence has largely been replaced by quantitative results and the tools of statistical inference have helped foster a new ideal of objectivity in scientific knowledge. The paper will investigate this transformation by considering the genesis of analysis of variance and experimental design, statistical methods nowadays taught in every elementary course of statistics for the experimental and social sciences. These methods were developed by the mathematician and geneticist R. A. Fisher during the 1920s, while he was working at Rothamsted Experimental Station, where agricultural research was in turn reshaped by Fisher's methods. Analysis of variance and experimental design required new practices and instruments in field and laboratory research, and imposed a redistribution of expertise among statisticians, experimental scientists and the farm staff. On the other hand the use of statistical methods in agricultural science called for a systematization of information management and made computing an activity integral to the experimental research done at Rothamsted, permanently integrating the statisticians' tools and expertise into the station research programme. Fisher's statistical methods did not remain confined within agricultural research and by the end of the 1950s they had come to stay in psychology, sociology, education, chemistry, medicine, engineering, economics, quality control, just to mention a few of the disciplines which adopted them.

  2. Statistical methods for mass spectrometry-based clinical proteomics

    Kakourou, A.

    2018-01-01

    The work presented in this thesis focuses on methods for the construction of diagnostic rules based on clinical mass spectrometry proteomic data. Mass spectrometry has become one of the key technologies for jointly measuring the expression of thousands of proteins in biological samples.

  3. Statistical comparison of excystation methods in Cryptosporidium parvum oocysts

    Pecková, R.; Stuart, P. D.; Sak, Bohumil; Květoňová, Dana; Kváč, Martin; Foitová, I.

    2016-01-01

    Roč. 230, OCT 30 (2016), s. 1-5 ISSN 0304-4017 R&D Projects: GA ČR(CZ) GAP505/11/1163 Institutional support: RVO:60077344 Keywords : Cryptosporidium parvum * excystation methods * in vitro cultivation * sodium hypochlorite * trypsin Subject RIV: EG - Zoology Impact factor: 2.356, year: 2016

  4. Application of few-body methods to statistical mechanics

    Bolle, D.

    1981-01-01

    This paper reviews some of the methods to study the thermodynamic properties of a macroscopic system in terms of the scattering processes between the constituent particles in the system. In particular, we discuss the time delay approach to the virial expansion and the use of the arrangement channel quantum mechanics formulation in kinetic theory. (orig.)

  5. CAPABILITY ASSESSMENT OF MEASURING EQUIPMENT USING STATISTIC METHOD

    Pavel POLÁK

    2014-10-01

    Full Text Available Capability assessment of the measurement device is one of the methods of process quality control. Only if the measurement device is capable can the capability of the measurement and, consequently, of the production process be assessed. This paper deals with assessing the capability of the measuring device using the indices Cg and Cgk.
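
    One common convention for the indices Cg and Cgk (20% of the tolerance width against a 6-sigma measurement spread) can be written down in a few lines; the convention, the measurement data and the tolerance below are illustrative assumptions, since definitions vary between guidelines.

```python
# Capability indices Cg and Cgk for a measurement device, using one
# common convention (k = 20% of the tolerance width); data are invented.
import numpy as np

def cg_cgk(measurements, reference, tol_width, k=0.2):
    x = np.asarray(measurements)
    s = x.std(ddof=1)                          # repeatability of the device
    cg = (k * tol_width) / (6 * s)             # spread-only index
    cgk = (k * tol_width / 2 - abs(x.mean() - reference)) / (3 * s)  # with bias
    return cg, cgk

rng = np.random.default_rng(2)
meas = rng.normal(5.0005, 0.0004, 50)   # 50 repeat measurements of a standard
cg, cgk = cg_cgk(meas, reference=5.000, tol_width=0.02)
print(f"Cg = {cg:.2f}, Cgk = {cgk:.2f}  (often required to be >= 1.33)")
```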

  6. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A.; van t Veld, Aart A.

    2012-01-01

    PURPOSE: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator
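
    Of the three learning methods compared, the LASSO is the easiest to sketch: an L1-penalised logistic model for a binary complication endpoint. The features and outcome below are synthetic stand-ins for the dosimetric and clinical variables of a real NTCP study, and the penalty strength is an arbitrary choice.

```python
# Hedged sketch of an L1-penalised (LASSO) logistic NTCP-style model;
# synthetic predictors and outcome, illustrative penalty strength.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 20))                 # 20 candidate predictors
logit = 1.2 * X[:, 0] + 0.8 * X[:, 1] - 1.0    # only two truly matter
p = 1 / (1 + np.exp(-logit))
y = (rng.random(200) < p).astype(int)          # binary complication endpoint

lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
auc = cross_val_score(lasso, X, y, cv=5, scoring="roc_auc").mean()
lasso.fit(X, y)
kept = np.flatnonzero(lasso.coef_)             # predictors surviving the L1 penalty
print(f"cross-validated AUC = {auc:.2f}; selected predictors: {kept}")
```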

  7. Comparison between statistical and optimization methods in accessing unmixing of spectrally similar materials

    Debba, Pravesh

    2010-11-01

    Full Text Available This paper reports on the results from ordinary least squares and ridge regression as statistical methods, which are compared to numerical optimization methods such as the stochastic method for global optimization, simulated annealing, particle swarm...
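
    The two statistical methods named above reduce to a few lines of linear algebra for a toy unmixing problem. In the sketch below the endmember spectra are random surrogates, two of them made deliberately similar, and the ridge penalty is an illustrative value:

```python
# Toy linear spectral unmixing with OLS versus ridge regression; the
# endmembers, mixture and penalty are invented for illustration.
import numpy as np

rng = np.random.default_rng(4)
E = rng.random((50, 3))                    # 50 bands, 3 endmember spectra
E[:, 1] = E[:, 0] + 0.05 * rng.random(50)  # two spectrally similar materials
abund = np.array([0.5, 0.3, 0.2])
y = E @ abund + rng.normal(0, 0.01, 50)    # mixed pixel with noise

ols = np.linalg.lstsq(E, y, rcond=None)[0]
lam = 0.1                                  # illustrative ridge penalty
ridge = np.linalg.solve(E.T @ E + lam * np.eye(3), E.T @ y)
print("OLS  :", np.round(ols, 3))
print("ridge:", np.round(ridge, 3))  # shrinkage stabilises the collinear pair
```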

  8. Statistical Analysis Methods for the fMRI Data

    Huseyin Boyaci

    2011-08-01

    Full Text Available Functional magnetic resonance imaging (fMRI) is a safe and non-invasive way to assess brain functions by using signal changes associated with brain activity. The technique has become a ubiquitous tool in basic, clinical and cognitive neuroscience. This method can measure the small metabolic changes that occur in the active parts of the brain. We process fMRI data to find the parts of the brain that are involved in a mechanism, or to determine the changes in brain activity that occur due to a brain lesion. In this study we give an overview of the methods used for the analysis of fMRI data.

  9. A method for the statistical interpretation of friction ridge skin impression evidence: Method development and validation.

    Swofford, H J; Koertner, A J; Zemp, F; Ausdemore, M; Liu, A; Salyards, M J

    2018-04-03

    The forensic fingerprint community has faced increasing amounts of criticism by scientific and legal commentators, challenging the validity and reliability of fingerprint evidence due to the lack of an empirically demonstrable basis to evaluate and report the strength of the evidence in a given case. This paper presents a method, developed as a stand-alone software application, FRStat, which provides a statistical assessment of the strength of fingerprint evidence. The performance was evaluated using a variety of mated and non-mated datasets. The results show strong performance characteristics, often with values supporting specificity rates greater than 99%. This method provides fingerprint experts the capability to demonstrate the validity and reliability of fingerprint evidence in a given case and report the findings in a more transparent and standardized fashion with clearly defined criteria for conclusions and known error rate information thereby responding to concerns raised by the scientific and legal communities. Published by Elsevier B.V.

  10. Statistics of electron multiplication in multiplier phototube: iterative method

    Grau Malonda, A.; Ortiz Sanchez, J.F.

    1985-01-01

    An iterative method is applied to study the variation of dynode response in the multiplier phototube. Three different situations are considered, corresponding to the following ways of electron incidence on the first dynode: incidence of exactly one electron, incidence of exactly r electrons, and incidence of an average of r̄ electrons. The responses are given for a number of steps between 1 and 5, and for values of the multiplication factor of 2.1, 2.5, 3 and 5. We also study the variance, the skewness and the excess kurtosis for different multiplication factors. (author)

  11. Statistics of electron multiplication in a multiplier phototube; Iterative method

    Ortiz, J. F.; Grau, A.

    1985-01-01

    In the present paper an iterative method is applied to study the variation of dynode response in the multiplier phototube. Three different situations are considered, corresponding to the following ways of electron incidence on the first dynode: incidence of exactly one electron, incidence of exactly r electrons, and incidence of an average of r̄ electrons. The responses are given for a number of steps between 1 and 5, and for values of the multiplication factor of 2.1, 2.5, 3 and 5. We also study the variance, the skewness and the excess kurtosis for different multiplication factors. (Author) 11 refs
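
    Records 10 and 11 describe the same iterative analysis; a small Monte Carlo rendering of the cascade is easy to write down if one assumes, purely for illustration, that each electron striking a dynode releases a Poisson-distributed number of secondaries with mean equal to the multiplication factor. The stage counts and factors follow the ranges quoted in the abstracts.

```python
# Hypothetical Monte Carlo check of the dynode cascade statistics;
# Poisson branching per stage is our modelling assumption.
import numpy as np
from scipy.stats import kurtosis, skew

rng = np.random.default_rng(5)

def cascade(m, stages, n_trials=200_000):
    """Electrons after `stages` dynodes, one incident electron per trial."""
    n = np.ones(n_trials, dtype=np.int64)
    for _ in range(stages):
        n = rng.poisson(m * n)   # each electron releases ~Poisson(m) secondaries
    return n

for m in (2.1, 2.5, 3.0, 5.0):
    out = cascade(m, stages=5)
    print(f"m={m}: mean={out.mean():9.1f}  var={out.var():12.1f}  "
          f"skew={skew(out):.2f}  excess kurtosis={kurtosis(out):.2f}")
```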

  12. Statistical inference methods for two crossing survival curves: a comparison of methods.

    Li, Huimin; Han, Dong; Hou, Yawen; Chen, Huilin; Chen, Zheng

    2015-01-01

    A common problem encountered in medical applications is assessing the overall homogeneity of survival distributions when two survival curves cross each other. A survey demonstrated that under this condition, which is an obvious violation of the assumption of proportional hazard rates, the log-rank test was still used in 70% of studies. Several statistical methods have been proposed to solve this problem. However, in many applications it is difficult to specify the type of survival difference and choose an appropriate method prior to analysis. Thus, we conducted an extensive series of Monte Carlo simulations to investigate the power and type I error rate of these procedures under various patterns of crossing survival curves with different censoring rates and distribution parameters. Our objective was to evaluate the strengths and weaknesses of the tests in different situations and for various censoring rates, and to recommend an appropriate test that will not fail for a wide range of applications. The simulation studies demonstrated that adaptive Neyman's smooth tests and the two-stage procedure offer higher power and greater stability than other methods when the survival distributions cross at early, middle or late times. Even for proportional hazards, both methods maintain acceptable power compared with the log-rank test. In terms of the type I error rate, the Rényi and Cramér-von Mises tests are relatively conservative, whereas the statistics of the Lin-Xu test exhibit apparent inflation as the censoring rate increases. Other tests produce results close to the nominal 0.05 level. In conclusion, adaptive Neyman's smooth tests and the two-stage procedure are found to be the most stable and feasible approaches for a variety of situations and censoring rates. Therefore, they are applicable to a wider spectrum of alternatives compared with other tests.
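
    As a baseline for the comparison described above, the log-rank test on two samples with crossing hazards can be run with the lifelines package; the Weibull parameters below are invented precisely to make the curves cross, and nothing here reproduces the paper's simulation design.

```python
# Log-rank test on two invented samples whose survival curves cross,
# illustrating the baseline scenario the paper investigates.
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(6)
a = rng.weibull(0.7, 150) * 10     # decreasing hazard
b = rng.weibull(1.8, 150) * 8      # increasing hazard -> curves cross
censor = rng.uniform(5, 20, 150)   # independent censoring times
obs_a, obs_b = a < censor, b < censor
t_a, t_b = np.minimum(a, censor), np.minimum(b, censor)

res = logrank_test(t_a, t_b, event_observed_A=obs_a, event_observed_B=obs_b)
print(f"log-rank p = {res.p_value:.3f}  "
      "(may lack power under crossing hazards, as the paper notes)")
```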

  13. Students' Attitudes toward Statistics across the Disciplines: A Mixed-Methods Approach

    Griffith, James D.; Adams, Lea T.; Gu, Lucy L.; Hart, Christian L.; Nichols-Whitehead, Penney

    2012-01-01

    Students' attitudes toward statistics were investigated using a mixed-methods approach including a discovery-oriented qualitative methodology among 684 undergraduate students across business, criminal justice, and psychology majors where at least one course in statistics was required. Students were asked about their attitudes toward statistics and…

  14. Counting Better? An Examination of the Impact of Quantitative Method Teaching on Statistical Anxiety and Confidence

    Chamberlain, John Martyn; Hillier, John; Signoretta, Paola

    2015-01-01

    This article reports the results of research concerned with students' statistical anxiety and confidence to both complete and learn to complete statistical tasks. Data were collected at the beginning and end of a quantitative method statistics module. Students recognised the value of numeracy skills but felt they were not necessarily relevant for…

  15. The modern methods of treatment of patients with radiation syndrome in a specialized hospital (analytical review)

    Selidovkin, G.D.

    1995-01-01

    Modern methods of treating patients with various symptoms of acute radiation disease in a specialized hospital are reviewed. The treatment starts with prophylactic prescription of antibacterial antibiotics of the latest generations (imipenem, third-generation cephalosporins), antifungal and antiviral remedies and immunoglobulin G; selective decontamination of the intestines should be carried out. Transfusion of donor thrombocytes and erythrocytes should be provided in quantities adequate to the degree of the observed cytopenia. Full parenteral feeding should be prescribed, and detoxication and adjustment therapy administered. Transplantation of HLA-identical bone marrow may be recommended only in the range from 10 to 15 Gy of close-to-uniform irradiation. The most promising agents in the therapy of acute radiation disease of degree 3-4 are hemopoietic growth factors. 76 refs.; 2 tabs

  16. Multivariate statistical methods and data mining in particle physics (4/4)

    CERN. Geneva

    2008-01-01

    The lectures will cover multivariate statistical methods and their applications in High Energy Physics. The methods will be viewed in the framework of a statistical test, as used e.g. to discriminate between signal and background events. Topics will include an introduction to the relevant statistical formalism, linear test variables, neural networks, probability density estimation (PDE) methods, kernel-based PDE, decision trees and support vector machines. The methods will be evaluated with respect to criteria relevant to HEP analyses such as statistical power, ease of computation and sensitivity to systematic effects. Simple computer examples that can be extended to more complex analyses will be presented.

  17. Multivariate statistical methods and data mining in particle physics (2/4)

    CERN. Geneva

    2008-01-01

    The lectures will cover multivariate statistical methods and their applications in High Energy Physics. The methods will be viewed in the framework of a statistical test, as used e.g. to discriminate between signal and background events. Topics will include an introduction to the relevant statistical formalism, linear test variables, neural networks, probability density estimation (PDE) methods, kernel-based PDE, decision trees and support vector machines. The methods will be evaluated with respect to criteria relevant to HEP analyses such as statistical power, ease of computation and sensitivity to systematic effects. Simple computer examples that can be extended to more complex analyses will be presented.

  18. Multivariate statistical methods and data mining in particle physics (1/4)

    CERN. Geneva

    2008-01-01

    The lectures will cover multivariate statistical methods and their applications in High Energy Physics. The methods will be viewed in the framework of a statistical test, as used e.g. to discriminate between signal and background events. Topics will include an introduction to the relevant statistical formalism, linear test variables, neural networks, probability density estimation (PDE) methods, kernel-based PDE, decision trees and support vector machines. The methods will be evaluated with respect to criteria relevant to HEP analyses such as statistical power, ease of computation and sensitivity to systematic effects. Simple computer examples that can be extended to more complex analyses will be presented.
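
    In the spirit of the "simple computer examples" the lecture abstract promises, the sketch below uses a decision tree as a statistical test to separate signal from background events; the two Gaussian event classes are toy data, not a physics sample.

```python
# Decision tree as a signal/background discriminant on toy event data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(7)
sig = rng.normal([1.0, 1.0], 1.0, size=(5000, 2))    # signal events
bkg = rng.normal([-1.0, -1.0], 1.0, size=(5000, 2))  # background events
X = np.vstack([sig, bkg])
y = np.r_[np.ones(5000), np.zeros(5000)]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = DecisionTreeClassifier(max_depth=4).fit(X_tr, y_tr)
eff = clf.predict(X_te[y_te == 1]).mean()        # signal efficiency
rej = 1 - clf.predict(X_te[y_te == 0]).mean()    # background rejection
print(f"signal efficiency = {eff:.2f}, background rejection = {rej:.2f}")
```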

  19. MODERN INSTRUMENTAL METHODS TO CONTROL THE SEED QUALITY IN ROOT VEGETABLES

    F. B. Musaev

    2017-01-01

    Full Text Available The standard methods of analysis do not meet all modern requirements for determining seed quality. These methods cannot reveal internal defects that are very important for assessing seed viability. The capabilities of a new instrumental method for analyzing the seed quality of root vegetables are considered in the article. The method of micro-focus radiography is distinguished from other existing methods by greater sensitivity, rapidity and ease of use. By visualizing the inner seed structure, which is of practical importance, it allows determining, well before seed germination, the degree of development of the endosperm and embryo, the presence of internal damage and infections, and infestation and damage caused by pests. The use of micro-focus radiography makes it possible to detect differences in seed quality for traits such as monogermity and self-fertilization that are economically valuable for breeding programmes in red beet. With the aid of the method, the level of seed development, damage and internal defects in carrot and parsnip can be revealed. In X-ray projection, seeds of inbred radish lines differed significantly from those of the variety population in the underdevelopment of their inner structure. The advantage of the method is that seeds remain undamaged after quality analysis and can both be used for further examination with other methods and be sown, which is quite important for breeders handling small quantities of plant breeding material or collection accessions. The results of radiography analyses can be saved and archived, which enables seed quality to be followed over time; these data can also be used in possible arbitration cases.

  20. Refining developmental coordination disorder subtyping with multivariate statistical methods

    Lalanne Christophe

    2012-07-01

    Full Text Available Abstract Background With a large number of potentially relevant clinical indicators, penalization and ensemble learning methods are thought to provide better predictive performance than the usual linear predictors. However, little is known about how they perform in clinical studies where few cases are available. We used Random Forests and Partial Least Squares Discriminant Analysis to select the most salient impairments in Developmental Coordination Disorder (DCD) and to assess patient similarity. Methods We considered a wide-ranging testing battery for various neuropsychological and visuo-motor impairments which aimed at characterizing subtypes of DCD in a sample of 63 children. Classifiers were optimized on a training sample, and they were subsequently used to rank the 49 items according to a permuted measure of variable importance. In addition, subtyping consistency was assessed with cluster analysis on the training sample. Clustering fitness and predictive accuracy were evaluated on the validation sample. Results Both classifiers yielded a relevant subset of item impairments that altogether accounted for a sharp discrimination between three DCD subtypes: ideomotor, visual-spatial and constructional, and mixed dyspraxia. The main impairments that were found to characterize the three subtypes were: digital perception, imitation of gestures, digital praxia, lego blocks, visual spatial structuration, visual motor integration, and coordination between upper and lower limbs. Classification accuracy was above 90% for all classifiers, and clustering fitness was found to be satisfactory. Conclusions Random Forests and Partial Least Squares Discriminant Analysis are useful tools to extract salient features from a large pool of correlated binary predictors, but they also provide a way to assess individuals' proximities in a reduced factor space. Fewer than 15 neuro-visual, neuro-psychomotor and neuro-psychological tests might be required to provide a sensitive and
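
    The feature-ranking step described above (a Random Forest with a permuted measure of variable importance over correlated binary items) can be sketched as follows; the 63 children, 49 items and three subtypes are simulated placeholders for the real DCD battery.

```python
# Random Forest with permutation-based variable importance over binary
# test items; all data are simulated placeholders for the DCD battery.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(8)
X = (rng.random((63, 49)) < 0.5).astype(int)   # 63 children, 49 binary items
subtype = rng.integers(0, 3, 63)               # 3 dyspraxia subtypes
X[:, 0] = (subtype == 0).astype(int)           # make a few items informative
X[:, 1] = (subtype == 2).astype(int)

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, subtype)
imp = permutation_importance(rf, X, subtype, n_repeats=20, random_state=0)
top = np.argsort(imp.importances_mean)[::-1][:5]
print("most salient items:", top)
```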

  1. Modern Cored Wire Injection 2PE-9 Method in the Production of Ductile Iron

    E. Guzik

    2012-04-01

    Full Text Available The results of studies on the use of a modern two-cored-wire injection method for the production of nodular graphite cast iron, with a unique implementation of a drum ladle as a treatment/transport and casting ladle instead of a vertical treatment ladle, are described. The injection of Ø 9 mm cored wires, filled with a FeSi + Mg nodulariser mixture and an inoculant master alloy, is a treatment method which can be used for iron melted in a coreless induction furnace. This paper describes the results of using this method for the production of ductile iron under specific industrial conditions. In this case, ductile iron of grade EN-GJS-450-10 according to PN-EN 1563:2000 was produced. The microstructure of 28 trials was controlled on an internally used sample which had previously been correlated with the standard sample. The paper presents the typical metallic matrix and graphite characteristics. Additionally, mechanical properties were checked in one experiment. Because of the possibility of further treatment temperature reduction, only the rough magnesium recovery and the cost of this new method are given.

  2. Energy demand forecasting method based on international statistical data

    Glanc, Z.; Kerner, A.

    1997-01-01

    Poland is in a transition phase from a centrally planned to a market economy; data collected under former economic conditions do not reflect a market economy. Final energy demand forecasts are based on the assumption that the economic transformation in Poland will gradually lead the Polish economy, technologies and modes of energy use, to the same conditions as mature market economy countries. The starting point has a significant influence on the future energy demand and supply structure: final energy consumption per capita in 1992 was almost half the average of OECD countries; energy intensity, based on Purchasing Power Parities (PPP) and referred to GDP, is more than 3 times higher in Poland. A method of final energy demand forecasting based on regression analysis is described in this paper. The input data are: output of macroeconomic and population growth forecast; time series 1970-1992 of OECD countries concerning both macroeconomic characteristics and energy consumption; and energy balance of Poland for the base year of the forecast horizon. (author). 1 ref., 19 figs, 4 tabs

  3. Energy demand forecasting method based on international statistical data

    Glanc, Z; Kerner, A [Energy Information Centre, Warsaw (Poland)

    1997-09-01

    Poland is in a transition phase from a centrally planned to a market economy; data collected under former economic conditions do not reflect a market economy. Final energy demand forecasts are based on the assumption that the economic transformation in Poland will gradually lead the Polish economy, technologies and modes of energy use, to the same conditions as mature market economy countries. The starting point has a significant influence on the future energy demand and supply structure: final energy consumption per capita in 1992 was almost half the average of OECD countries; energy intensity, based on Purchasing Power Parities (PPP) and referred to GDP, is more than 3 times higher in Poland. A method of final energy demand forecasting based on regression analysis is described in this paper. The input data are: output of macroeconomic and population growth forecast; time series 1970-1992 of OECD countries concerning both macroeconomic characteristics and energy consumption; and energy balance of Poland for the base year of the forecast horizon. (author). 1 ref., 19 figs, 4 tabs.
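
    Records 2 and 3 describe the same report; the regression idea at its core can be illustrated in a few lines. The sketch below fits a log-log relation between GDP per capita and final energy consumption per capita on pooled historical observations and reads off demand for a projected macro figure; all numbers are fabricated stand-ins for the OECD time series the report uses.

```python
# Minimal regression-based demand forecast: energy = a * gdp^b fitted
# in log-log space; every number here is a fabricated illustration.
import numpy as np

# pooled historical observations: GDP per capita (PPP, k$) vs
# final energy consumption per capita (toe)
gdp = np.array([4, 6, 8, 10, 12, 15, 18, 22])
energy = np.array([1.1, 1.5, 1.8, 2.1, 2.3, 2.6, 2.8, 3.1])

b, log_a = np.polyfit(np.log(gdp), np.log(energy), 1)

gdp_forecast = 14.0                        # projected GDP per capita
demand = np.exp(log_a) * gdp_forecast ** b
print(f"elasticity b = {b:.2f}; projected demand = {demand:.2f} toe/capita")
```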

  4. Statistical methods for the forensic analysis of striated tool marks

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
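
    A stripped-down version of the proposed likelihood-ratio comparison, with similarity scores modelled as normal samples and tested for a common mean, is sketched below; the scores are synthetic, a pooled standard deviation is used for simplicity, and the real analysis operates on profilometry data rather than scalar scores.

```python
# Toy likelihood-ratio test: do lab-vs-lab and lab-vs-field comparison
# scores share a common mean? Scores and the pooled-sigma simplification
# are our assumptions, not the paper's model.
import numpy as np
from scipy.stats import chi2, norm

rng = np.random.default_rng(9)
lab_lab = rng.normal(0.80, 0.05, 12)    # known-match comparison scores
lab_field = rng.normal(0.72, 0.05, 12)  # field mark vs suspect tool

def loglik(x, mu, sigma):
    return norm.logpdf(x, mu, sigma).sum()

pooled = np.r_[lab_lab, lab_field]
s = pooled.std(ddof=0)                  # common sigma for simplicity
ll_null = loglik(pooled, pooled.mean(), s)
ll_alt = (loglik(lab_lab, lab_lab.mean(), s)
          + loglik(lab_field, lab_field.mean(), s))
lr = 2 * (ll_alt - ll_null)             # ~ chi2(1) under the null
print(f"LR statistic = {lr:.2f}, p = {chi2.sf(lr, df=1):.3f}")
```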

  5. Statistical methods used in the public health literature and implications for training of public health professionals.

    Hayat, Matthew J; Powell, Amanda; Johnson, Tessa; Cadwell, Betsy L

    2017-01-01

    Statistical literacy and knowledge are needed to read and understand the public health literature. The purpose of this study was to quantify the basic and advanced statistical methods used in public health research. We randomly sampled 216 published articles from seven top-tier general public health journals. Studies were reviewed by two readers and a standardized data collection form was completed for each article. Data were analyzed with descriptive statistics and frequency distributions. Results were summarized for the statistical methods used in the literature, including descriptive and inferential statistics, modeling, advanced statistical techniques, and statistical software used. Approximately 81.9% of articles reported an observational study design and 93.1% of articles were substantively focused. Descriptive statistics in table or graphical form were reported in more than 95% of the articles, and statistical inference was reported in more than 76% of the studies reviewed. These results reveal the types of statistical methods currently used in the public health literature. Although this study did not obtain information on what should be taught, information on the statistical methods being used is useful for curriculum development in graduate health sciences education, as well as for making informed decisions about continuing education for public health professionals.

  6. Modern Methods of Multidimensional Data Visualization: Analysis, Classification, Implementation, and Applications in Technical Systems

    I. K. Romanova

    2016-01-01

    Full Text Available The article deals with theoretical and practical aspects of solving the problem of visualization of multidimensional data as an effective means of multivariate analysis of systems. Several classifications are proposed for visualization techniques, according to data types, visualization objects, and the method of transformation of coordinates and data; the classifications are presented as charts with links to the relevant work. The article also proposes two classifications of modern trends in display technology, including the integration of visualization techniques as one of the modern development trends, along with the introduction of interactive technologies and the dynamics of development processes. It describes some approaches to the visualization problem that are driven by current needs, generated by tasks such as information retrieval in global networks, the development of bioinformatics, the study and control of business processes, the development of regions, etc. The article highlights modern visualization tools which are capable of improving the efficiency of multivariate analysis and the search for solutions in multi-objective optimization of technical systems, but which are not very actively used for such studies: horizontal graphs, quantile-quantile plots, etc. The paper proposes to use choropleth maps, traditionally used in cartography, for the simultaneous presentation of the spatial distribution of several criteria. It notes that visualizations of graphs in network applications could be used more actively to describe control systems. The article suggests using heat maps to provide a graphical representation of the sensitivity of system quality criteria under variations of options (multivariate analysis of technical systems). It also mentions that it is useful to extend heat maps to the task of estimating the quality of identification in constructing system models. A

  7. [A brief history of resuscitation - the influence of previous experience on modern techniques and methods].

    Kucmin, Tomasz; Płowaś-Goral, Małgorzata; Nogalski, Adam

    2015-02-01

    Cardiopulmonary resuscitation (CPR) is a relatively novel branch of medical science; however, the first descriptions of mouth-to-mouth ventilation are to be found in the Bible, and the literature is full of descriptions of different resuscitation methods - from flagellation and ventilation with bellows, through hanging the victims upside down and compressing the chest in order to stimulate ventilation, to rectal fumigation with tobacco smoke. The modern history of CPR starts with Kouwenhoven et al., who in 1960 published a paper regarding heart massage through chest compressions. Shortly after that, in 1961, Peter Safar presented a paradigm promoting opening the airway, performing rescue breaths and chest compressions. The first CPR guidelines were published in 1966. Since that time the guidelines have been modified and improved numerous times by the two leading world expert organizations, the ERC (European Resuscitation Council) and the AHA (American Heart Association), and published in a new version every 5 years; currently the 2010 guidelines apply. In this paper the authors attempt to present the history of the development of resuscitation techniques and methods and to assess the influence of previous lifesaving methods on present-day technologies, equipment and guidelines which make it possible to help those women and men whose lives are in danger due to sudden cardiac arrest. © 2015 MEDPRESS.

  8. Modern methods of high-pressure fuel pump common rail power system diagnostics

    Kyshchun В.

    2016-08-01

    Full Text Available We consider the design features of high-pressure fuel pumps and the equipment for their diagnosis. It is noted that the reliability of the fuel elements of the Common Rail system is provided primarily by the precision parts of the fuel equipment. Consequently, the aim of the study was a comparative analysis of modern methods of diagnosing the high-pressure fuel pump, including their labour-intensiveness. In particular, the technical condition of the fuel pump was determined using a special test stand and by measuring the fuel pressure and the duty cycle of the pressure regulator signal. As the object of our research we chose a Bosch № 0445010008 fuel pump (from a Mercedes-Benz E320cdi) in which plunger pairs in different technical conditions were installed in turn. Preliminary fuel pump parameters were determined by hydraulic testing. Based on the experiments conducted, we found that the method of measuring the fuel pressure change and the duty cycle of the pressure regulator signal at the starting and full-load modes is less labour-intensive than determining the technical condition of the pump on the test stand. The results of both diagnostic methods confirmed the identity of the fuel pumps.

  9. Use of modern methods of fibre surface modification to obtain the multifunctional properties of textile materials

    Jocić Dragan

    2003-01-01

    Full Text Available The modern textile fibre treatments aim to obtain the required level of beneficial effect while attempting to confine the modification to the fibre surface. Recently, much attention has been focused on different physical methods of fibre surface modification, with cold plasma treatment considered especially useful. Moreover, efficient chemical methods are available, such as peroxide, biopolymer and enzyme treatment. Some interesting combinations of these physical and chemical surface modification methods, as a means to modify fibre surface topography and thus control the surface-related properties of the fibre, are presented in this paper. The properties obtained are discussed on the basis of the physico-chemical changes in the surface layer of the fibre, as assessed by wettability and contact angle measurements, as well as by FTIR-ATR and XPS analysis. The SEM and AFM techniques are used to assess the changes in the fibre surface topography and to correlate these changes with the effectiveness, uniformity and severity of the textile fibre surface modification treatments.

  10. Statistical method for resolving the photon-photoelectron-counting inversion problem

    Wu Jinlong; Li Tiejun; Peng, Xiang; Guo Hong

    2011-01-01

    A statistical inversion method is proposed for the photon-photoelectron-counting statistics in a quantum key distribution experiment. From the statistical viewpoint, this problem is equivalent to parameter estimation for an infinite binomial mixture model. The coarse-graining idea and Bayesian methods are applied to deal with this ill-posed problem, which is a good, simple example of the successful application of statistical methods to an inverse problem. Numerical results show the applicability of the proposed strategy. The coarse-graining idea for infinite mixture models should be general enough to be used in the future.
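
    The inversion described above can be made concrete with a truncated version of the binomial mixture: photoelectron counts are binomial thinnings of an unknown photon-number distribution, which a plain EM iteration unfolds. Everything below (efficiency, truncation point, Poisson source) is an assumption for illustration, not the paper's coarse-graining scheme.

```python
# EM unfolding of a truncated binomial mixture: recover the photon-
# number distribution from photoelectron counts. All settings are
# illustrative assumptions.
import numpy as np
from scipy.stats import binom, poisson

rng = np.random.default_rng(10)
eta, n_max = 0.6, 20                        # detection efficiency, truncation
photons = rng.poisson(3.0, 50_000)          # "true" photon numbers
counts = rng.binomial(photons, eta)         # detected photoelectrons

ks = np.arange(n_max + 1)
hist = np.bincount(counts, minlength=n_max + 1)[: n_max + 1]
A = binom.pmf(ks[:, None], ks[None, :], eta)  # A[k, n] = P(k detected | n)

w = np.full(n_max + 1, 1.0 / (n_max + 1))     # initial photon distribution
for _ in range(500):                          # EM updates
    resp = A * w[None, :]
    resp /= resp.sum(axis=1, keepdims=True) + 1e-300
    w = (hist[:, None] * resp).sum(axis=0)
    w /= w.sum()

print("recovered P(n):", np.round(w[:7], 3))
print("true Poisson  :", np.round(poisson.pmf(np.arange(7), 3.0), 3))
```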

  11. Methods and applications of statistics in engineering, quality control, and the physical sciences

    Balakrishnan, N

    2011-01-01

    Inspired by the Encyclopedia of Statistical Sciences, Second Edition (ESS2e), this volume presents a concise, well-rounded focus on the statistical concepts and applications that are essential for understanding gathered data in the fields of engineering, quality control, and the physical sciences. The book successfully upholds the goals of ESS2e by combining both previously published and newly developed contributions written by over 100 leading academics, researchers, and practitioners in a comprehensive, approachable format. The result is a succinct reference that unveils modern, cutting-edge approaches to acquiring and analyzing data across diverse subject areas within these three disciplines, including operations research, chemistry, physics, the earth sciences, electrical engineering, and quality assurance. In addition, techniques related to survey methodology, computational statistics, and operations research are discussed, where applicable. Topics of coverage include: optimal and stochastic control, arti...

  12. The Playground Game: Inquiry‐Based Learning About Research Methods and Statistics

    Westera, Wim; Slootmaker, Aad; Kurvers, Hub

    2014-01-01

    The Playground Game is a web-based game that was developed for teaching research methods and statistics to nursing and social sciences students in higher education and vocational training. The complexity and abstract nature of research methods and statistics pose many challenges for students. The

  13. APA's Learning Objectives for Research Methods and Statistics in Practice: A Multimethod Analysis

    Tomcho, Thomas J.; Rice, Diana; Foels, Rob; Folmsbee, Leah; Vladescu, Jason; Lissman, Rachel; Matulewicz, Ryan; Bopp, Kara

    2009-01-01

    Research methods and statistics courses constitute a core undergraduate psychology requirement. We analyzed course syllabi and faculty self-reported coverage of both research methods and statistics course learning objectives to assess the concordance with APA's learning objectives (American Psychological Association, 2007). We obtained a sample of…

  14. Best Practices in Teaching Statistics and Research Methods in the Behavioral Sciences [with CD-ROM

    Dunn, Dana S., Ed.; Smith, Randolph A., Ed.; Beins, Barney, Ed.

    2007-01-01

    This book provides a showcase for "best practices" in teaching statistics and research methods in two- and four-year colleges and universities. A helpful resource for teaching introductory, intermediate, and advanced statistics and/or methods, the book features coverage of: (1) ways to integrate these courses; (2) how to promote ethical conduct;…

  15. Relationship between Students' Scores on Research Methods and Statistics, and Undergraduate Project Scores

    Ossai, Peter Agbadobi Uloku

    2016-01-01

    This study examined the relationship between students' scores on Research Methods and Statistics, and the undergraduate project in the final year. The purpose was to find out whether students matched knowledge of research with project-writing skill. The study adopted an ex post facto correlational design. Scores on Research Methods and Statistics for…

  16. An ME-PC Enhanced HDMR Method for Efficient Statistical Analysis of Multiconductor Transmission Line Networks

    Yucel, Abdulkadir C.; Bagci, Hakan; Michielssen, Eric

    2015-01-01

    An efficient method for statistically characterizing multiconductor transmission line (MTL) networks subject to a large number of manufacturing uncertainties is presented. The proposed method achieves its efficiency by leveraging a high

  17. Investigation of modern methods of probabilistic sensitivity analysis of final repository performance assessment models (MOSEL)

    Spiessl, Sabine; Becker, Dirk-Alexander

    2017-06-01

    Sensitivity analysis is a mathematical means for analysing the sensitivities of a computational model to variations of its input parameters. Thus, it is a tool for managing parameter uncertainties. It is often performed probabilistically as global sensitivity analysis, running the model a large number of times with different parameter value combinations. Going along with the increase of computer capabilities, global sensitivity analysis has been a field of mathematical research for some decades. In the field of final repository modelling, probabilistic analysis is regarded a key element of a modern safety case. An appropriate uncertainty and sensitivity analysis can help identify parameters that need further dedicated research to reduce the overall uncertainty, generally leads to better system understanding and can thus contribute to building confidence in the models. The purpose of the project described here was to systematically investigate different numerical and graphical techniques of sensitivity analysis with typical repository models, which produce a distinctly right-skewed and tailed output distribution and can exhibit a highly nonlinear, non-monotonic or even non-continuous behaviour. For the investigations presented here, three test models were defined that describe generic, but typical repository systems. A number of numerical and graphical sensitivity analysis methods were selected for investigation and, in part, modified or adapted. Different sampling methods were applied to produce various parameter samples of different sizes and many individual runs with the test models were performed. The results were evaluated with the different methods of sensitivity analysis. On this basis the methods were compared and assessed. This report gives an overview of the background and the applied methods. The results obtained for three typical test models are presented and explained; conclusions in view of practical applications are drawn. At the end, a recommendation

  18. Investigation of modern methods of probabilistic sensitivity analysis of final repository performance assessment models (MOSEL)

    Spiessl, Sabine; Becker, Dirk-Alexander

    2017-06-15

    Sensitivity analysis is a mathematical means for analysing the sensitivities of a computational model to variations of its input parameters. Thus, it is a tool for managing parameter uncertainties. It is often performed probabilistically as global sensitivity analysis, running the model a large number of times with different parameter value combinations. Going along with the increase of computer capabilities, global sensitivity analysis has been a field of mathematical research for some decades. In the field of final repository modelling, probabilistic analysis is regarded a key element of a modern safety case. An appropriate uncertainty and sensitivity analysis can help identify parameters that need further dedicated research to reduce the overall uncertainty, generally leads to better system understanding and can thus contribute to building confidence in the models. The purpose of the project described here was to systematically investigate different numerical and graphical techniques of sensitivity analysis with typical repository models, which produce a distinctly right-skewed and tailed output distribution and can exhibit a highly nonlinear, non-monotonic or even non-continuous behaviour. For the investigations presented here, three test models were defined that describe generic, but typical repository systems. A number of numerical and graphical sensitivity analysis methods were selected for investigation and, in part, modified or adapted. Different sampling methods were applied to produce various parameter samples of different sizes and many individual runs with the test models were performed. The results were evaluated with the different methods of sensitivity analysis. On this basis the methods were compared and assessed. This report gives an overview of the background and the applied methods. The results obtained for three typical test models are presented and explained; conclusions in view of practical applications are drawn. At the end, a recommendation
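
    Records 17 and 18 describe the same report. A minimal example of the kind of sampling-based global sensitivity analysis it investigates is sketched below, ranking parameters of a toy, right-skewed model by Spearman rank correlation; the "repository" model is our own placeholder, not one of the report's test models.

```python
# Sampling-based global sensitivity analysis of a toy, right-skewed
# model, ranking input parameters by Spearman rank correlation.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(11)
n = 5000
perm = rng.lognormal(-2.0, 1.0, n)      # permeability-like parameter
sol = rng.uniform(0.1, 1.0, n)          # solubility-like parameter
delay = rng.uniform(1.0, 10.0, n)       # transport-delay-like parameter

# right-skewed, nonlinear toy output (peak-dose surrogate)
y = perm ** 1.5 * sol / delay + 0.01 * rng.random(n)

for name, x in [("permeability", perm), ("solubility", sol), ("delay", delay)]:
    rho = spearmanr(x, y).correlation
    print(f"{name:12s} Spearman rho = {rho:+.2f}")
```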

  19. Teaching issues of contemporary history using historical sources and modern teaching methods

    Gruber Gabriela

    2017-01-01

    Full Text Available The study of history is becoming less and less interesting to students, despite the fact that the history teaching process has been continuously modernized in recent years. This can be observed even without elaborate research in the field. Some empirical data show that students in secondary and high schools are less interested in studying history than in studying geography or other social sciences, and the number of students determined to study history at university has dropped significantly in recent years [1]. Of course, the causes and factors behind this change are numerous and varied. In this paper we address only some changes in the teaching of history in high schools, as they are designed in the history curricula and textbooks. Therefore, in the first part of this paper we analyse the history curricula for the 11th and 12th grades of high school with regard to their aims (competencies), some relevant contents, and the recommended pedagogical approaches concerning teaching methods and auxiliary material. In the second part of the paper we propose some teaching activities through which students can practise the specific competencies from their history curriculum. We aim to present attractive teaching materials and learning methods, applying the methodological recommendations from the high school curricula for history, 11th and 12th grades.

  20. Modernized accurate methods for processing of in-core measurement signals in WWER reactors

    Polak, T.

    1996-01-01

    Utilization of the new accurate WIMS-KAERI library (WIMKAL-88) to generate the following characteristics for rhodium SPNDs: sensitivity depletion law at high (approx. 75%) emitter burnup; influence of burnup history on the course of the depletion law; influence of neutron spectrum changes on Rh-SPND sensitivity caused by changes in fuel enrichment, fuel burnup, moderator temperature, boric acid concentration, central pin power rate and Xe-135 concentration; generation and experimental testing of conversion factors from the Rh-SPND signal to linear pin power rate and to neutron flux. Optimization (reduction) of the Rh-SPND instrumentation with respect to safety and operational aspects as needed for 3D power surveillance in WWER-1000 reactors. Analysis of SPND reduction from 64x7 to 46x7 by the method of Shannon information entropy optimization, and the influence of this reduction on the accuracy of 3D power distribution reconstruction. Physical methods of 3D power distribution unfolding in the new modernized on-line I and C system at NPP J. Bohunice, with in-core measurements from 210 thermocouples and 36x7 Rh-SPNDs. Program system TOPRE under a QNX operating system network in FORTRAN 77; neutronic background calculations by the macrocode MOBY-DICK. (author). 10 refs, 6 figs, 7 tabs

  1. Analysis of individual environmental particles using modern methods of electron microscopy and X-ray microanalysis

    Laskin, A.; Cowin, J.P.; Iedema, M.J.

    2006-01-01

    Understanding the composition of particles in the atmosphere is critical because of their health effects and their direct and indirect effects on radiative forcing, and hence on climate. In this manuscript, we demonstrate the utility of single-particle off-line analysis to investigate the chemistry of individual atmospheric particles using modern, state-of-the-art electron microscopy and time-of-flight secondary ion mass spectrometry techniques. We show that these methods provide specific, detailed data on particle composition, chemistry, morphology, phase and internal structure. This information is crucial for evaluating the hygroscopic properties of aerosols, understanding aerosol aging and reactivity, and correlating the characteristics of aerosols with their optical properties. The manuscript presents a number of analytical advances in methods of electron probe particle analysis, along with a brief review of a number of research projects carried out in the authors' laboratory on the chemical characterization of environmental particles. The data obtained offer a rich set of qualitative and quantitative information on particle chemistry, composition and the mechanisms of gas-particle interactions, which are of high importance to atmospheric processes involving particulate matter and air pollution.

  2. Modern Evaluation of Liquisolid Systems with Varying Amounts of Liquid Phase Prepared Using Two Different Methods

    Barbora Vraníková

    2015-01-01

    Full Text Available Liquisolid systems are an innovative dosage form used for enhancing the dissolution rate and improving the in vivo bioavailability of poorly soluble drugs. These formulations require specific evaluation methods for their quality assurance (e.g., evaluation of the angle of slide, contact angle, or water absorption ratio). The presented study is focused on the preparation, modern in vitro testing, and evaluation of differences of liquisolid systems containing varying amounts of a drug in liquid state (polyethylene glycol 400 solution of rosuvastatin) in relation to an aluminometasilicate carrier (Neusilin US2). Liquisolid powders used for the formulation of final tablets were prepared using two different methods: simple blending and spraying of the drug solution onto a carrier in fluid bed equipment. The obtained results imply that the amount of liquid phase relative to the carrier material had an effect on the hardness, friability, and disintegration of the tablets, as well as their height. The spraying technique enhanced the flow properties of the prepared mixtures, increased hardness values, decreased friability, and improved the homogeneity of the final dosage form.

  3. Analysis of Statistical Methods and Errors in the Articles Published in the Korean Journal of Pain

    Yim, Kyoung Hoon; Han, Kyoung Ah; Park, Soo Young

    2010-01-01

    Background Statistical analysis is essential for obtaining objective reliability in medical research. However, medical researchers often do not have enough statistical knowledge to properly analyze their study data. To help understand and potentially alleviate this problem, we have analyzed the statistical methods and errors of articles published in the Korean Journal of Pain (KJP), with the intention of improving the statistical quality of the journal. Methods All articles, except case reports and editorials, published from 2004 to 2008 in the KJP were reviewed. The types of statistical methods applied and the errors in the articles were evaluated. Results One hundred and thirty-nine original articles were reviewed. Inferential statistics and descriptive statistics were used in 119 and 20 papers, respectively. Only 20.9% of the papers were free from statistical errors. The most commonly adopted statistical method was the t-test (21.0%), followed by the chi-square test (15.9%). Errors of omission were encountered 101 times in 70 papers; among these, "no statistics used even though statistical methods were required" was the most common (40.6%). Errors of commission were encountered 165 times in 86 papers, among which "parametric inference for nonparametric data" was the most common (33.9%). Conclusions We found various types of statistical errors in the articles published in the KJP. This suggests that meticulous attention should be given not only to applying statistical procedures but also to the reviewing process, in order to improve the value of the articles. PMID:20552071
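    To make the most common error of commission reported above concrete, the following minimal sketch (simulated data, not data from the reviewed articles) checks the distributional assumption before choosing between a parametric test and its rank-based alternative:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        group1 = rng.exponential(scale=2.0, size=25)   # skewed, clearly non-normal
        group2 = rng.exponential(scale=3.0, size=25)

        # Test normality first; only then pick the inference procedure.
        normal = (stats.shapiro(group1).pvalue > 0.05 and
                  stats.shapiro(group2).pvalue > 0.05)
        if normal:
            result = stats.ttest_ind(group1, group2)       # parametric inference
        else:
            result = stats.mannwhitneyu(group1, group2)    # nonparametric alternative
        print(result)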

  4. Comments on Brodsky's statistical methods for evaluating epidemiological results, and reply by Brodsky, A

    Frome, E.L.; Khare, M.

    1980-01-01

    Brodsky's paper 'A Statistical Method for Testing Epidemiological Results, as applied to the Hanford Worker Population', (Health Phys., 36, 611-628, 1979) proposed two test statistics for use in comparing the survival experience of a group of employees and controls. This letter states that both of the test statistics were computed using incorrect formulas and concludes that the results obtained using these statistics may also be incorrect. In his reply Brodsky concurs with the comments on the proper formulation of estimates of pooled standard errors in constructing test statistics but believes that the erroneous formulation does not invalidate the major points, results and discussions of his paper. (author)

  5. Statistical methods for assays with limits of detection: Serum bile acid as a differentiator between patients with normal colons, adenomas, and colorectal cancer

    Bonnie LaFleur

    2011-01-01

    Full Text Available In analytic chemistry a detection limit (DL) is the lowest measurable amount of an analyte that can be distinguished from a blank; many biomedical measurement technologies exhibit this property. From a statistical perspective, these data present inferential challenges because instead of precise measures one only has the information that the value is somewhere between 0 and the DL (below detection limit, BDL). Substitution of BDL values with 0 or the DL can lead to biased parameter estimates and a loss of statistical power. Statistical methods that make adjustments when dealing with these types of data, often called left-censored data, are available in many commercial statistical packages. Despite this availability, the use of these methods is still not widespread in the biomedical literature. We have reviewed the statistical approaches for dealing with BDL values, and used simulations to examine the performance of the commonly used substitution methods and of the most widely available statistical methods. We have illustrated these methods using a study undertaken at the Vanderbilt-Ingram Cancer Center to examine serum bile acid levels in patients with colorectal cancer and adenoma. We have found that the modern methods for BDL values identify disease-related differences that are often missed by statistically naive approaches.
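    The gap between substitution and a model-based approach can be illustrated directly. The sketch below (simulated data; the lognormal model and the optimizer are choices of the sketch, not details taken from the study) fits left-censored data by maximum likelihood and compares the result with DL/2 substitution:

        import numpy as np
        from scipy import stats, optimize

        rng = np.random.default_rng(42)
        true_mu, true_sigma, DL = 1.0, 0.8, 1.5           # hypothetical detection limit
        x = rng.lognormal(true_mu, true_sigma, size=200)
        observed = x[x >= DL]                             # quantified values
        n_bdl = np.sum(x < DL)                            # values below detection limit

        def neg_loglik(params):
            mu, sigma = params
            if sigma <= 0:
                return np.inf
            # Observed values contribute the lognormal density; BDL values contribute
            # the probability mass below the detection limit (left censoring).
            ll = stats.norm.logpdf(np.log(observed), mu, sigma).sum() - np.log(observed).sum()
            ll += n_bdl * stats.norm.logcdf(np.log(DL), mu, sigma)
            return -ll

        mle_mu, mle_sigma = optimize.minimize(neg_loglik, x0=[0.0, 1.0],
                                              method="Nelder-Mead").x

        # Naive substitution of DL/2 for BDL values: common, but biased.
        substituted = np.where(x < DL, DL / 2, x)
        print("censored MLE:", mle_mu, mle_sigma)
        print("substitution:", np.log(substituted).mean(), np.log(substituted).std(ddof=1))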

  6. Comparison and analysis of two modern methods in the structural health monitoring techniques in aerospace

    Riahi, Mohammad; Ahmadi, Alireza

    2016-04-01

    The role of air transport in the development and expansion of world trade, leading to economic growth in different countries, is undeniable; sustaining world trade without the expansion of aerospace is next to impossible. Given the enormous expense of designing, manufacturing and maintaining aerospace structures, correct and timely diagnosis of defects in those structures, so as to provide maximum safety, is of the highest importance. At the same time, manufacturers of commercial and even military aircraft are pursuing less expensive, lighter, more fuel-efficient and nonetheless safer products. As a result, two trends have prevailed in the aerospace industries: (1) utilization of composites for the fuselage as well as other airplane parts, and (2) use of modern manufacturing methods. These two trends have created the need to upgrade present systems and to devise new methods for diagnosing and detecting defects in aerospace structures. Although nondestructive testing (NDT) methods have been applied in aerospace for decades, limitations in the certainty of defect detection, particularly for composite materials and complex geometries, have cast doubt on maintaining complete confidence in NDT. Two principal approaches are currently being pursued to tackle these problems: in the short term, creative combinations of techniques to increase the reliability of NDT; in the long term, the development of new methods based on structural health monitoring (SHM). This has led to a new philosophy in the maintenance area and, in some instances, has also affected the field of design.

  7. Development of modern methods with respect to neutron transport and uncertainty analyses for reactor core calculations. Interim report; Weiterentwicklung moderner Verfahren zu Neutronentransport und Unsicherheitsanalysen fuer Kernberechnungen. Zwischenbericht

    Zwermann, Winfried; Aures, Alexander; Bostelmann, Friederike; Pasichnyk, Ihor; Perin, Yann; Velkov, Kiril; Zilly, Matias

    2016-12-15

    This report documents the status of the research and development goals reached within the reactor safety research project RS1536 ''Development of modern methods with respect to neutron transport and uncertainty analyses for reactor core calculations'' as of the 3rd quarter of 2016. The superordinate goal of the project is the development, validation, and application of neutron transport methods and uncertainty analyses for reactor core calculations. These calculation methods will mainly be applied to problems related to the core behaviour of light water reactors and innovative reactor concepts, in particular fast reactors cooled by liquid metal. The contributing individual goals are the further optimization and validation of deterministic calculation methods with high spatial and energy resolution; the development of a coupled calculation system using the Monte Carlo method for the neutron transport to describe time-dependent reactor core states; the processing and validation of nuclear data, particularly with regard to covariance data; the development, validation, and application of sampling-based methods for uncertainty and sensitivity analyses; the creation of a platform for performing systematic uncertainty analyses for fast reactor systems; and the description of states of severe core damage with the Monte Carlo method. Moreover, work regarding the European NURESAFE project, started in the preceding project RS1503, is being continued and completed.

  8. Improvement of Information and Methodical Provision of Macro-economic Statistical Analysis

    Tiurina Dina M.

    2014-02-01

    Full Text Available The article generalises and analyses the main shortcomings of the modern system of macro-statistical analysis based on the system of national accounts and the balance of the national economy. On the basis of a historical analysis of the formation of the system-of-national-accounts indicators, the article shows that the problems with its practical use have both regional and global causes. Since quality of life cannot be captured within these accounts, the article proposes a system of quality indicators based on the general perception of wellbeing as the population's confidence in its own solvency, using a representative sample of economic subjects.

  9. Knowledge and use of modern family planning methods by rural women in Zambia

    C. Mubita-Ngoma

    2010-09-01

    Full Text Available The main aim of the study was to determine knowledge and use of modern contraceptive methods among rural women of reproductive age in Zambia. The study is a descriptive cross-sectional study of 105 randomly selected rural women. Data were collected using a semi-structured interview schedule and analyzed using the EPI Info version 6 statistical package. The findings revealed that 63% of the respondents were within the age group 21-35 years, 65% were married and 64% were peasant farmers. 90% of the respondents had heard about modern contraceptives, and their main source of information was the health worker (62%). 76% of the respondents stated that modern contraceptive methods could be obtained from public health facilities. 56% of the respondents were currently using modern contraceptive methods and 46% were not. Reasons for non-use of contraceptive methods were religious beliefs (50%), partner disapproval (30%) and side effects (20%). The results showed a relationship between educational level and use of contraceptives (Chi-square 7.83, df = 3, P < 0.05) and between spouse approval or support of contraceptive methods and use of contraceptives (Chi-square 5.9, df = 2, P < 0.05). Therefore, efforts to promote modern contraceptive use among rural women should be intensified to overcome barriers to contraceptive use and should involve men.

  10. Forms And Methods Of Modern Russian Youth Involvement Into The Electoral Process

    Aleksey D. Maslov

    2015-03-01

    Full Text Available In this article the authors analyze forms and methods of involving modern Russian youth in the electoral process. Involving young people in the electoral process is directly related to the problem of raising the level of political culture in society. The article presents the main forms of work used to attract young people to participate in elections in Russia, according to the Central Election Commission (CEC) of Russia, some of the regional election commissions, and the Russian Public Opinion Research Center (WCIOM). The authors note that at present there are more than one hundred and sixty legislative acts of the Russian Federation which reflect certain aspects of state youth policy. All these measures stimulate the political activity of young people but, in the authors' opinion, they are not enough. The authors conclude that a fundamental change in the attitude of young people to politics and to the institution of elections is possible only when young people feel themselves a real part, and a subject, of the transformation processes in the country, and this in turn is possible only when the state genuinely, and not merely formally, prioritizes youth policy. Young people should have everyday state support for education, starting a business, applying acquired skills for decent pay, starting a family, buying a home, etc.

  11. Can We Use Polya’s Method to Improve Students’ Performance in the Statistics Classes?

    Indika Wickramasinghe

    2015-01-01

    Full Text Available In this study, Polya’s problem-solving method is introduced in a statistics class in an effort to enhance students’ performance. The method was taught in one of two introductory-level statistics classes taught by the same instructor, and a comparison was made between the performance of the two classes. The results indicate a significant improvement in students’ performance in the class in which Polya’s method was introduced.

  12. Optimal correction and design parameter search by modern methods of rigorous global optimization

    Makino, K.; Berz, M.

    2011-01-01

    Frequently, the design of schemes for the correction of aberrations, or the determination of possible operating ranges for beamlines and cells in synchrotrons, exhibits a multitude of possibilities, usually appearing in disconnected regions of parameter space which cannot be directly qualified by analytical means. In such cases, an abundance of optimization runs is often carried out, each of which determines a local minimum depending on the specific chosen initial conditions. Practical solutions are then obtained through an often extended interplay of experienced manual adjustment of certain suitable parameters and local searches by varying other parameters. However, in a formal sense this problem can be viewed as a global optimization problem, i.e. the determination of all solutions within a certain range of parameters that lead to a specific optimum. For example, it may be of interest to find all possible settings of multiple quadrupoles that can achieve imaging; or to find ahead of time all possible settings that achieve a particular tune; or to find all possible ways to adjust nonlinear parameters to achieve correction of high-order aberrations. These tasks can easily be phrased in terms of such an optimization problem; but while mathematically this formulation is often straightforward, it has been common belief that it is of limited practical value since the resulting optimization problem cannot usually be solved. However, recent significant advances in modern methods of rigorous global optimization make these methods feasible for optics design for the first time. The key ideas of the method lie in an interplay of rigorous local underestimators of the objective functions, and in using the underestimators to rigorously and iteratively eliminate regions that lie above already-known upper bounds of the minima, in what is commonly known as a branch-and-bound approach. Recent enhancements of the Differential Algebraic methods used in particle
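    The branch-and-bound idea can be illustrated in one dimension. The sketch below uses a Lipschitz-constant lower bound as the underestimator on a toy objective; the work described above relies on rigorous Taylor-model/Differential Algebraic bounds, so this is a structural analogy rather than the authors' method:

        import heapq

        def f(x):
            return x**4 - 3 * x**2 + x       # toy objective with two local minima

        L = 60.0                             # valid bound on |f'| over [-2, 2]

        def lower_bound(a, b):
            # On [a, b], f(x) >= f(midpoint) - L*(b - a)/2.
            return f(0.5 * (a + b)) - L * (b - a) / 2

        def branch_and_bound(a, b, tol=1e-6):
            best_x, best_val = a, f(a)
            heap = [(lower_bound(a, b), a, b)]
            while heap:
                lb, a, b = heapq.heappop(heap)
                if lb > best_val - tol:      # region cannot beat the incumbent:
                    continue                 # rigorously eliminated
                m = 0.5 * (a + b)
                if f(m) < best_val:
                    best_x, best_val = m, f(m)
                for lo, hi in ((a, m), (m, b)):
                    lb_child = lower_bound(lo, hi)
                    if lb_child < best_val - tol:
                        heapq.heappush(heap, (lb_child, lo, hi))
            return best_x, best_val

        print(branch_and_bound(-2.0, 2.0))   # converges to the global minimum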

  13. Using the Bootstrap Method for a Statistical Significance Test of Differences between Summary Histograms

    Xu, Kuan-Man

    2006-01-01

    A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
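    A minimal sketch of the proposed test, using the Euclidean distance and simulated individual histograms in place of the cloud-object data (the resampling details below are one plausible reading of the procedure, not a verbatim implementation):

        import numpy as np

        rng = np.random.default_rng(0)

        def summary_hist(individual):
            s = individual.sum(axis=0).astype(float)   # sum over the ensemble
            return s / s.sum()                         # normalize to unit mass

        def euclidean(p, q):
            return np.sqrt(np.sum((p - q) ** 2))

        # Hypothetical data: each row is one individual histogram (one "cloud object").
        group_a = rng.poisson(lam=[5, 8, 6, 3, 1], size=(40, 5))
        group_b = rng.poisson(lam=[4, 9, 6, 3, 1], size=(60, 5))

        observed = euclidean(summary_hist(group_a), summary_hist(group_b))

        # Bootstrap null distribution: resample individual histograms from the
        # pooled ensemble and recompute the distance between pseudo-groups.
        pooled, n_a = np.vstack([group_a, group_b]), len(group_a)
        null_stats = []
        for _ in range(2000):
            resampled = pooled[rng.integers(0, len(pooled), size=len(pooled))]
            null_stats.append(euclidean(summary_hist(resampled[:n_a]),
                                        summary_hist(resampled[n_a:])))
        p_value = np.mean(np.array(null_stats) >= observed)
        print(f"distance = {observed:.4f}, bootstrap p-value = {p_value:.3f}")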

  14. A test of Hartnett's revisions to the pubic symphysis and fourth rib methods on a modern sample.

    Merritt, Catherine E

    2014-05-01

    Estimating age at death is one of the most important aspects of creating a biological profile. Most adult age estimation methods were developed on North American skeletal collections from the early to mid-20th century, and their applicability to modern populations has been questioned. In 2010, Hartnett used a modern skeletal collection from the Maricopa County Forensic Science Center to revise the Suchey-Brooks pubic symphysis method and the İşcan et al. fourth rib method. The current study tests Hartnett's revised methods as well as the original Suchey-Brooks and İşcan et al. methods on a modern sample from the William Bass Skeletal Collection (N = 313, mean age = 58.5, range 19-92). Results show that the Suchey-Brooks and İşcan et al. methods assign individuals to the correct phase 70.8% and 57.5% of the time, compared with Hartnett's revised methods at 58.1% and 29.7%, respectively, with correctness scores based on one standard deviation of the mean rather than the entire age range. Accuracy and bias scores are significantly improved for Hartnett's revised pubic symphysis method and marginally better for Hartnett's revised fourth rib method, suggesting that the revised mean ages at death of Hartnett's phases better reflect this modern population. Overall, both of Hartnett's revised methods are reliable age estimation methods. For the pubic symphysis, there are significant improvements in accuracy and bias scores, especially for older individuals; for the fourth rib, the results are comparable to the original İşcan et al. methods, with some improvement for older individuals. © 2014 American Academy of Forensic Sciences.

  15. The Relationship of Instructional Methods with Student Responses to the Survey of Attitudes Toward Statistics.

    Faghihi, Foroozandeh; Rakow, Ernest A.

    This study, conducted at the University of Memphis (Tennessee), compared the effects of a self-paced method of instruction on the attitudes and perceptions of students enrolled in an undergraduate statistics course with those of a comparable group of students taking statistics in a traditional lecture setting. The non-traditional course used a…

  16. Multivariate Statistical Methods as a Tool of Financial Analysis of Farm Business

    Novák, J.; Sůvová, H.; Vondráček, Jiří

    2002-01-01

    Roč. 48, č. 1 (2002), s. 9-12 ISSN 0139-570X Institutional research plan: AV0Z1030915 Keywords : financial analysis * financial ratios * multivariate statistical methods * correlation analysis * discriminant analysis * cluster analysis Subject RIV: BB - Applied Statistics, Operational Research

  17. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
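    A minimal sketch of fitting a POD curve, assuming a logistic model on log concentration and hypothetical replicate data (the GLM routine is used for illustration; the POD model itself does not prescribe the software):

        import numpy as np
        import statsmodels.api as sm

        # Hypothetical validation data: replicates and positive results per level.
        conc     = np.array([0.1, 0.5, 1.0, 2.0, 5.0])   # analyte concentration
        n_rep    = np.array([12, 12, 12, 12, 12])        # replicates per level
        detected = np.array([1, 4, 8, 11, 12])           # positive results

        # Binomial GLM: probability of detection as a function of log concentration.
        X = sm.add_constant(np.log10(conc))
        fit = sm.GLM(np.column_stack([detected, n_rep - detected]), X,
                     family=sm.families.Binomial()).fit()

        # POD response curve on a concentration grid, for graphical representation.
        grid = np.logspace(-1.2, 1.0, 50)
        pod = fit.predict(sm.add_constant(np.log10(grid)))
        print(fit.params)    # intercept and slope on log10 concentration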

  18. Statistical mechanics in JINR

    Tonchev, N.; Shumovskij, A.S.

    1986-01-01

    The history of investigations conducted at JINR in the field of statistical mechanics, beginning with N.N. Bogolyubov's fundamental works on the microscopic theory of superconductivity, is presented. The ideas introduced in these works and the methods developed in them have largely determined the development of statistical mechanics at JINR, and the Hartree-Fock-Bogolyubov variational principle has become an important method of modern nuclear theory. A brief review of the main achievements connected with the development of statistical mechanics methods and their application in different fields of physical science is given.

  19. An Efficient Graph-based Method for Long-term Land-use Change Statistics

    Yipeng Zhang

    2015-12-01

    Full Text Available Statistical analysis of land-use change plays an important role in sustainable land management and has received increasing attention from scholars and administrative departments. However, the statistical process involving spatial overlay analysis remains difficult and needs improvement to deal with massive land-use data. In this paper, we introduce a spatio-temporal flow network model to reveal the hidden relational information among spatio-temporal entities. Based on graph theory, the constant condition of saturated multi-commodity flow is derived. A new method based on a network-partition technique for the spatio-temporal flow network is proposed to optimize the transition statistical process. The effectiveness and efficiency of the proposed method are verified through experiments using land-use data in Hunan from 2009 to 2014. In a comparison among three different land-use change statistical methods, the proposed method exhibits remarkable superiority in efficiency.
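    The transition statistic that such methods accelerate can be stated compactly: the sketch below cross-tabulates two synthetic land-use rasters into a transition matrix (plain array counting, shown for orientation; it is not the flow-network partition technique itself):

        import numpy as np

        rng = np.random.default_rng(6)
        classes = 4                                    # e.g., water, urban, crop, forest
        lu_2009 = rng.integers(0, classes, size=(500, 500))
        lu_2014 = lu_2009.copy()
        changed = rng.random(lu_2009.shape) < 0.1      # 10% of cells change class
        lu_2014[changed] = rng.integers(0, classes, size=changed.sum())

        # transition[i, j] = cells moving from class i (2009) to class j (2014)
        transition = np.zeros((classes, classes), dtype=np.int64)
        np.add.at(transition, (lu_2009.ravel(), lu_2014.ravel()), 1)
        print(transition)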

  20. Statistical methods of discrimination and classification advances in theory and applications

    Choi, Sung C

    1986-01-01

    Statistical Methods of Discrimination and Classification: Advances in Theory and Applications is a collection of papers that tackles the multivariate problems of discriminating and classifying subjects into exclusive populations. The book presents 13 papers that cover advances in the statistical procedures of discriminating and classifying. The studies in the text primarily focus on various methods of discriminating and classifying variables, such as multiple discriminant analysis in the presence of mixed continuous and categorical data; choice of the smoothing parameter and efficiency o

  1. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures

    Rock, Adam J.; Coventry, William L.; Morgan, Methuen I.; Loi, Natasha M.

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal, Ginsburg, & Schau, 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof, Ceroni, Jeong, & Moghaddam, 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to...

  2. An improved method for statistical analysis of raw accelerator mass spectrometry data

    Gutjahr, A.; Phillips, F.; Kubik, P.W.; Elmore, D.

    1987-01-01

    Hierarchical statistical analysis is an appropriate method for statistical treatment of raw accelerator mass spectrometry (AMS) data. Using Monte Carlo simulations we show that this method yields more accurate estimates of isotope ratios and analytical uncertainty than the generally used propagation of errors approach. The hierarchical analysis is also useful in design of experiments because it can be used to identify sources of variability. 8 refs., 2 figs
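    The contrast between hierarchical analysis and naive pooling can be sketched with a balanced one-way random-effects layout (simulated measurement blocks; the estimator below is a textbook stand-in, not the authors' exact formulation):

        import numpy as np

        rng = np.random.default_rng(11)
        n_blocks, n_per = 8, 5
        block_effect = rng.normal(0, 0.03, n_blocks)   # between-block variability
        ratios = 1.0 + block_effect[:, None] + rng.normal(0, 0.05, (n_blocks, n_per))

        block_means = ratios.mean(axis=1)
        grand_mean = block_means.mean()

        # Hierarchical standard error: the block-to-block scatter already contains
        # both variance components, so the SE follows from the block means.
        se_hier = block_means.std(ddof=1) / np.sqrt(n_blocks)
        # Naive propagation treats all values as independent and is too optimistic.
        se_naive = ratios.std(ddof=1) / np.sqrt(ratios.size)
        print(grand_mean, se_hier, se_naive)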

  3. Statistical methods to monitor the West Valley off-gas system

    Eggett, D.L.

    1990-01-01

    This paper reports on the off-gas system for the ceramic melter operated at the West Valley Demonstration Project in West Valley, NY, which is monitored during melter operation. A one-at-a-time method of monitoring the parameters of the off-gas system is not statistically sound; therefore, multivariate statistical methods appropriate for monitoring many correlated parameters will be used. Monitoring a large number of parameters increases the probability of a false out-of-control signal. If the parameters being monitored are statistically independent, the control limits can easily be adjusted to obtain the desired probability of a false out-of-control signal. The principal component (PC) scores have desirable statistical properties when the original variables follow a multivariate normal distribution. Two statistics derived from the PC scores and used to form multivariate control charts are outlined and their distributional properties reviewed.
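    A minimal sketch of a control chart built from PC scores, assuming multivariate-normal baseline data and a chi-square control limit (the off-gas readings below are simulated stand-ins for the correlated process parameters):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        cov = [[1.0, 0.6, 0.3, 0.0], [0.6, 1.0, 0.4, 0.1],
               [0.3, 0.4, 1.0, 0.2], [0.0, 0.1, 0.2, 1.0]]
        baseline = rng.multivariate_normal(np.zeros(4), cov, size=500)

        # Principal components of the in-control (baseline) data.
        center = baseline.mean(axis=0)
        eigval, eigvec = np.linalg.eigh(np.cov(baseline, rowvar=False))

        def t2(obs, k=2):
            # Hotelling T^2 from the k leading PC scores of a new observation.
            scores = (obs - center) @ eigvec[:, -k:]
            return np.sum(scores**2 / eigval[-k:])

        k = 2
        limit = stats.chi2.ppf(0.99, df=k)        # approximate control limit
        new_obs = np.array([2.5, 2.7, 1.0, 0.2])  # hypothetical new reading
        print(t2(new_obs, k), limit, t2(new_obs, k) > limit)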

  4. Physics-based statistical model and simulation method of RF propagation in urban environments

    Pao, Hsueh-Yuan; Dvorak, Steven L.

    2010-09-14

    A physics-based statistical model and simulation/modeling method and system of electromagnetic wave propagation (wireless communication) in urban environments. In particular, the model is a computationally efficient close-formed parametric model of RF propagation in an urban environment which is extracted from a physics-based statistical wireless channel simulation method and system. The simulation divides the complex urban environment into a network of interconnected urban canyon waveguides which can be analyzed individually; calculates spectral coefficients of modal fields in the waveguides excited by the propagation using a database of statistical impedance boundary conditions which incorporates the complexity of building walls in the propagation model; determines statistical parameters of the calculated modal fields; and determines a parametric propagation model based on the statistical parameters of the calculated modal fields from which predictions of communications capability may be made.

  5. Statistical Methods for the detection of answer copying on achievement tests

    Sotaridona, Leonardo

    2003-01-01

    This thesis contains a collection of studies where statistical methods for the detection of answer copying on achievement tests in multiple-choice format are proposed and investigated. Although all methods are suited to detect answer copying, each method is designed to address specific

  6. Assessment of statistical methods used in library-based approaches to microbial source tracking.

    Ritter, Kerry J; Carruthers, Ethan; Carson, C Andrew; Ellender, R D; Harwood, Valerie J; Kingsley, Kyle; Nakatsu, Cindy; Sadowsky, Michael; Shear, Brian; West, Brian; Whitlock, John E; Wiggins, Bruce A; Wilbur, Jayson D

    2003-12-01

    Several commonly used statistical methods for fingerprint identification in microbial source tracking (MST) were examined to assess the effectiveness of pattern-matching algorithms to correctly identify sources. Although numerous statistical methods have been employed for source identification, no widespread consensus exists as to which is most appropriate. A large-scale comparison of several MST methods, using identical fecal sources, presented a unique opportunity to assess the utility of several popular statistical methods. These included discriminant analysis, nearest neighbour analysis, maximum similarity and average similarity, along with several measures of distance or similarity. Threshold criteria for excluding uncertain or poorly matched isolates from final analysis were also examined for their ability to reduce false positives and increase prediction success. Six independent libraries used in the study were constructed from indicator bacteria isolated from fecal materials of humans, seagulls, cows and dogs. Three of these libraries were constructed using the rep-PCR technique and three relied on antibiotic resistance analysis (ARA). Five of the libraries were constructed using Escherichia coli and one using Enterococcus spp. (ARA). Overall, the outcome of this study suggests a high degree of variability across statistical methods. Despite large differences in correct classification rates among the statistical methods, no single statistical approach emerged as superior. Thresholds failed to consistently increase rates of correct classification and improvement was often associated with substantial effective sample size reduction. Recommendations are provided to aid in selecting appropriate analyses for these types of data.
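    One of the compared approaches, nearest-neighbour matching with a threshold for excluding poorly matched isolates, can be sketched as follows (the fingerprints, similarity measure, and threshold are all hypothetical):

        import numpy as np

        def classify(fingerprint, library, labels, threshold=0.80):
            # Pearson correlation as the similarity between band patterns.
            sims = np.array([np.corrcoef(fingerprint, ref)[0, 1] for ref in library])
            best = int(np.argmax(sims))
            if sims[best] < threshold:
                return "unclassified"            # excluded from final analysis
            return labels[best]

        rng = np.random.default_rng(7)
        library = rng.random((6, 20))            # six reference isolates, 20 bands
        labels = ["human", "human", "gull", "cow", "cow", "dog"]
        query = library[3] + rng.normal(0, 0.05, 20)   # noisy copy of a cow isolate
        print(classify(query, library, labels))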

  7. A robust statistical method for association-based eQTL analysis.

    Ning Jiang

    Full Text Available It has been well established that the theoretical kernel of the recently surging genome-wide association studies (GWAS) is statistical inference of linkage disequilibrium (LD) between a tested genetic marker and a putative locus affecting a disease trait. However, LD analysis is vulnerable to several confounding factors, of which population stratification is the most prominent. While many methods have been proposed to correct for this influence, either by predicting the structure parameters or by correcting the inflation in the test statistic due to stratification, these may not be feasible or may impose further statistical problems in practical implementation. We propose here a novel statistical method to control spurious LD in GWAS arising from population structure by incorporating a control marker into the test for significance of genetic association of a polymorphic marker with phenotypic variation of a complex trait. The method avoids the need for structure prediction, which may be infeasible or inadequate in practice, and properly accounts for a varying effect of population stratification on different regions of the genome under study. The utility and statistical properties of the new method were tested through an intensive computer simulation study and an association-based genome-wide mapping of expression quantitative trait loci in genetically divergent human populations. The analyses show that the new method confers improved statistical power for detecting genuine genetic association in subpopulations and effective control of spurious associations stemming from population structure when compared with two other methods popularly implemented in the GWAS literature.
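    The core idea, absorbing structure-driven confounding with a control-marker covariate, can be sketched with simulated data (ordinary least squares is used for simplicity and is not the authors' exact test statistic):

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n = 1000
        pop = rng.integers(0, 2, n)                    # two hidden subpopulations
        # Allele frequencies differ between subpopulations for both markers.
        test_marker = rng.binomial(2, np.where(pop, 0.6, 0.2))
        control_marker = rng.binomial(2, np.where(pop, 0.7, 0.3))
        trait = 0.5 * pop + rng.normal(size=n)         # trait shifted by structure only

        naive = sm.OLS(trait, sm.add_constant(test_marker)).fit()
        adjusted = sm.OLS(trait, sm.add_constant(
            np.column_stack([test_marker, control_marker]))).fit()
        print("naive p:   ", naive.pvalues[1])         # spuriously small
        print("adjusted p:", adjusted.pvalues[1])      # pulled toward the null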

  8. Use of traditional and modern contraceptives among childbearing women: findings from a mixed methods study in two southwestern Nigerian states.

    Ajayi, Anthony Idowu; Adeniyi, Oladele Vincent; Akpan, Wilson

    2018-05-09

    Contraceptive use has numerous health benefits, such as preventing unplanned pregnancies, ensuring optimum spacing between births, reducing maternal and child mortality, and improving the lives of women and children in general. This study examines the level of contraceptive use, its determinants, and reasons for non-use of contraception among women of reproductive age (18-49 years) in two southwestern Nigerian states. The study adopted an interviewer-administered questionnaire to collect data from 809 participants selected using a 3-stage cluster random sampling technique. We also conducted 46 in-depth interviews. To investigate the association between socio-demographic variables and use of contraceptive methods, we estimated binary logistic regression models. The findings indicated that knowledge of any method of contraception was almost universal among the participants. The rates of ever use and current use of contraception were 80% and 66.6%, respectively. However, only 43.9% of the participants had ever used any modern contraceptive method, considered to be more reliable. The fear of side effects of modern contraceptive methods drove women to rely on less effective traditional methods (withdrawal and rhythm methods). Some women employed crude and unproven contraceptive methods to prevent pregnancies. Our findings show that the rate of contraceptive use was high in the study setting. However, many women chose less effective traditional contraceptive methods over more effective modern methods due to fear of side effects of the latter. Patient education on the various options of modern contraceptives, their side effects, and their management would be crucial for expanding family planning services in the study setting.

  9. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.

  10. An Investigation of the Variety and Complexity of Statistical Methods Used in Current Internal Medicine Literature.

    Narayanan, Roshni; Nugent, Rebecca; Nugent, Kenneth

    2015-10-01

    Accreditation Council for Graduate Medical Education guidelines require internal medicine residents to develop skills in the interpretation of medical literature and to understand the principles of research. A necessary component is the ability to understand the statistical methods used and their results, material that is not an in-depth focus of most medical school curricula and residency programs. Given the breadth and depth of the current medical literature and an increasing emphasis on complex, sophisticated statistical analyses, the statistical foundation and education necessary for residents are uncertain. We reviewed the statistical methods and terms used in 49 articles discussed at the journal club in the Department of Internal Medicine residency program at Texas Tech University between January 1, 2013 and June 30, 2013. We collected information on the study type and on the statistical methods used for summarizing and comparing samples, determining the relations between independent variables and dependent variables, and estimating models. We then identified the typical statistics education level at which each term or method is learned. A total of 14 articles came from the Journal of the American Medical Association Internal Medicine, 11 from the New England Journal of Medicine, 6 from the Annals of Internal Medicine, 5 from the Journal of the American Medical Association, and 13 from other journals. Twenty reported randomized controlled trials. Summary statistics included mean values (39 articles), category counts (38), and medians (28). Group comparisons were based on t tests (14 articles), χ2 tests (21), and nonparametric ranking tests (10). The relations between dependent and independent variables were analyzed with simple regression (6 articles), multivariate regression (11), and logistic regression (8). Nine studies reported odds ratios with 95% confidence intervals, and seven analyzed test performance using sensitivity and specificity calculations

  11. Models and methods for assessing the value of HVDC and MVDC technologies in modern power grids

    Makarov, Yuri V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Elizondo, Marcelo A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); O' Brien, James G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Huang, Qiuhua [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kirkham, Harold [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Huang, Zhenyu [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chinthavali, Madhu [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Suman, Debnath [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Mohan, Nihal [Mid-Continent Independent System Operator (MISO), St. Paul, MN (United States); Hess, Warren [Mid-Continent Independent System Operator (MISO), St. Paul, MN (United States); Duebner, David [Mid-Continent Independent System Operator (MISO), St. Paul, MN (United States); Orser, David [Mid-Continent Independent System Operator (MISO), St. Paul, MN (United States); Brown, Hilary [Mid-Continent Independent System Operator (MISO), St. Paul, MN (United States); Osborn, Dale [Mid-Continent Independent System Operator (MISO), St. Paul, MN (United States); Feltes, James [Siemens, Knoxville, TN (United States); Kurthakoti Chandrashekhara, Divya [Siemens, Knoxville, TN (United States); Zhu, Wenchun [Siemens, Knoxville, TN (United States)

    2017-07-31

    This report reflects the results of U.S. Department of Energy’s (DOE) Grid Modernization project 0074 “Models and methods for assessing the value of HVDC [high-voltage direct current] and MTDC [multi-terminal direct current] technologies in modern power grids.” The work was done by the Pacific Northwest National Laboratory (PNNL) and Oak Ridge National Laboratory (ORNL) in cooperation with Mid-Continent Independent System Operator (MISO) and Siemens. The main motivation of this study was to show the benefit of using direct current (DC) systems larger than those in existence today as they overlap with the alternating current (AC) systems. Proper use of their flexibility in terms of active/reactive power control and fast response can provide much-needed services to the grid while also moving large blocks of energy to take advantage of cost diversity. Ultimately, the project’s success will enable decision-makers and investors to make well-informed decisions regarding this use of DC systems. This project showed the technical feasibility of an HVDC macrogrid for frequency control and congestion relief in addition to bulk power transfers. Industry-established models for commonly used technologies were employed, along with high-fidelity models for recently developed HVDC converter technologies such as the modular multilevel converter (MMC), a voltage source converter (VSC). Models for General Electric Positive Sequence Load Flow (GE PSLF) and Siemens Power System Simulator (PSS/E), widely used analysis programs, were for the first time adapted to include simultaneously both the Western Electricity Coordinating Council (WECC) and the Eastern Interconnection (EI), the two largest North American interconnections. The high-fidelity models and their control were developed in detail for the MMC system and extended to HVDC systems in point-to-point and in three-node multi-terminal configurations. Using a continental-level mixed AC-DC grid model, and using a HVDC macrogrid

  12. Abstracts book of 4. Poznan Analytical Seminar on Modern Methods of Sample Preparation and Trace Amounts Determination of Elements

    1995-01-01

    The 4th Poznan Analytical Seminar on Modern Methods of Sample Preparation and Trace Amounts Determination of Elements was held in Poznan on 27-28 April 1995. New versions of analytical methods for the quantitative determination of trace elements in biological, environmental and geological materials were presented. A number of special sample preparation techniques that enable the best precision of analytical results were also shown and discussed.

  13. Understanding seafloor morphology using remote high frequency acoustic methods: An appraisal to modern techniques and its effectiveness

    Chakraborty, B.

    Two-thirds of the earth's surface, i.e. 362 million square km (70%), is covered by the ocean. In order to understand the seafloor, various methods are applied, such as remote acoustic techniques and seafloor photographic and geological sampling techniques...

  14. 8th International Conference on Soft Methods in Probability and Statistics

    Giordani, Paolo; Vantaggi, Barbara; Gagolewski, Marek; Gil, María; Grzegorzewski, Przemysław; Hryniewicz, Olgierd

    2017-01-01

    This proceedings volume is a collection of peer-reviewed papers presented at the 8th International Conference on Soft Methods in Probability and Statistics (SMPS 2016), held in Rome (Italy). The book is dedicated to data science, which aims at developing automated methods to analyze massive amounts of data and to extract knowledge from them. It shows how data science employs various programming techniques and methods of data wrangling, data visualization, machine learning, probability, and statistics. The soft methods proposed in this volume represent a collection of tools in these fields that can also be useful for data science.

  15. Modern Methods of Measuring and Modelling Architectural Objects in the Process of their Valorisation

    Zagroba, Marek

    2017-10-01

    As well as being a cutting-edge technology, laser scanning is still developing rapidly. Laser scanners have an almost unlimited range of use in many disciplines of contemporary engineering where precision and high quality of the tasks performed are of the utmost importance. Among these disciplines, special attention is drawn to architecture and urban space studies, that is, the fields of science which shape the space and surroundings occupied by people and thus have a direct impact on people’s lives. Taking measurements with a laser scanner is more complicated than with traditional methods, where laser target markers or a measuring tape are used: a specific procedure must be followed, the aim being to obtain three-dimensional data about a building situated in a given space. Accuracy, low time consumption, safety and non-invasiveness are the primary advantages of this technology in civil engineering practice, when handling both historic and modern architecture. Using a laser scanner is especially important when taking measurements of vast engineering constructions, where an application of traditional techniques would be much more difficult and would require higher time and labour inputs, for example because of less easily accessible nooks and crannies or the geometrical complexity of individual components of a building structure. In this article, the author addresses the problem of measuring and modelling architectural objects in the process of their valorisation, i.e. the enhancement of their functional, usable, spatial and aesthetic values. Above all, the laser scanning method, by generating results as a point cloud, enables the user to obtain a very detailed three-dimensional computer image of the measured objects and to carry out series of analyses and expert investigations, e.g. of the technical condition (deformation of construction elements) as well as the spatial management of the surrounding

  16. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures

    Rock, Adam J.; Coventry, William L.; Morgan, Methuen I.; Loi, Natasha M.

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology. PMID:27014147

  18. Genomic similarity and kernel methods I: advancements by building on mathematical and statistical foundations.

    Schaid, Daniel J

    2010-01-01

    Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1]. Copyright © 2010 S. Karger AG, Basel.
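    A minimal sketch of the basic construction: a linear kernel on genotype data yields a similarity matrix that is positive semidefinite by construction, which is the property required by the mixed-model, score-test, and support vector machinery mentioned above (the genotype matrix is simulated):

        import numpy as np

        rng = np.random.default_rng(5)
        G = rng.integers(0, 3, size=(30, 100)).astype(float)   # 30 subjects x 100 SNPs
        Gc = (G - G.mean(axis=0)) / (G.std(axis=0) + 1e-12)    # center and scale

        K = Gc @ Gc.T / Gc.shape[1]   # linear kernel: larger value = more similar

        # Positive semidefiniteness check: all eigenvalues >= 0 up to rounding.
        print(np.linalg.eigvalsh(K).min() >= -1e-10)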

  19. Hybrid statistics-simulations based method for atom-counting from ADF STEM images

    De wael, Annelies, E-mail: annelies.dewael@uantwerpen.be [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium); De Backer, Annick [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium); Jones, Lewys; Nellist, Peter D. [Department of Materials, University of Oxford, Parks Road, OX1 3PH Oxford (United Kingdom); Van Aert, Sandra, E-mail: sandra.vanaert@uantwerpen.be [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium)

    2017-06-15

    A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials. - Highlights: • A hybrid method for atom-counting from ADF STEM images is introduced. • Image simulations are incorporated into a statistical framework in a reliable manner. • Limits of the existing methods for atom-counting are far exceeded. • Reliable counting results from an experimental low dose image are obtained. • Progress towards reliable quantitative analysis of beam-sensitive materials is made.
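    The statistics-based ingredient of such methods can be sketched as a finite mixture fit to scattering cross-sections, with the number of components (candidate atom counts) chosen by an information criterion; BIC is used below as a stand-in criterion, on simulated cross-sections, and the hybrid method's simulation-based prior knowledge is not reproduced here:

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(4)
        # Hypothetical cross-sections from columns 3, 4, or 5 atoms thick.
        cs = np.concatenate([rng.normal(m, 0.05, 80)
                             for m in (1.0, 1.3, 1.6)]).reshape(-1, 1)

        fits = [GaussianMixture(k, random_state=0).fit(cs) for k in range(1, 7)]
        best = min(fits, key=lambda m: m.bic(cs))   # select the component number
        print(best.n_components, np.sort(best.means_.ravel()))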

  20. Trends in study design and the statistical methods employed in a leading general medicine journal.

    Gosho, M; Sato, Y; Nagashima, K; Takahashi, S

    2018-02-01

    Study design and statistical methods have become core components of medical research, and the methodology has become more multifaceted and complicated over time. Studying the comprehensive details and current trends of study design and statistical methods is required to support the future implementation of well-planned clinical studies providing information about evidence-based medicine. Our purpose was to illustrate the study designs and statistical methods employed in recent medical literature. This was an extension study of Sato et al. (N Engl J Med 2017; 376: 1086-1087), which reviewed 238 articles published in 2015 in the New England Journal of Medicine (NEJM) and briefly summarized the statistical methods employed in NEJM. Using the same database, we performed a new investigation of the detailed trends in study design and individual statistical methods that were not reported in the Sato study. Owing to the CONSORT statement, prespecification and justification of sample size are obligatory in planning intervention studies. Although standard survival methods (e.g., the Kaplan-Meier estimator and the Cox regression model) were most frequently applied, the Gray test and the Fine-Gray proportional hazards model for handling competing risks were sometimes used for more valid statistical inference. With respect to handling missing data, model-based methods, which are valid for missing-at-random data, were used more frequently than single imputation methods. Single imputation methods are not recommended as a primary analysis, but they have been applied in many clinical trials. Group sequential design with interim analyses was one of the standard designs, and novel designs, such as adaptive dose selection and sample size re-estimation, were sometimes employed in NEJM. In light of the information found in some publications, model-based approaches for handling missing data should replace single imputation methods for primary analysis. Use of adaptive designs with interim analyses is increasing.

  1. India's Modern Slaves: Bonded Labor in India and Methods of Intervention

    Boutros, Heidi

    2005-01-01

    Slavery flourishes in the modern world. In nations plagued by debilitating poverty, individuals unable to afford food, clothing, and shelter may be compelled to make a devastating decision: to sell themselves or their children into slavery. Nowhere in the world is this more common than in India. Conservative estimates suggest that there are 10…

  2. Modern War and the Utility of Force: Challenges, Methods and Strategy

    ismith

    to have adapted well to the new strategic environment, which now results in the ... On the one hand, empirical evidence confirms that military force has ... focussed on the question of how the modern social construct of war should be ... some light on four interrelated paradoxes that are central to the current debate on the.

  3. Perception of User Criteria in the Context of Sustainability of Modern Methods of Construction Based on Wood

    Jozef Švajlenka

    2018-01-01

    Recent developments in the construction industry have brought more efficient and sustainable technologies, technological procedures, and materials. Examples of this are modern methods of construction, which offer larger production volumes with higher quality and shorter procurement times. The goal of those methods is to improve construction sustainability through quality improvement, customer satisfaction, shortened construction time, and reduced environmental impact. The main goal of this research is to demonstrate, by means of theoretical assumptions, surveys, and analyses, the sustainability of modern methods of construction based on wood. The work focuses on identifying the user criteria for construction sustainability. Selected user criteria of construction sustainability are applied in a socio-economic survey whose purpose is to determine how users perceive the efficiency of selected construction systems. We evaluate certain user parameters in the context of sustainability by relying on the users of buildings (family houses) which have already been built, and compare the results with the declared design parameters.

  4. The Issue of Age Estimation in a Modern Skeletal Population: Are Even the More Modern Current Aging Methods Satisfactory for the Elderly?

    Cappella, Annalisa; Cummaudo, Marco; Arrigoni, Elena; Collini, Federica; Cattaneo, Cristina

    2017-01-01

    The main idea behind age assessment in adults is related to the analysis of the physiological degeneration of particular skeletal structures with age. The main issues with these procedures are that they have not been tested on different modern populations and in different taphonomic contexts, and that they tend to underestimate the age of older individuals. The purpose of this study was to test the applicability and the reliability of these methods on a contemporary population of skeletal remains of 145 elderly individuals of known sex and age. The results show that, due to taphonomic influences, some skeletal sites showed lower survival rates. The methods with the highest percentage of applicability were Lovejoy (89.6%) and Rougé-Maillart (81.3%), followed by Suchey-Brooks (59.3%); those with the lowest percentage of applicability were Beauthier (26.2%) and Iscan (22.7%). In addition, this research has shown that, for older adults, the study of both the acetabulum and the auricular surface may be more reliable for aging. This is in accordance with the fact that the auricular surface and the acetabulum are the areas most frequently surviving taphonomic insult. © 2016 American Academy of Forensic Sciences.

  5. A new method for curve fitting to data with low statistics not using the χ²-method

    Awaya, T.

    1979-01-01

    A new method which does not use the χ²-fitting method is investigated in order to fit theoretical curves to data with low statistics. The method is compared with the usual and modified χ²-fitting methods. The analyses are performed on computer-generated data. It is concluded that the new method gives good results in all cases. (Auth.)
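
    The paper's specific algorithm is not spelled out in the abstract, so the sketch below shows the now-standard alternative such methods are usually contrasted with: fitting low-count data by maximizing the Poisson likelihood rather than minimizing χ². The model and data are invented for the example.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(1)

    # Synthetic low-count spectrum: an exponential decay sampled with Poisson noise
    t = np.arange(20, dtype=float)
    model = lambda p, t: p[0] * np.exp(-t / p[1])
    counts = rng.poisson(model([5.0, 6.0], t))

    # Negative Poisson log-likelihood (constant terms dropped).
    # Unlike chi-square fitting, this stays valid when bins hold only 0-5 counts.
    def neg_loglike(p):
        mu = np.clip(model(p, t), 1e-12, None)
        return np.sum(mu - counts * np.log(mu))

    fit = minimize(neg_loglike, x0=[3.0, 4.0], method="Nelder-Mead")
    print(fit.x)  # estimated amplitude and decay constant
    ```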

  6. Comparative description of the use of modern methods of hormonal contraception for women with excessive body mass

    I. B. Gridina

    2016-04-01

    Maintenance of the reproductive health of women with excessive body mass is a pressing problem and an important direction of modern medicine. Aim. To analyze the efficiency and acceptability of oral, intravaginal and transdermal hormonal contraceptives among women with excessive body mass. Methods and results. The tolerability and ease of use of different types of hormonal contraception were studied in 72 women with excessive body mass to determine the reliability and acceptability of modern hormonal contraceptives. The effectiveness of the hormonal contraceptives in our data was 100%: no unwanted pregnancy was registered. The overall subjective evaluation of all hormonal contraceptive use was positive: 78.6% of women with excessive body mass who used oral contraceptives were satisfied with the chosen contraceptive method, 81.8% were satisfied with the intravaginal method, and 59.1% with the transdermal contraceptive. Conclusions. The intravaginal contraceptive was found to be the most suitable first-choice drug for overweight women compared with oral and transdermal hormonal methods of contraception. This suggests that women with excessive body mass can successfully use modern methods of hormonal contraception, but clinical supervision is necessary, during which further clarification on the use of hormonal contraception in women with excessive body mass is possible.

  7. Combination of statistical and physically based methods to assess shallow slide susceptibility at the basin scale

    Oliveira, Sérgio C.; Zêzere, José L.; Lajas, Sara; Melo, Raquel

    2017-07-01

    Approaches used to assess shallow slide susceptibility at the basin scale are conceptually different depending on the use of statistical or physically based methods. The former are based on the assumption that the same causes are more likely to produce the same effects, whereas the latter are based on the comparison between forces which tend to promote movement along the slope and the counteracting forces that are resistant to motion. Within this general framework, this work tests two hypotheses: (i) although conceptually and methodologically distinct, the statistical and deterministic methods generate similar shallow slide susceptibility results regarding the model's predictive capacity and spatial agreement; and (ii) the combination of shallow slide susceptibility maps obtained with statistical and physically based methods, for the same study area, generates a more reliable susceptibility model for shallow slide occurrence. These hypotheses were tested at a small test site (13.9 km²) located north of Lisbon (Portugal), using a statistical method (the information value method, IV) and a physically based method (the infinite slope method, IS). The landslide susceptibility maps produced with the statistical and deterministic methods were combined into a new landslide susceptibility map. The latter was based on a set of integration rules defined by the cross tabulation of the susceptibility classes of both maps and analysis of the corresponding contingency tables. The results demonstrate a higher predictive capacity of the new shallow slide susceptibility map, which combines the independent results obtained with statistical and physically based models. Moreover, the combination of the two models allowed the identification of areas where the results of the information value and the infinite slope methods are contradictory. Thus, these areas were classified as uncertain and deserve additional investigation at a more detailed scale.
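
    A compact sketch of the information value (IV) score used by the statistical half of the combination; the counts are hypothetical, and the per-pixel summation is described in a comment rather than implemented.

    ```python
    import numpy as np

    def information_value(landslide_in_class, cells_in_class,
                          landslide_total, cells_total):
        """Information value of one class of a predictive factor:
        IV = ln( (S_i / N_i) / (S / N) ). Positive values mean the class
        is more landslide-prone than the study-area average."""
        class_density = landslide_in_class / cells_in_class
        prior_density = landslide_total / cells_total
        return np.log(class_density / prior_density)

    # A pixel's susceptibility score is the sum of the IVs of the classes
    # it belongs to, one per factor (slope, lithology, land use, ...).
    print(information_value(120, 1500, 800, 40000))  # hypothetical counts
    ```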

  8. Statistical Diagnosis of the Best Weibull Methods for Wind Power Assessment for Agricultural Applications

    Abul Kalam Azad

    2014-05-01

    The best Weibull distribution methods for the assessment of wind energy potential at different altitudes in desired locations are statistically diagnosed in this study. Seven different methods, namely the graphical method (GM), method of moments (MOM), standard deviation method (STDM), maximum likelihood method (MLM), power density method (PDM), modified maximum likelihood method (MMLM) and equivalent energy method (EEM), were used to estimate the Weibull parameters, and six statistical tools, namely relative percentage of error, root mean square error (RMSE), mean percentage of error, mean absolute percentage of error, chi-square error and analysis of variance, were used to precisely rank the methods. The statistical fit of the measured and calculated wind speed data is assessed to judge the performance of the methods. The capacity factor and total energy generated by a small model wind turbine are calculated by numerical integration using trapezoidal sums and Simpson's rule. The results show that MOM and MLM are the most efficient methods for determining the values of k and c to fit Weibull distribution curves.
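
    As an illustration of two of the estimators ranked above, the sketch below fits the Weibull shape k and scale c by the method of moments and by maximum likelihood on synthetic wind speeds, and scores both with RMSE against a histogram; the data, seeds, and bin counts are made up for the example.

    ```python
    import numpy as np
    from scipy import stats
    from scipy.optimize import brentq
    from scipy.special import gamma

    rng = np.random.default_rng(0)
    v = stats.weibull_min.rvs(2.0, scale=7.0, size=1000, random_state=rng)  # synthetic wind speeds

    # Method of moments: the coefficient of variation depends only on k
    mean, std = v.mean(), v.std(ddof=1)
    cv = std / mean
    def cv_gap(k):
        return np.sqrt(gamma(1 + 2 / k) / gamma(1 + 1 / k) ** 2 - 1) - cv
    k_mom = brentq(cv_gap, 0.1, 20.0)
    c_mom = mean / gamma(1 + 1 / k_mom)

    # Maximum likelihood via scipy, with the location parameter fixed at 0
    k_mlm, _, c_mlm = stats.weibull_min.fit(v, floc=0)

    # RMSE between the empirical density and each fitted PDF
    hist, edges = np.histogram(v, bins=20, density=True)
    mids = 0.5 * (edges[:-1] + edges[1:])
    rmse = lambda k, c: np.sqrt(np.mean((hist - stats.weibull_min.pdf(mids, k, scale=c)) ** 2))
    print(f"MOM: k={k_mom:.3f}, c={c_mom:.3f}, RMSE={rmse(k_mom, c_mom):.4f}")
    print(f"MLM: k={k_mlm:.3f}, c={c_mlm:.3f}, RMSE={rmse(k_mlm, c_mlm):.4f}")
    ```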

  9. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    Chukbar, B. K., E-mail: bchukbar@mail.ru [National Research Center Kurchatov Institute (Russian Federation)

    2015-12-15

    Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for the cases of the microfuel concentration up to 170 cm⁻³ in a pebble bed are presented. The admissibility of homogenization of the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  10. Conditional maximum-entropy method for selecting prior distributions in Bayesian statistics

    Abe, Sumiyoshi

    2014-11-01

    The conditional maximum-entropy method (abbreviated here as C-MaxEnt) is formulated for selecting prior probability distributions in Bayesian statistics for parameter estimation. This method is inspired by a statistical-mechanical approach to systems governed by dynamics with largely separated time scales and is based on three key concepts: conjugate pairs of variables, dimensionless integration measures with coarse-graining factors and partial maximization of the joint entropy. The method enables one to calculate a prior purely from a likelihood in a simple way. It is shown, in particular, how it not only yields Jeffreys's rules but also reveals new structures hidden behind them.
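
    As a reference point for the Jeffreys connection mentioned above, the standard statement of Jeffreys's rule is sketched here; the notation is mine, not quoted from the paper.

    ```latex
    % Jeffreys's rule, recovered by C-MaxEnt as a special case: the prior
    % is the square root of the determinant of the Fisher information.
    \[
      \pi(\theta) \propto \sqrt{\det I(\theta)},
      \qquad
      I_{ij}(\theta) = \mathbb{E}\!\left[
        \frac{\partial \ln L(x \mid \theta)}{\partial \theta_i}\,
        \frac{\partial \ln L(x \mid \theta)}{\partial \theta_j}
      \right].
    \]
    % For a pure scale parameter sigma this gives pi(sigma) proportional to 1/sigma.
    ```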

  11. A statistical method (cross-validation) for bone loss region detection after spaceflight

    Zhao, Qian; Li, Wenjun; Li, Caixia; Chu, Philip W.; Kornak, John; Lang, Thomas F.

    2010-01-01

    Astronauts experience bone loss after long spaceflight missions. Identifying specific regions that undergo the greatest losses (e.g., the proximal femur) could reveal information about the processes of bone loss in disuse and disease. Detecting such regions, however, remains an open problem. This paper focuses on statistical methods to detect such regions. We perform statistical parametric mapping to obtain t-maps of changes in images, and propose a new cross-validation method to select an optimum suprathreshold for forming clusters of pixels. Once these candidate clusters are formed, we use permutation testing of longitudinal labels to derive significant changes. PMID:20632144
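
    The full pipeline of parametric mapping plus cross-validated thresholding is not reproduced here, but its permutation-testing ingredient is easy to sketch. The sign-flipping scheme below for paired before/after measurements is an illustrative stand-in, not the authors' code, and the toy data are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def paired_permutation_pvalue(before, after, n_perm=10_000):
        """Sign-flipping permutation test for paired longitudinal data:
        under the null, each subject's change is equally likely to be
        positive or negative, so we flip signs at random and compare
        the permuted mean changes with the observed one."""
        diff = np.asarray(after, float) - np.asarray(before, float)
        observed = diff.mean()
        signs = rng.choice([-1.0, 1.0], size=(n_perm, diff.size))
        null = (signs * diff).mean(axis=1)
        return float((np.abs(null) >= abs(observed)).mean())

    # Toy example: bone density scores pre/post flight for 12 subjects
    pre = rng.normal(1.00, 0.05, 12)
    post = pre - rng.normal(0.03, 0.02, 12)  # simulated loss
    print(paired_permutation_pvalue(pre, post))
    ```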

  12. Classical Methods of Statistics With Applications in Fusion-Oriented Plasma Physics

    Kardaun, Otto J W F

    2005-01-01

    Classical Methods of Statistics is a blend of theory and practical statistical methods written for graduate students and researchers interested in applications to plasma physics and its experimental aspects. It can also fruitfully be used by students majoring in probability theory and statistics. In the first part, the mathematical framework and some of the history of the subject are described. Many exercises help readers to understand the underlying concepts. In the second part, two case studies are presented exemplifying discriminant analysis and multivariate profile analysis. The introductions of these case studies outline contextual magnetic plasma fusion research. In the third part, an overview of statistical software is given and, in particular, SAS and S-PLUS are discussed. In the last chapter, several datasets with guided exercises, predominantly from the ASDEX Upgrade tokamak, are included and their physical background is concisely described. The book concludes with a list of essential keyword transl...

  13. A statistical method for evaluation of the experimental phase equilibrium data of simple clathrate hydrates

    Eslamimanesh, Ali; Gharagheizi, Farhad; Mohammadi, Amir H.

    2012-01-01

    We, herein, present a statistical method for diagnostics of the outliers in phase equilibrium data (dissociation data) of simple clathrate hydrates. The applied algorithm is performed on the basis of the Leverage mathematical approach, in which the statistical Hat matrix, Williams Plot, and the residuals of the applied correlation are the key elements. A correlation in exponential form is used to represent/predict the hydrate dissociation pressures for three-phase equilibrium conditions (liquid water/ice–vapor–hydrate). The investigated hydrate formers are methane, ethane, propane, carbon dioxide, nitrogen, and hydrogen sulfide. It is interpreted from the obtained results…
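
    A minimal sketch of the Leverage machinery named above (Hat matrix, leverages, standardized residuals for a Williams plot), with synthetic regression data standing in for the hydrate dissociation measurements; the 3p/n warning leverage and the |R| > 3 cutoff are the conventional choices, assumed rather than quoted from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data standing in for a fitted correlation: intercept + one regressor
    n, p = 50, 2
    X = np.column_stack([np.ones(n), rng.uniform(270, 320, n)])
    y = X @ np.array([1.0, 0.02]) + rng.normal(0, 0.05, n)

    # Hat matrix H = X (X'X)^{-1} X'; its diagonal gives the leverage of each point
    H = X @ np.linalg.solve(X.T @ X, X.T)
    leverage = np.diag(H)

    # Standardized residuals, the second axis of the Williams plot
    resid = y - H @ y
    s2 = resid @ resid / (n - p)
    std_resid = resid / np.sqrt(s2 * (1.0 - leverage))

    # Conventional limits: leverage > 3p/n or |R| > 3 flags a suspect point
    warning_leverage = 3.0 * p / n
    suspect = (leverage > warning_leverage) | (np.abs(std_resid) > 3.0)
    print(np.flatnonzero(suspect))
    ```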

  14. Cutting-edge statistical methods for a life-course approach.

    Bub, Kristen L; Ferretti, Larissa K

    2014-01-01

    Advances in research methods, data collection and record keeping, and statistical software have substantially increased our ability to conduct rigorous research across the lifespan. In this article, we review a set of cutting-edge statistical methods that life-course researchers can use to rigorously address their research questions. For each technique, we describe the method, highlight the benefits and unique attributes of the strategy, offer a step-by-step guide on how to conduct the analysis, and illustrate the technique using data from the National Institute of Child Health and Human Development Study of Early Child Care and Youth Development. In addition, we recommend a set of technical and empirical readings for each technique. Our goal was not to address a substantive question of interest but instead to provide life-course researchers with a useful reference guide to cutting-edge statistical methods.

  15. A pseudo-statistical approach to treat choice uncertainty: the example of partitioning allocation methods

    Mendoza Beltran, A.; Heijungs, R.; Guinée, J.; Tukker, A.

    2016-01-01

    Purpose: Despite efforts to treat uncertainty due to methodological choices in life cycle assessment (LCA), such as standardization, one-at-a-time (OAT) sensitivity analysis, and analytical and statistical methods, no method exists that propagates this source of uncertainty for all relevant processes.

  16. Statistical Analysis of a Method to Predict Drug-Polymer Miscibility

    Knopp, Matthias Manne; Olesen, Niels Erik; Huang, Yanbin

    2016-01-01

    In this study, a method proposed to predict drug-polymer miscibility from differential scanning calorimetry measurements was subjected to statistical analysis. The method is relatively fast and inexpensive and has gained popularity as a result of the increasing interest in the formulation of drug...... as provided in this study. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association J Pharm Sci....

  17. animation: An R Package for Creating Animations and Demonstrating Statistical Methods

    Yihui Xie

    2013-04-01

    Animated graphs that demonstrate statistical ideas and methods can both attract interest and assist understanding. In this paper we first discuss how animations can be related to some statistical topics such as iterative algorithms, random simulations, (re)sampling methods and dynamic trends, then we describe the approaches that may be used to create animations, and give an overview of the R package animation, including its design, usage and the statistical topics in the package. With the animation package, we can export the animations produced by R into a variety of formats, such as a web page, a GIF animation, a Flash movie, a PDF document, or an MP4/AVI video, so that users can publish the animations fairly easily. The design of this package is flexible enough to be readily incorporated into web applications, e.g., we can generate animations online with Rweb, which means we do not even need R to be installed locally to create animations. We will show examples of the use of animations in teaching statistics and in the presentation of statistical reports using Sweave or knitr. In fact, this paper itself was written with the knitr and animation packages, and the animations are embedded in the PDF document, so that readers can watch them in real time as they read the paper (Adobe Reader is required). Animations can add insight and interest to traditional static approaches to teaching statistics and reporting, making statistics a more interesting and appealing subject.

  18. Mathematical statistics

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  19. On nonequilibrium many-body systems. 1: The nonequilibrium statistical operator method

    Algarte, A.C.S.; Vasconcellos, A.R.; Luzzi, R.; Sampaio, A.J.C.

    1985-01-01

    The theoretical aspects involved in the treatment of many-body systems strongly departing from equilibrium are discussed. The nonequilibrium statistical operator (NSO) method is considered in detail. Using Jaynes' maximum entropy formalism complemented with an ad hoc hypothesis, a nonequilibrium statistical operator is obtained. This approach introduces irreversibility from the outset, and we recover statistical operators like those of Green-Mori and Zubarev as particular cases. The connection with Generalized Thermodynamics and the construction of nonlinear transport equations are briefly described. (Author)
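
    For reference, the maximum-entropy form on which the construction builds; the notation is assumed, not quoted from the paper.

    ```latex
    % Jaynes's maximum-entropy statistical operator: maximize
    % S = -Tr(varrho ln varrho) subject to the constraints Tr(varrho F_k) = <F_k>.
    \[
      \varrho = \frac{1}{Z}\exp\!\Big(-\sum_k \lambda_k F_k\Big),
      \qquad
      Z = \operatorname{Tr}\exp\!\Big(-\sum_k \lambda_k F_k\Big),
    \]
    % with the Lagrange multipliers lambda_k fixed by the constraints; the NSO
    % construction promotes the F_k and lambda_k to time-dependent quantities.
    ```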

  20. The choice of statistical methods for comparisons of dosimetric data in radiotherapy.

    Chaikh, Abdulhamid; Giraud, Jean-Yves; Perrin, Emmanuel; Bresciani, Jean-Pierre; Balosso, Jacques

    2014-09-18

    Novel irradiation techniques are continuously introduced in radiotherapy to optimize the accuracy, the safety and the clinical outcome of treatments. These changes could raise the question of discontinuity in dosimetric presentation and the subsequent need for practice adjustments in case of significant modifications. This study proposes a comprehensive approach to compare different techniques and tests whether their respective dose calculation algorithms give rise to statistically significant differences in the treatment doses for the patient. Statistical investigation principles are presented in the framework of a clinical example based on 62 fields of radiotherapy for lung cancer. The delivered doses in monitor units were calculated using three different dose calculation methods: the reference method computes the dose without tissue density corrections using the Pencil Beam Convolution (PBC) algorithm, whereas the new methods calculate the dose with 1D and 3D tissue density corrections using the Modified Batho (MB) method and the Equivalent Tissue Air Ratio (ETAR) method, respectively. The normality of the data and the homogeneity of variance between groups were tested using the Shapiro-Wilk and Levene tests, respectively; then non-parametric statistical tests were performed. Specifically, the dose means estimated by the different calculation methods were compared using Friedman's test and the Wilcoxon signed-rank test. In addition, the correlation between the doses calculated by the three methods was assessed using Spearman's rank and Kendall's rank tests. Friedman's test showed a significant effect of the calculation method on the delivered dose for lung cancer patients; the Wilcoxon signed-rank test of paired comparisons indicated that the delivered dose was significantly reduced using density-corrected methods as compared to the reference method. Spearman's and Kendall's rank tests indicated a positive correlation between the doses calculated with the different methods.
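
    The test battery described above maps directly onto standard SciPy calls; the sketch below uses simulated monitor-unit doses in place of the study's 62 fields, so the numbers are illustrative only.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    # Hypothetical monitor-unit doses for the same 62 fields under three algorithms
    pbc = rng.normal(200, 20, 62)            # reference, no density correction
    mb = pbc * rng.normal(0.97, 0.01, 62)    # 1D density correction
    etar = pbc * rng.normal(0.96, 0.01, 62)  # 3D density correction

    # Checks that motivate the non-parametric route
    print(stats.shapiro(pbc))                # normality within a group
    print(stats.levene(pbc, mb, etar))       # homogeneity of variance

    # Omnibus comparison of the three paired methods, then a paired follow-up
    print(stats.friedmanchisquare(pbc, mb, etar))
    print(stats.wilcoxon(pbc, mb))

    # Rank correlation between methods
    print(stats.spearmanr(pbc, mb))
    print(stats.kendalltau(pbc, mb))
    ```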