WorldWideScience

Sample records for statistical concepts illustrated

  1. Using Visual Analogies To Teach Introductory Statistical Concepts

    Directory of Open Access Journals (Sweden)

    Jessica S. Ancker

    2017-07-01

    Full Text Available Introductory statistical concepts are some of the most challenging to convey in quantitative literacy courses. Analogies supplemented by visual illustrations can be highly effective teaching tools. This literature review shows that to exploit the power of analogies, teachers must select analogies familiar to the audience, explicitly link the analog with the target concept, and avert misconceptions by explaining where the analogy fails. We provide guidance for instructors and a series of visual analogies for use in teaching medical and health statistics.

  2. Illustrating Sampling Distribution of a Statistic: Minitab Revisited

    Science.gov (United States)

    Johnson, H. Dean; Evans, Marc A.

    2008-01-01

    Understanding the concept of the sampling distribution of a statistic is essential for the understanding of inferential procedures. Unfortunately, this topic proves to be a stumbling block for students in introductory statistics classes. In efforts to aid students in their understanding of this concept, alternatives to a lecture-based mode of…
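    The computer-based alternatives the abstract alludes to can be sketched in a few lines of Python. This is a generic illustration of the sampling distribution of the sample mean, not the authors' Minitab material:

```python
import math
import random
import statistics

random.seed(42)

# Population: an exponential distribution with mean 10 (deliberately skewed).
POP_MEAN, N, DRAWS = 10.0, 30, 5000

# Empirical sampling distribution: draw many samples of size N and
# record each sample's mean.
sample_means = [
    statistics.fmean(random.expovariate(1 / POP_MEAN) for _ in range(N))
    for _ in range(DRAWS)
]

# The means cluster around the population mean, and their spread
# approaches the standard error sigma / sqrt(N) = 10 / sqrt(30) ~ 1.83.
print(round(statistics.fmean(sample_means), 2))
print(round(statistics.stdev(sample_means), 2))
```

    Plotting `sample_means` as a histogram also shows the central limit theorem at work: the distribution of sample means looks far more symmetric than the skewed population it came from.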

  3. Statistical concepts a second course

    CERN Document Server

    Lomax, Richard G

    2012-01-01

    Statistical Concepts consists of the last 9 chapters of An Introduction to Statistical Concepts, 3rd ed. Designed for the second course in statistics, it is one of the few texts that focuses just on intermediate statistics. The book highlights how statistics work and what they mean to better prepare students to analyze their own data and interpret SPSS and research results. As such it offers more coverage of non-parametric procedures used when standard assumptions are violated since these methods are more frequently encountered when working with real data. Determining appropriate sample sizes

  4. Between Certainty and Uncertainty Statistics and Probability in Five Units with Notes on Historical Origins and Illustrative Numerical Examples

    CERN Document Server

    Laudański, Ludomir M

    2013-01-01

    "Between Certainty and Uncertainty" is a one-of-a-kind short course on statistics for students, engineers and researchers. It is a fascinating introduction to statistics and probability with notes on historical origins and 80 illustrative numerical examples organized into five units:
    · Chapter 1, Descriptive Statistics: compressing small samples; basic averages (mean and variance) and their main properties, including God's proof; linear transformations and z-scored statistics.
    · Chapter 2, Grouped Data: Udny Yule's concept of qualitative and quantitative variables; grouping these two kinds of data; graphical tools; combinatorial rules and qualitative variables; designing frequency histograms; direct and coded evaluation of quantitative data; significance of percentiles.
    · Chapter 3, Regression and Correlation: geometrical distance and equivalent distances in two orthogonal directions as a prerequisite to the concept of two regressi...
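    The Chapter 1 topics (mean, variance, linear transformations, z-scored statistics) reduce to a few lines of Python; this sketch with hypothetical data is ours, not the book's:

```python
import statistics

scores = [52, 67, 71, 71, 80, 92]  # hypothetical small sample

mu = statistics.fmean(scores)      # arithmetic mean
sigma = statistics.pstdev(scores)  # population standard deviation

# z-scoring is the linear transformation x -> (x - mu) / sigma.
z = [(x - mu) / sigma for x in scores]

# After the transformation, the mean is 0 and the standard deviation is 1.
print(abs(statistics.fmean(z)) < 1e-9, abs(statistics.pstdev(z) - 1) < 1e-9)
```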

  5. The Bobath concept - a model to illustrate clinical practice.

    Science.gov (United States)

    Michielsen, Marc; Vaughan-Graham, Julie; Holland, Ann; Magri, Alba; Suzuki, Mitsuo

    2017-12-17

    The model of Bobath clinical practice provides a framework identifying the unique aspects of the Bobath concept in terms of contemporary neurological rehabilitation. The utilisation of a framework to illustrate the clinical application of the Bobath concept provides the basis for a common understanding with respect to Bobath clinical practice, education, and research. The development process culminating in the model of Bobath clinical practice is described. The use of the model in clinical practice is illustrated using two cases: a client with a chronic incomplete spinal cord injury and a client with a stroke. This article describes the clinical application of the Bobath concept in terms of the integration of posture and movement with respect to the quality of task performance, applying the Model of Bobath Clinical Practice. Facilitation, a key aspect of Bobath clinical practice, was utilised to positively affect motor control and perception in two clients with impairment-related movement problems due to neurological pathology and associated activity limitations and participation restrictions - the outcome measures used to reflect the individual clinical presentation. Implications for Rehabilitation The model of Bobath clinical practice provides a framework identifying the unique aspects of the Bobath-concept. The model of Bobath clinical practice provides the basis for a common understanding with respect to Bobath clinical practice, education, and research. The clinical application of the Bobath-concept highlights the integration of posture and movement with respect to the quality of task performance. Facilitation, a key aspect of Bobath clinical practice, positively affects motor control, and perception.

  6. New concept of statistical ensembles

    International Nuclear Information System (INIS)

    Gorenstein, M.I.

    2009-01-01

    An extension of the standard concept of statistical ensembles is suggested. Namely, statistical ensembles with extensive quantities that fluctuate according to an externally given distribution are introduced. Applications in the statistical models of multiple hadron production in high-energy physics are discussed.

  7. Concept Maps in Introductory Statistics

    Science.gov (United States)

    Witmer, Jeffrey A.

    2016-01-01

    Concept maps are tools for organizing thoughts on the main ideas in a course. I present an example of a concept map that was created through the work of students in an introductory class and discuss major topics in statistics and relationships among them.

  8. An Introduction to Statistical Concepts

    CERN Document Server

    Lomax, Richard G

    2012-01-01

    This comprehensive, flexible text is used in both one- and two-semester courses to review introductory through intermediate statistics. Instructors select the topics that are most appropriate for their course. Its conceptual approach helps students more easily understand the concepts and interpret SPSS and research results. Key concepts are simply stated and occasionally reintroduced and related to one another for reinforcement. Numerous examples demonstrate their relevance. This edition features more explanation to increase understanding of the concepts. Only crucial equations are included. I

  9. Exploring the practicing-connections hypothesis: using gesture to support coordination of ideas in understanding a complex statistical concept.

    Science.gov (United States)

    Son, Ji Y; Ramos, Priscilla; DeWolf, Melissa; Loftus, William; Stigler, James W

    2018-01-01

    In this article, we begin to lay out a framework and approach for studying how students come to understand complex concepts in rich domains. Grounded in theories of embodied cognition, we advance the view that understanding of complex concepts requires students to practice, over time, the coordination of multiple concepts, and the connection of this system of concepts to situations in the world. Specifically, we explore the role that a teacher's gesture might play in supporting students' coordination of two concepts central to understanding in the domain of statistics: mean and standard deviation. In Study 1 we show that university students who have just taken a statistics course nevertheless have difficulty taking both mean and standard deviation into account when thinking about a statistical scenario. In Study 2 we show that presenting the same scenario with an accompanying gesture to represent variation significantly impacts students' interpretation of the scenario. Finally, in Study 3 we present evidence that instructional videos on the internet fail to leverage gesture as a means of facilitating understanding of complex concepts. Taken together, these studies illustrate an approach to translating current theories of cognition into principles that can guide instructional design.

  10. Introductory statistical inference

    CERN Document Server

    Mukhopadhyay, Nitis

    2014-01-01

    This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist

  11. Student-generated illustrations and written narratives of biological science concepts: The effect on community college life science students' achievement in and attitudes toward science

    Science.gov (United States)

    Harvey, Robert Christopher

    The purpose of this study was to determine the effects of two conceptually based instructional strategies on science achievement and attitudes of community college biological science students. The sample consisted of 277 students enrolled in General Biology 1, Microbiology, and Human Anatomy and Physiology 1. The control group consisted of intact classes from the 2005 Spring semester; treatment students from the 2005 Fall semester were randomly assigned to one of two groups within each course: written narrative (WN) and illustration (IL). WN students prepared in-class written narratives related to cell theory and metabolism, which were taught in all three courses. IL students prepared in-class illustrations of the same concepts. Control students received traditional lecture/lab during the entire class period and neither wrote in-class descriptions nor prepared in-class illustrations of the targeted concepts. All groups were equivalent on age, gender, ethnicity, GPA, and number of college credits earned and were blinded to the study. All interventions occurred in class and no group received more attention or time to complete assignments. A multivariate analysis of covariance (MANCOVA) via multiple regression was the primary statistical strategy used to test the study's hypotheses. The model was valid and statistically significant. Independent follow-up univariate analyses relative to each dependent measure found that no research factor had a significant effect on attitude, but that course-teacher, group membership, and student academic characteristics had a significant effect (p < .05) on achievement: (1) Biology students scored significantly lower in achievement than A&P students; (2) Microbiology students scored significantly higher in achievement than Biology students; (3) Written Narrative students scored significantly higher in achievement than Control students; and (4) GPA had a significant effect on achievement. In addition, given p < .08: (1

  12. Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI)

    Science.gov (United States)

    Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur

    2016-01-01

    We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being…

  13. Pseudo-populations a basic concept in statistical surveys

    CERN Document Server

    Quatember, Andreas

    2015-01-01

    This book emphasizes that artificial or pseudo-populations play an important role in statistical surveys from finite universes in two ways: firstly, the concept of pseudo-populations may substantially improve users' understanding of various aspects of sampling theory and survey methodology; an example of this scenario is the Horvitz-Thompson estimator. Secondly, statistical procedures exist in which pseudo-populations actually have to be generated. An example of such a scenario can be found in simulation studies in the field of survey sampling, where close-to-reality pseudo-populations are generated from known sample and population data to form the basis for the simulation process. The chapters focus on estimation methods, sampling techniques, nonresponse, questioning designs and statistical disclosure control. This book is a valuable reference for understanding the importance of the pseudo-population concept and applying it in teaching and research.
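    The Horvitz-Thompson estimator mentioned above weights each sampled value by the inverse of its inclusion probability. A minimal sketch in Python with hypothetical data (the book itself is not code-based):

```python
# Horvitz-Thompson estimate of a population total: each sampled value y_i
# is weighted by 1 / pi_i, the unit's inclusion probability under the
# sampling design. The (y_i, pi_i) pairs below are hypothetical.
sample = [(12.0, 0.10), (30.0, 0.25), (7.0, 0.05)]

ht_total = sum(y / pi for y, pi in sample)
print(ht_total)  # 120 + 120 + 140 = 380.0
```

    The estimator is design-unbiased for the population total provided every unit in the universe has a nonzero inclusion probability.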

  14. Learning Illustrated: An Exploratory Cross-Sectional Drawing Analysis of Students' Conceptions of Learning

    Science.gov (United States)

    Hsieh, Wen-Min; Tsai, Chin-Chung

    2018-01-01

    Using the draw-a-picture technique, the authors explored the learning conceptions held by students across grade levels. A total of 1,067 Taiwanese students in Grades 2, 4, 6, 8, 10, and 12 participated in this study. Participants were asked to use drawing to illustrate how they conceptualize learning. A coding checklist was developed to analyze…

  15. Fostering Self-Concept and Interest for Statistics through Specific Learning Environments

    Science.gov (United States)

    Sproesser, Ute; Engel, Joachim; Kuntze, Sebastian

    2016-01-01

    Supporting motivational variables such as self-concept or interest is an important goal of schooling as they relate to learning and achievement. In this study, we investigated whether specific interest and self-concept related to the domains of statistics and mathematics can be fostered through a four-lesson intervention focusing on statistics.…

  16. Artist concept illustrating key events on day by day basis during Apollo 9

    Science.gov (United States)

    1969-01-01

    Artist concept illustrating key events on day by day basis during Apollo 9 mission. First photograph illustrates activities on the first day of the mission, including flight crew preparation, orbital insertion, 103-nautical-mile orbit, separations, docking and docked Service Propulsion System Burn (19792); Second day events include landmark tracking, pitch maneuver, yaw-roll maneuver, and high apogee orbits (19793); Third day events include crew transfer and Lunar Module system evaluation (19794); Fourth day events include use of camera, day-night extravehicular activity, use of golden slippers, and television over Texas and Louisiana (19795); Fifth day events include vehicles undocked, Lunar Module burns for rendezvous, maximum separation, ascent propulsion system burn, formation flying and docking, and Lunar Module jettison ascent burn (19796); Sixth through ninth day events include service propulsion system burns and landmark sightings, photograph special tests (19797); Tenth day events i

  17. Statistical physics and thermodynamics an introduction to key concepts

    CERN Document Server

    Rau, Jochen

    2017-01-01

    Statistical physics and thermodynamics describe the behaviour of systems on the macroscopic scale. Their methods are applicable to a wide range of phenomena: from refrigerators to the interior of stars, from chemical reactions to magnetism. Indeed, of all physical laws, the laws of thermodynamics are perhaps the most universal. This text provides a concise yet thorough introduction to the key concepts which underlie statistical physics and thermodynamics. It begins with a review of classical probability theory and quantum theory, as well as a careful discussion of the notions of information and entropy, prior to embarking on the development of statistical physics proper. The crucial steps leading from the microscopic to the macroscopic domain are rendered transparent. In particular, the laws of thermodynamics are shown to emerge as natural consequences of the statistical framework. While the emphasis is on clarifying the basic concepts, the text also contains many applications and classroom-tested exercises,...

  18. A Statistical Approach to Illustrate the Challenge of Astrobiology for Public Outreach

    Directory of Open Access Journals (Sweden)

    Frédéric Foucher

    2017-10-01

    Full Text Available In this study, we attempt to illustrate the competition that constitutes the main challenge of astrobiology: that between the probability of extraterrestrial life and its detectability. To illustrate this, we propose a simple statistical approach based on our knowledge of the Universe, the Milky Way, the Solar System, and the evolution of life on Earth, which permits us to estimate the order of magnitude of the distance between Earth and bodies inhabited by more or less evolved past or present life forms, and the consequences for the detection of the associated biosignatures. We thus show that the probability of the existence of evolved extraterrestrial forms of life increases with distance from the Earth while, at the same time, the number of detectable biosignatures decreases due to technical and physical limitations. This approach allows us to explain simply to the general public why it is very improbable to detect a signal of extraterrestrial intelligence, while it is nevertheless justified to launch space probes dedicated to the search for microbial life in the Solar System.

  19. An introduction to medical statistics

    International Nuclear Information System (INIS)

    Hilgers, R.D.; Bauer, P.; Scheiber, V.; Heitmann, K.U.

    2002-01-01

    This textbook teaches all aspects and methods of biometrics as a field of concentration in medical education. Instrumental interpretations of the theory, concepts and terminology of medical statistics are enhanced by numerous illustrations and examples. With problems, questions and answers. (orig./CB) [de

  20. Implementing the “Big Data” Concept in Official Statistics

    Directory of Open Access Journals (Sweden)

    О. V.

    2017-02-01

    Full Text Available Big data is a huge resource that needs to be used at all levels of economic planning. The article studies the development of the “Big Data” concept worldwide and its impact on the transformation of statistical modelling of economic processes. Statistics at the current stage should take into account the complex system of international economic relations, which functions under conditions of globalization and brings new forms of economic development to small open economies. Statistical science should take into account phenomena such as the gig economy, the sharing economy, institutional factors, etc. The concepts of “Big Data” and open data are analyzed, and problems of implementing “Big Data” in official statistics are shown. Ways of implementing “Big Data” in the official statistics of Ukraine, through active use of the technological capabilities of mobile operators, navigation systems, surveillance cameras, social networks, etc., are presented. The possibilities of using “Big Data” in different sectors of the economy, including at the level of companies, are shown. The problems of storing large volumes of data are highlighted. The study shows that “Big Data” is a huge resource that should be used across the Ukrainian economy.

  1. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...

  2. Is There a Common Summary Statistical Process for Representing the Mean and Variance? A Study Using Illustrations of Familiar Items.

    Science.gov (United States)

    Yang, Yi; Tokita, Midori; Ishiguchi, Akira

    2018-01-01

    A number of studies have revealed that our visual system can extract different types of summary statistics, such as the mean and variance, from sets of items. Although the extraction of such summary statistics has been well studied in isolation, the relationship between these statistics remains unclear. In this study, we explored this issue using an individual-differences approach. Observers viewed illustrations of strawberries and lollypops varying in size or orientation and performed four tasks in a within-subject design, namely mean and variance discrimination tasks in the size and orientation domains. We found that performances in the mean and variance discrimination tasks were not correlated with each other, demonstrating that extraction of the mean and of the variance is mediated by different representation mechanisms. In addition, we tested the relationship between performances in the size and orientation domains for each summary statistic (i.e. mean and variance) and examined whether each summary statistic has distinct processes across perceptual domains. The results illustrated that statistical summary representations of size and orientation may share a common mechanism for representing the mean and possibly for representing variance. Introspections of each observer performing the tasks were also examined and discussed.

  3. Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI)

    Science.gov (United States)

    Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur

    2016-01-01

    We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non–expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI’s robustness and sensitivity in capturing useful data relating to the students’ conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. PMID:26903497

  4. Six sigma for organizational excellence a statistical approach

    CERN Document Server

    Muralidharan, K

    2015-01-01

    This book discusses the integrated concepts of statistical quality engineering and management tools. It will help readers understand and apply the concepts of quality through project management and technical analysis, using statistical methods. Prepared in a ready-to-use form, the text will equip practitioners to implement Six Sigma principles in projects. The concepts discussed are all critically assessed and explained, allowing them to be practically applied in managerial decision-making, and in each chapter the objectives and connections to the rest of the work are clearly illustrated. To aid understanding, the book includes a wealth of tables, graphs, descriptions and checklists, as well as charts and plots, worked-out examples and exercises. Perhaps the most unique feature of the book is its approach, using statistical tools, to explain the science behind Six Sigma project management as integrated with engineering concepts. The material on quality engineering and statistical management tools of...

  5. Concepts in Thermal Physics

    CERN Document Server

    Blundell, Stephen J

    2006-01-01

    This modern introduction to thermal physics contains a step-by-step presentation of the key concepts. The text is copiously illustrated and each chapter contains several worked examples. An understanding of thermal physics is crucial to much of modern physics, chemistry and engineering. This book provides a modern introduction to the main principles that are foundational to thermal physics, thermodynamics, and statistical mechanics. The key concepts are carefully presented in a clear way, and new ideas are illustrated with copious worked examples as well as a description of the historical background to their discovery. Applications are presented to subjects as diverse as stellar astrophysics, information and communication theory, condensed matter physics, and climate change. Each chapter concludes with detailed exercises.

  6. Implementing the “Big Data” Concept in Official Statistics

    OpenAIRE

    О. V.

    2017-01-01

    Big data is a huge resource that needs to be used at all levels of economic planning. The article is devoted to the study of the development of the concept of “Big Data” in the world and its impact on the transformation of statistical simulation of economic processes. Statistics at the current stage should take into account the complex system of international economic relations, which functions in the conditions of globalization and brings new forms of economic development in small open ec...

  7. Learning Essential Terms and Concepts in Statistics and Accounting

    Science.gov (United States)

    Peters, Pam; Smith, Adam; Middledorp, Jenny; Karpin, Anne; Sin, Samantha; Kilgore, Alan

    2014-01-01

    This paper describes a terminological approach to the teaching and learning of fundamental concepts in foundation tertiary units in Statistics and Accounting, using an online dictionary-style resource (TermFinder) with customised "termbanks" for each discipline. Designed for independent learning, the termbanks support inquiring students…

  8. Quantum mechanics from classical statistics

    International Nuclear Information System (INIS)

    Wetterich, C.

    2010-01-01

    Quantum mechanics can emerge from classical statistics. A typical quantum system describes an isolated subsystem of a classical statistical ensemble with infinitely many classical states. The state of this subsystem can be characterized by only a few probabilistic observables. Their expectation values define a density matrix if they obey a 'purity constraint'. Then all the usual laws of quantum mechanics follow, including Heisenberg's uncertainty relation, entanglement and a violation of Bell's inequalities. No concepts beyond classical statistics are needed for quantum physics - the differences are only apparent and result from the particularities of those classical statistical systems which admit a quantum mechanical description. Born's rule for quantum mechanical probabilities follows from the probability concept for a classical statistical ensemble. In particular, we show how the non-commuting properties of quantum operators are associated to the use of conditional probabilities within the classical system, and how a unitary time evolution reflects the isolation of the subsystem. As an illustration, we discuss a classical statistical implementation of a quantum computer.

  9. Applied Statistics Using SPSS, STATISTICA, MATLAB and R

    CERN Document Server

    De Sá, Joaquim P Marques

    2007-01-01

    This practical reference provides a comprehensive introduction and tutorial on the main statistical analysis topics, demonstrating their solution with the most common software packages. Intended for anyone needing to apply statistical analysis to a large variety of science and engineering problems, the book explains and shows how to use SPSS, MATLAB, STATISTICA and R for analyses such as data description, statistical inference, classification and regression, factor analysis, survival data and directional statistics. It concisely explains key concepts and methods, illustrated by practical examp

  10. Statistics for Petroleum Engineers and Geoscientists

    International Nuclear Information System (INIS)

    Jensen, J.L.; Lake, L.W.; Corbett, P.W.M.; Goggin, D.J.

    2000-01-01

    Geostatistics is a common tool in reservoir characterisation. Several texts discuss the subject, however this book differs in its approach and audience from currently available material. Written from the basics of statistics it covers only those topics that are needed for the two goals of the text: to exhibit the diagnostic potential of statistics and to introduce the important features of statistical modeling. This revised edition contains expanded discussions of some materials, in particular conditional probabilities, Bayes Theorem, correlation, and Kriging. The coverage of estimation, variability, and modeling applications have been updated. Seventy examples illustrate concepts and show the role of geology for providing important information for data analysis and model building. Four reservoir case studies conclude the presentation, illustrating the application and importance of the earlier material. This book can help petroleum professionals develop more accurate models, leading to lower sampling costs
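    Bayes' Theorem, one of the expanded topics, is a one-line computation once the conditional probabilities are specified. The numbers below are hypothetical, not from the book's case studies:

```python
# Bayes' Theorem: P(A|B) = P(B|A) * P(A) / P(B), with P(B) obtained by
# the law of total probability. Hypothetical reservoir-style example:
# probability a well is productive (A) given a positive seismic
# indicator (B).
p_a = 0.3              # prior P(productive)
p_b_given_a = 0.8      # P(positive | productive)
p_b_given_not_a = 0.2  # P(positive | not productive)

p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)  # = 0.38
p_a_given_b = p_b_given_a * p_a / p_b                  # = 0.24 / 0.38
print(round(p_a_given_b, 3))  # 0.632
```

    Even a strong indicator (80% hit rate) yields only a moderate posterior when the prior is low, which is exactly the kind of diagnostic reasoning the text motivates.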

  11. Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI).

    Science.gov (United States)

    Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur

    2016-01-01

    We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non-expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI's robustness and sensitivity in capturing useful data relating to the students' conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. © 2016 T. Deane et al. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  12. Effects of Concept Mapping Strategy on Learning Performance in Business and Economics Statistics

    Science.gov (United States)

    Chiou, Chei-Chang

    2009-01-01

    A concept map (CM) is a hierarchically arranged, graphic representation of the relationships among concepts. Concept mapping (CMING) is the process of constructing a CM. This paper examines whether a CMING strategy can be useful in helping students to improve their learning performance in a business and economics statistics course. A single…

  13. Using Author Bylines and Concept Maps to Illustrate the Connectedness of Scientists

    Directory of Open Access Journals (Sweden)

    Min-Ken Liao

    2013-02-01

    Incorporating reading and discussion of primary articles in undergraduate courses has been shown to enhance students' learning, stimulate their interest in science, and retain them as science majors. While instructors diligently coach students to scrutinize every section of an article thoroughly and critically, the author byline is often overlooked. In this study, the author bylines of primary articles were used to illustrate the connectedness of scientists and the collaborative nature of science. First-year college students first learned how to construct a concept map and then used concept maps to connect 14 scientists with 14 primary articles that these scientists authored. In doing so, students visualized and understood science as a human activity and scientific progress as a community effort. Pre- and post-activity questionnaires were used to evaluate whether the activity objectives were met. Students further examined the structure and organization of a primary article after the activity, and they engaged in discussions of topics such as how research ideas develop and evolve, the advantages and disadvantages of collaborative research, the ethics of authorship, graduate school, and careers in science. Perceiving the authors of primary articles as real people in a social network, and science as a collaborative effort, may help students see themselves as part of scientific advancement and inspire them to pursue careers in science.

  14. Concepts and recent advances in generalized information measures and statistics

    CERN Document Server

    Kowalski, Andres M

    2013-01-01

    Since the introduction of the information measure widely known as Shannon entropy, quantifiers based on information theory and concepts such as entropic forms and statistical complexities have proven to be useful in diverse scientific research fields. This book contains introductory tutorials suitable for the general reader, together with chapters dedicated to the basic concepts of the most frequently employed information measures or quantifiers and their recent applications to different areas, including physics, biology, medicine, economics, communication and social sciences. As these quantif…

  15. Calculation Software versus Illustration Software for Teaching Statistics

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl; Boyle, Robin G.

    1999-01-01

    As personal computers have become more and more powerful, so have the software packages available to us for teaching statistics. This paper investigates what software packages are currently being used by progressive statistics instructors at university level, examines some of the deficiencies … of such software, and indicates features that statistics instructors wish to have incorporated in software in the future. The basis of the paper is a survey of participants at ICOTS-5 (the Fifth International Conference on Teaching Statistics). These survey results, combined with the software based papers…

  16. Probability and Statistics The Science of Uncertainty (Revised Edition)

    CERN Document Server

    Tabak, John

    2011-01-01

    Probability and Statistics, Revised Edition deals with the history of probability, describing the modern concept of randomness and examining "pre-probabilistic" ideas of what most people today would characterize as randomness. This revised book documents some historically important early uses of probability to illustrate some very important probabilistic questions. It goes on to explore statistics and the generations of mathematicians and non-mathematicians who began to address problems in statistical analysis, including the statistical structure of data sets as well as the theory of…

  17. An Exploratory Study of Taiwanese Mathematics Teachers' Conceptions of School Mathematics, School Statistics, and Their Differences

    Science.gov (United States)

    Yang, Kai-Lin

    2014-01-01

    This study used phenomenography, a qualitative method, to investigate Taiwanese mathematics teachers' conceptions of school mathematics, school statistics, and their differences. To collect data, we interviewed five mathematics teachers using open-ended questions. They also responded to statements drawn on mathematical/statistical conceptions and…

  18. Probability theory and statistical applications a profound treatise for self-study

    CERN Document Server

    Zörnig, Peter

    2016-01-01

    This accessible and easy-to-read book provides many examples to illustrate diverse topics in probability and statistics, from initial concepts up to advanced calculations. Special attention is devoted, e.g., to independence of events, inequalities in probability and functions of random variables. The book is directed to students of mathematics, statistics, engineering, and other quantitative sciences.

  19. Examples and problems in mathematical statistics

    CERN Document Server

    Zacks, Shelemyahu

    2013-01-01

    This book presents examples that illustrate the theory of mathematical statistics and details how to apply the methods for solving problems.  While other books on the topic contain problems and exercises, they do not focus on problem solving. This book fills an important niche in the statistical theory literature by providing a theory/example/problem approach.  Each chapter is divided into four parts: Part I provides the needed theory so readers can become familiar with the concepts, notations, and proven results; Part II presents examples from a variety of fields including engineering, mathem…

  20. The holistic concepts of disaster management and social cohesion - statistics and method

    Directory of Open Access Journals (Sweden)

    Gheorghe SĂVOIU

    2011-09-01

    The paper uses a multidisciplinary approach to underline the importance of some holistic concepts like social cohesion and human ecology, and also to assess the environmental and economic specificity of these new ecological and social terms. The structure of the paper consists of an introduction, describing the transition from mythological existence to the contemporary holistic view, and four sections. The first section details the vital elements of the ecosphere in the new holistic sense, the second describes the holistic concept of human ecology, and the third details the significance, importance and impact of contemporary disaster management, together with some global statistics. The last section summarizes a statistical method known as social cohesion evaluation, applied by the author in Romania during the country's EU accession period, which in conjunction with the holistic concept of human ecology represents a necessary new analysis for this decade. Some final remarks underline the importance of a new approach in economics based on the holistic principle and reciprocity.

  1. Introductory Level Problems Illustrating Concepts in Pharmaceutical Engineering

    Science.gov (United States)

    McIver, Keith; Whitaker, Kathryn; De Delva, Vladimir; Farrell, Stephanie; Savelski, Mariano J.; Slater, C. Stewart

    2012-01-01

    Textbook style problems including detailed solutions introducing pharmaceutical topics at the level of an introductory chemical engineering course have been created. The problems illustrate and teach subjects which students would learn if they were to pursue a career in pharmaceutical engineering, including the unique terminology of the field,…

  2. Statistical methods for quantitative mass spectrometry proteomic experiments with labeling

    Directory of Open Access Journals (Sweden)

    Oberg Ann L

    2012-11-01

    Mass spectrometry utilizing labeling allows multiple specimens to be subjected to mass spectrometry simultaneously. As a result, between-experiment variability is reduced. Here we describe the use of fundamental concepts of statistical experimental design in the labeling framework in order to minimize variability and avoid biases. We demonstrate how to export data in the format that is most efficient for statistical analysis. We demonstrate how to assess the need for normalization, perform normalization, and check whether it worked. We describe how to build a model explaining the observed values and test for differential protein abundance, along with descriptive statistics and measures of reliability of the findings. Concepts are illustrated through three case studies utilizing the iTRAQ 4-plex labeling protocol.

  3. Statistical methods for quantitative mass spectrometry proteomic experiments with labeling.

    Science.gov (United States)

    Oberg, Ann L; Mahoney, Douglas W

    2012-01-01

    Mass spectrometry utilizing labeling allows multiple specimens to be subjected to mass spectrometry simultaneously. As a result, between-experiment variability is reduced. Here we describe the use of fundamental concepts of statistical experimental design in the labeling framework in order to minimize variability and avoid biases. We demonstrate how to export data in the format that is most efficient for statistical analysis. We demonstrate how to assess the need for normalization, perform normalization, and check whether it worked. We describe how to build a model explaining the observed values and test for differential protein abundance, along with descriptive statistics and measures of reliability of the findings. Concepts are illustrated through three case studies utilizing the iTRAQ 4-plex labeling protocol.
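
The normalization step described in these two records can be sketched as a simple median alignment of labeling channels; the 4-plex log2 intensities below are made up for illustration and do not come from the case studies.

```python
import statistics

def median_normalize(channels):
    """Shift each labeling channel so that all channels share the same
    median log-intensity -- a simple version of the normalization check
    the authors describe."""
    grand = statistics.median([v for ch in channels for v in ch])
    return [[v - statistics.median(ch) + grand for v in ch] for ch in channels]

# Hypothetical log2 reporter-ion intensities for a 4-plex experiment.
channels = [
    [10.1, 11.0, 12.3, 10.8],
    [10.9, 11.8, 13.1, 11.6],
    [9.7, 10.6, 11.9, 10.4],
    [10.4, 11.3, 12.6, 11.1],
]
normalized = median_normalize(channels)
medians = [statistics.median(ch) for ch in normalized]
```

After normalization, every channel has the same median, so a systematic channel offset no longer masquerades as differential abundance.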

  4. Evaluating Computer-Based Simulations, Multimedia and Animations that Help Integrate Blended Learning with Lectures in First Year Statistics

    Science.gov (United States)

    Neumann, David L.; Neumann, Michelle M.; Hood, Michelle

    2011-01-01

    The discipline of statistics seems well suited to the integration of technology in a lecture as a means to enhance student learning and engagement. Technology can be used to simulate statistical concepts, create interactive learning exercises, and illustrate real world applications of statistics. The present study aimed to better understand the…

  5. Statistical Thermodynamics and Microscale Thermophysics

    Science.gov (United States)

    Carey, Van P.

    1999-08-01

    Many exciting new developments in microscale engineering are based on the application of traditional principles of statistical thermodynamics. In this text Van Carey offers a modern view of thermodynamics, interweaving classical and statistical thermodynamic principles and applying them to current engineering systems. He begins with coverage of microscale energy storage mechanisms from a quantum mechanics perspective and then develops the fundamental elements of classical and statistical thermodynamics. Subsequent chapters discuss applications of equilibrium statistical thermodynamics to solid, liquid, and gas phase systems. The remainder of the book is devoted to nonequilibrium thermodynamics of transport phenomena and to nonequilibrium effects and noncontinuum behavior at the microscale. Although the text emphasizes mathematical development, Carey includes many examples and exercises to illustrate how the theoretical concepts are applied to systems of scientific and engineering interest. In the process he offers a fresh view of statistical thermodynamics for advanced undergraduate and graduate students, as well as practitioners, in mechanical, chemical, and materials engineering.

  6. The Big Mac Standard: A statistical Illustration

    OpenAIRE

    Yukinobu Kitamura; Hiroshi Fujiki

    2004-01-01

    We demonstrate a statistical procedure for selecting the most suitable empirical model to test an economic theory, using the example of the test for purchasing power parity based on the Big Mac Index. Our results show that supporting evidence for purchasing power parity, conditional on the Balassa-Samuelson effect, depends crucially on the selection of models, sample periods and economies used for estimations.

  7. Applying Statistical Process Control to Clinical Data: An Illustration.

    Science.gov (United States)

    Pfadt, Al; And Others

    1992-01-01

    Principles of statistical process control are applied to a clinical setting through the use of control charts to detect changes, as part of treatment planning and clinical decision-making processes. The logic of control chart analysis is derived from principles of statistical inference. Sample charts offer examples of evaluating baselines and…
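
The control-chart logic the authors apply can be sketched as follows; the baseline phase and follow-up observations are hypothetical clinical measurements, not data from the study.

```python
import statistics

def control_limits(baseline, sigma_mult=3.0):
    """Center line and 3-sigma control limits estimated from a baseline
    phase, as in a Shewhart individuals chart."""
    center = statistics.mean(baseline)
    spread = statistics.stdev(baseline)
    return center - sigma_mult * spread, center, center + sigma_mult * spread

def out_of_control(values, lcl, ucl):
    """Indices of observations falling outside the control limits,
    signaling a change worth clinical attention."""
    return [i for i, v in enumerate(values) if not lcl <= v <= ucl]

# Hypothetical baseline phase and three follow-up observations.
baseline = [10, 12, 11, 13, 12, 11, 12]
lcl, center, ucl = control_limits(baseline)
flags = out_of_control([11.0, 16.0, 12.0], lcl, ucl)
```

Only the second follow-up observation exceeds the upper limit, so only it would be flagged as a signal rather than common-cause variation.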

  8. Statistical Validation of Engineering and Scientific Models: Background

    International Nuclear Information System (INIS)

    Hills, Richard G.; Trucano, Timothy G.

    1999-01-01

    A tutorial is presented discussing the basic issues associated with propagation of uncertainty analysis and statistical validation of engineering and scientific models. The propagation of uncertainty tutorial illustrates the use of the sensitivity method and the Monte Carlo method to evaluate the uncertainty in predictions for linear and nonlinear models. Four example applications are presented: a linear model, a model for the behavior of a damped spring-mass system, a transient thermal conduction model, and a nonlinear transient convective-diffusive model based on Burgers' equation. Correlated and uncorrelated model input parameters are considered. The model validation tutorial builds on the material presented in the propagation of uncertainty tutorial and uses the damped spring-mass system as the example application. The validation tutorial illustrates several concepts associated with the application of statistical inference to test model predictions against experimental observations. Several validation methods are presented, including error band based, multivariate, sum of squares of residuals, and optimization methods. After completion of the tutorial, a survey of statistical model validation literature is presented and recommendations for future work are made
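
The Monte Carlo method for propagation of uncertainty can be sketched with a spring-mass example; the model here is just the undamped natural frequency ω = sqrt(k/m), with assumed Gaussian uncertainties on stiffness and mass (all values are illustrative, not taken from the tutorial).

```python
import math
import random
import statistics

def natural_frequency(k, m):
    """Undamped natural frequency (rad/s) of a spring-mass system."""
    return math.sqrt(k / m)

def monte_carlo_uncertainty(n=20000, seed=1):
    """Propagate assumed Gaussian uncertainty in stiffness k (N/m) and
    mass m (kg) through the model by simple random sampling."""
    rng = random.Random(seed)
    samples = [natural_frequency(rng.gauss(100.0, 5.0), rng.gauss(1.0, 0.02))
               for _ in range(n)]
    return statistics.mean(samples), statistics.stdev(samples)

omega_mean, omega_sd = monte_carlo_uncertainty()
```

A first-order sensitivity analysis predicts a standard deviation near sqrt((0.05 * 5)**2 + (5 * 0.02)**2) ≈ 0.27 rad/s, which the sampling estimate should reproduce.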

  9. Categorical and nonparametric data analysis choosing the best statistical technique

    CERN Document Server

    Nussbaum, E Michael

    2014-01-01

    Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class-tested at the University of Nevada, the book's clear explanations of the underlying assumptions, computer simulations, and Exploring the Concept boxes help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain…

  10. Nonadditive entropy and nonextensive statistical mechanics - Some central concepts and recent applications

    International Nuclear Information System (INIS)

    Tsallis, Constantino; Tirnakli, Ugur

    2010-01-01

    We briefly review central concepts concerning nonextensive statistical mechanics, based on the nonadditive entropy. Among others, we focus on possible realizations of the q-generalized Central Limit Theorem, including at the edge of chaos of the logistic map, and for quasi-stationary states of many-body long-range-interacting Hamiltonian systems.
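
The nonadditivity that gives the entropy its name can be checked directly: for two independent systems A and B, S_q(A, B) = S_q(A) + S_q(B) + (1 − q) S_q(A) S_q(B). A minimal numerical check with illustrative distributions:

```python
def tsallis_entropy(probs, q):
    """Nonadditive (Tsallis) entropy S_q = (1 - sum_i p_i**q) / (q - 1)."""
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

# Two independent systems with illustrative probability distributions.
q = 2.0
a, b = [0.5, 0.5], [0.25, 0.75]
joint = [pa * pb for pa in a for pb in b]

# Pseudo-additivity: S_q(A, B) = S_q(A) + S_q(B) + (1 - q) S_q(A) S_q(B).
lhs = tsallis_entropy(joint, q)
rhs = (tsallis_entropy(a, q) + tsallis_entropy(b, q)
       + (1.0 - q) * tsallis_entropy(a, q) * tsallis_entropy(b, q))
```

At q = 1 the extra cross term vanishes and the ordinary additive Shannon entropy is recovered in the limit.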

  11. Statistical experimental design for refractory coatings

    International Nuclear Information System (INIS)

    McKinnon, J.A.; Standard, O.C.

    2000-01-01

    The production of refractory coatings on metal casting moulds is critically dependent on the development of suitable rheological characteristics, such as viscosity and thixotropy, in the initial coating slurry. In this paper, the basic concepts of mixture design and analysis are applied to the formulation of a refractory coating, with illustration by a worked example. Experimental data of coating viscosity versus composition are fitted to a statistical model to obtain a reliable method of predicting the optimal formulation of the coating. Copyright (2000) The Australian Ceramic Society

  12. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...

  13. The "Core Concepts Plus" Paradigm for Creating an Electronic Textbook for Introductory Business and Economic Statistics

    Science.gov (United States)

    Haley, M. Ryan

    2013-01-01

    This paper describes a flexible paradigm for creating an electronic "Core Concepts Plus" textbook (CCP-text) for a course in Introductory Business and Economic Statistics (IBES). In general terms, "core concepts" constitute the intersection of IBES course material taught by all IBES professors at the author's university. The…

  14. Distribution effects of electricity tax illustrated by different distribution concepts

    International Nuclear Information System (INIS)

    Halvorsen, Bente; Larsen, Bodil M.; Nesbakken, Runa

    2001-01-01

    This study demonstrates how the choice of distribution concept affects analyses of the distributional effects of an electricity tax. By distribution effects we mean changes in households' life circumstances. The focus is on different income concepts. Income is an important element in the life circumstances of households. The distribution effects are studied with respect to general income before and after tax, pensionable earnings before and after tax, and total consumption expenditure. The authors study how increased electricity expenses caused by a proportional increase in the electricity tax affect households in various income groups. It is found that the burden of such a tax increase, measured by the budget share set aside for electricity, decreases with income no matter which distribution concept is used. By calculating measures of inequality for income minus electricity tax before and after the tax increase, it is found that the measures of inequality depend significantly on the choice of distribution concept

  15. Longitudinal Data Analyses Using Linear Mixed Models in SPSS: Concepts, Procedures and Illustrations

    Directory of Open Access Journals (Sweden)

    Daniel T. L. Shek

    2011-01-01

    Although different methods are available for the analyses of longitudinal data, analyses based on generalized linear models (GLM) are criticized as violating the assumption of independence of observations. Alternatively, linear mixed models (LMM) are commonly used to understand changes in human behavior over time. In this paper, the basic concepts surrounding LMM (or hierarchical linear models) are outlined. Although SPSS is a statistical analyses package commonly used by researchers, documentation on LMM procedures in SPSS is not thorough or user friendly. With reference to this limitation, the related procedures for performing analyses based on LMM in SPSS are described. To demonstrate the application of LMM analyses in SPSS, findings based on six waves of data collected in the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong are presented.

  16. Longitudinal data analyses using linear mixed models in SPSS: concepts, procedures and illustrations.

    Science.gov (United States)

    Shek, Daniel T L; Ma, Cecilia M S

    2011-01-05

    Although different methods are available for the analyses of longitudinal data, analyses based on generalized linear models (GLM) are criticized as violating the assumption of independence of observations. Alternatively, linear mixed models (LMM) are commonly used to understand changes in human behavior over time. In this paper, the basic concepts surrounding LMM (or hierarchical linear models) are outlined. Although SPSS is a statistical analyses package commonly used by researchers, documentation on LMM procedures in SPSS is not thorough or user friendly. With reference to this limitation, the related procedures for performing analyses based on LMM in SPSS are described. To demonstrate the application of LMM analyses in SPSS, findings based on six waves of data collected in the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong are presented.
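
The variance structure a random-intercept LMM captures can be illustrated with a moment-based sketch (not the REML machinery SPSS actually uses); the simulated data below stand in for balanced repeated measures on subjects, with all parameter values hypothetical.

```python
import random
import statistics

def variance_components(data):
    """Moment estimates of within-subject (residual) and between-subject
    (random-intercept) variance for balanced repeated-measures data --
    the variance structure a random-intercept LMM models."""
    n = len(data[0])  # observations per subject
    subject_means = [statistics.mean(subject) for subject in data]
    within = statistics.mean(statistics.variance(subject) for subject in data)
    # Var(subject mean) = between + within / n, so subtract the within share.
    between = max(statistics.variance(subject_means) - within / n, 0.0)
    return within, between

# Simulated data: 300 subjects, 10 waves each, random intercepts with
# variance 4 and residual variance 1.
rng = random.Random(42)
data = []
for _ in range(300):
    intercept = rng.gauss(0.0, 2.0)
    data.append([intercept + rng.gauss(0.0, 1.0) for _ in range(10)])
within_var, between_var = variance_components(data)
```

Treating such clustered observations as independent (the GLM criticism raised in the abstract) would ignore the large between-subject component entirely.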

  17. Roles of Illustrators in Visual Communication of Scientific Knowledge

    Directory of Open Access Journals (Sweden)

    Kana Okawa

    2011-10-01

    Scientific knowledge is the knowledge accumulated by systematic studies and organized by general principles. Visual, verbal, numeric, and other types of representation are used to communicate scientific knowledge. Scientific illustration is the visual representation of objects and concepts in order to record and to convey scientific knowledge (Ford, 1993). There have been some discussions of scientific illustrations in the history, philosophy and sociology of science (Burri & Dumit, 2008), but little has been done on the creation of scientific illustrations by illustrators. This study focuses on the creation of scientific illustrations by illustrators. The purpose is to show how illustrators create the visual messages in communications of scientific knowledge. Through analysis of semi-structured interviews with six professional illustrators, creators and art directors, it is shown that illustrators select and edit scientific information, add non-scientific information, and organize information into one visual representation of scientific knowledge. The implications of this research provide a new perspective on multisensory communication of scientific knowledge.

  18. Statistical Methodologies to Integrate Experimental and Computational Research

    Science.gov (United States)

    Parker, P. A.; Johnson, R. T.; Montgomery, D. C.

    2008-01-01

    Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.
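
A basic building block of the design-of-experiments methods mentioned above is the full factorial design; a minimal generator is sketched below (the factor names are illustrative, not taken from the jet experiment).

```python
from itertools import product

def full_factorial(levels):
    """Enumerate every run of a full factorial design. `levels` maps a
    factor name to its coded settings; two-level coded factors (-1, +1)
    are used here purely as an illustration."""
    names = sorted(levels)
    return [dict(zip(names, combo))
            for combo in product(*(levels[n] for n in names))]

runs = full_factorial({
    "pressure": (-1, 1),
    "temperature": (-1, 1),
    "velocity": (-1, 1),
})
```

The resulting 2^3 = 8 runs are balanced and mutually orthogonal, which is what lets main effects and interactions be estimated independently.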

  19. Conjugate pair of non-extensive statistics in quantum scattering

    International Nuclear Information System (INIS)

    Ion, D.B.; Ion, M.L.D.

    1999-01-01

    In this paper, by defining the Fourier transform of the scattering amplitudes as a bounded linear mapping from the space L^(2p) to the space L^(2q) when 1/(2p) + 1/(2q) = 1, we introduce a new concept in quantum physics in terms of the Tsallis-like entropies S_J(p) and S_θ(q), namely, that of a conjugate pair of non-extensive statistics. This new concept is experimentally illustrated using 88 + 49 sets of pion-nucleon and pion-nucleus phase shifts. From the experimental determination of the (p, q) non-extensivity indices, obtained by choosing the pairs for which the [χ_J²(p) + χ_θ²(q_min)]-optimal test function is minimal, we get the conjugate pair of [(p_min, J), (q_min, θ)] non-extensive statistics with 0.50 ≤ p_min ≤ 0.60. This new non-extensive statistical effect is experimentally evidenced with a high degree of accuracy (CL ≥ 99%). Moreover, it is worth mentioning that the modification of the statistics has been more efficient than the modification of the PMD-SQS optimum principle in obtaining the best overall fit to the experimental data. (authors)
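
The conjugacy condition 1/(2p) + 1/(2q) = 1 fixes q once p is known: rearranging gives q = p/(2p − 1), so, for example, p = 0.55 in the reported range corresponds to q = 5.5. A small helper making this explicit:

```python
def conjugate_index(p):
    """Solve the conjugacy condition 1/(2p) + 1/(2q) = 1 for q.

    Rearranging, 1/(2q) = (2p - 1)/(2p), so q = p / (2p - 1); a finite
    positive q requires p > 1/2."""
    if p <= 0.5:
        raise ValueError("p must exceed 1/2")
    return p / (2.0 * p - 1.0)
```

Note how q grows without bound as p approaches 1/2 from above, which is why the reported window 0.50 ≤ p_min ≤ 0.60 pins down strongly non-extensive conjugate indices.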

  20. Rigorous force field optimization principles based on statistical distance minimization

    Energy Technology Data Exchange (ETDEWEB)

    Vlcek, Lukas, E-mail: vlcekl1@ornl.gov [Chemical Sciences Division, Geochemistry & Interfacial Sciences Group, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6110 (United States); Joint Institute for Computational Sciences, University of Tennessee, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6173 (United States); Chialvo, Ariel A. [Chemical Sciences Division, Geochemistry & Interfacial Sciences Group, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6110 (United States)

    2015-10-14

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. We exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.

  1. Interactions among Knowledge, Beliefs, and Goals in Framing a Qualitative Study in Statistics Education

    Science.gov (United States)

    Groth, Randall E.

    2010-01-01

    In the recent past, qualitative research methods have become more prevalent in the field of statistics education. This paper offers thoughts on the process of framing a qualitative study by means of an illustrative example. The decisions that influenced the framing of a study of pre-service teachers' understanding of the concept of statistical…

  2. Understanding statistical concepts using S-PLUS

    CERN Document Server

    Schumacker, Randall E

    2001-01-01

    Written as a supplemental text for an introductory or intermediate statistics course, this book is organized along the lines of many popular statistics texts. The chapters provide a good conceptual understanding of basic statistics and include exercises that use S-PLUS simulation programs. Each chapter lists a set of objectives and a summary. The book offers a rich insight into how probability has shaped statistical procedures in the behavioral sciences, as well as a brief history behind the creation of various statistics. Computational skills are kept to a minimum by including S-PLUS programs…

  3. Infusion of Quantitative and Statistical Concepts into Biology Courses Does Not Improve Quantitative Literacy

    Science.gov (United States)

    Beck, Christopher W.

    2018-01-01

    Multiple national reports have pushed for the integration of quantitative concepts into the context of disciplinary science courses. The aim of this study was to evaluate the quantitative and statistical literacy of biology students and explore learning gains when those skills were taught implicitly in the context of biology. I examined gains in…

  4. Statistical aspects of essential derivation, with illustrations based on lettuce and barley

    NARCIS (Netherlands)

    Eeuwijk, van F.A.; Law, J.R.

    2004-01-01

    The concept of essential derivation was introduced by UPOV in 1991 to refine the scope of breeders' rights. The intention of the essential derivation concept was to confer breeders protection against fraudulent practices in which `new¿ varieties are produced from current, protected ones without a…

  5. Visualization and modeling of sub-populations of compositional data: statistical methods illustrated by means of geochemical data from fumarolic fluids

    Science.gov (United States)

    Pawlowsky-Glahn, Vera; Buccianti, Antonella

    In the investigation of fluid samples of a volcanic system, collected during a given period of time, one of the main goals is to discover cause-effect relationships that allow us to explain changes in the chemical composition. They might be caused by physicochemical factors, such as temperature, pressure, or non-conservative behavior of some chemical constituents (addition or subtraction of material), among others. The presence of subgroups of observations showing different behavior is evidence of unusually complex situations, which might render the analysis and interpretation of observed phenomena even more difficult. Such cases require appropriate statistical techniques as well as sound a priori hypotheses concerning the underlying geological processes. The purpose of this article is to present the state of the art in the methodology for better visualization of compositional data, as well as for detecting statistically significant sub-populations. The scheme of this article is to present first the application and then the underlying methodology, with the aim of the first motivating the second. Thus, the first part has the goal of illustrating how to understand and interpret results, whereas the second is devoted to showing how to perform a study of this kind. The case study relates to the chemical composition of a fumarole of Vulcano Island (southern Italy), called F14. The volcanic activity at Vulcano Island has been subject to a continuous program of geochemical surveillance from 1978 up to now, and the large data set of observations contains the main chemical composition of volcanic gases as well as trace element concentrations in the condensates of fumarolic gases. Out of the complete set of measured components, the variables H2S, HF and As, determined in samples collected from 1978 to 1993 (As is not available in recent samples), are used to characterize two groups in the original population, which proved to be statistically distinct. The choice of the variables is…

  6. Introduction to mathematical statistical physics

    CERN Document Server

    Minlos, R A

    1999-01-01

    This book presents a mathematically rigorous approach to the main ideas and phenomena of statistical physics. The introduction addresses the physical motivation, focussing on the basic concept of modern statistical physics, that is the notion of Gibbsian random fields. Properties of Gibbsian fields are analyzed in two ranges of physical parameters: "regular" (corresponding to high-temperature and low-density regimes) where no phase transition is exhibited, and "singular" (low temperature regimes) where such transitions occur. Next, a detailed approach to the analysis of the phenomena of phase transitions of the first kind, the Pirogov-Sinai theory, is presented. The author discusses this theory in a general way and illustrates it with the example of a lattice gas with three types of particles. The conclusion gives a brief review of recent developments arising from this theory. The volume is written for the beginner, yet advanced students will benefit from it as well. The book will serve nicely as a supplement...

  7. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses, and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population, are explained. The text the...

  8. A Divergence Statistics Extension to VTK for Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bennett, Janine Camille [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    This report follows the series of previous documents [PT08, BPRT09b, PT09, BPT09, PT10, PB13], where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
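The report's own examples are C++ snippets against the VTK engine; as an illustration of the underlying idea only, here is a minimal Python sketch of one common divergence, Kullback-Leibler, between an observed histogram and a theoretical "ideal" distribution. The function name and example data are invented, and this is not the VTK interface:

```python
import numpy as np

def kl_divergence(observed_counts, expected_probs):
    """Kullback-Leibler divergence D(P || Q) of the observed empirical
    distribution P (histogram counts) from the theoretical distribution Q
    (bin probabilities, assumed positive on the support of P)."""
    p = np.asarray(observed_counts, dtype=float)
    p = p / p.sum()                       # normalise counts to frequencies
    q = np.asarray(expected_probs, dtype=float)
    mask = p > 0                          # 0 * log(0) is taken as 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Observed rolls of a six-sided die vs. the "ideal" uniform distribution
counts = [18, 20, 22, 19, 21, 20]         # 120 rolls, close to uniform
uniform = [1 / 6] * 6
print(kl_divergence(counts, uniform))     # small: close to the ideal
print(kl_divergence([0, 0, 0, 0, 0, 120], uniform))  # log(6): far from it
```

Like a distance, the divergence is zero when the empirical distribution matches the ideal and grows with the discrepancy, though it is not symmetric.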

  9. Introduction to probability and statistics for ecosystem managers simulation and resampling

    CERN Document Server

    Haas, Timothy C

    2013-01-01

    Explores computer-intensive probability and statistics for ecosystem management decision making Simulation is an accessible way to explain probability and stochastic model behavior to beginners. This book introduces probability and statistics to future and practicing ecosystem managers by providing a comprehensive treatment of these two areas. The author presents a self-contained introduction for individuals involved in monitoring, assessing, and managing ecosystems and features intuitive, simulation-based explanations of probabilistic and statistical concepts. Mathematical programming details are provided for estimating ecosystem model parameters with Minimum Distance, a robust and computer-intensive method. The majority of examples illustrate how probability and statistics can be applied to ecosystem management challenges. There are over 50 exercises - making this book suitable for a lecture course in a natural resource and/or wildlife management department, or as the main text in a program of self-stud...

  10. Google SketchUp Workshop Modeling, Visualizing, and Illustrating

    CERN Document Server

    Brixius, Laurent

    2010-01-01

    Discover the secrets of Google SketchUp with 16 real-world professional-level projects including parks, structures, concept art, and illustration. Google SketchUp Workshop covers the wide variety of projects that SketchUp can be used for: architectural visualization, landscape design, video game and film conception, and more. SketchUp masters in every field will get you up to speed in this agile and intuitive software and then show you its real uses through projects in architecture, engineering, and design. * Packed with 16 real-world Go...

  11. Unders and Overs: Using a Dice Game to Illustrate Basic Probability Concepts

    Science.gov (United States)

    McPherson, Sandra Hanson

    2015-01-01

    In this paper, the dice game "Unders and Overs" is described and presented as an active-learning exercise to introduce basic probability concepts. The implementation of the exercise is outlined and the resulting presentation of various probability concepts is described.
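Assuming the usual rules of Unders and Overs (players bet on whether the sum of two dice will be under 7, exactly 7, or over 7; the abstract does not spell them out), the basic probabilities the game illustrates can be enumerated exactly:

```python
from fractions import Fraction
from itertools import product

# Enumerate all 36 equally likely outcomes for the sum of two fair dice
sums = [a + b for a, b in product(range(1, 7), repeat=2)]

p_under = Fraction(sum(s < 7 for s in sums), len(sums))
p_seven = Fraction(sum(s == 7 for s in sums), len(sums))
p_over = Fraction(sum(s > 7 for s in sums), len(sums))

print(p_under, p_seven, p_over)  # 5/12 1/6 5/12
```

The asymmetry between the 1/6 chance of "exactly 7" and the 5/12 chances of the other two bets is the hook that typically drives the classroom discussion of expected value.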

  12. Thermal equilibrium and statistical thermometers in special relativity.

    Science.gov (United States)

    Cubero, David; Casado-Pascual, Jesús; Dunkel, Jörn; Talkner, Peter; Hänggi, Peter

    2007-10-26

    There is an intense debate in the recent literature about the correct generalization of Maxwell's velocity distribution in special relativity. The most frequently discussed candidate distributions include the Jüttner function as well as modifications thereof. Here we report results from fully relativistic one-dimensional molecular dynamics simulations that resolve the ambiguity. The numerical evidence unequivocally favors the Jüttner distribution. Moreover, our simulations illustrate that the concept of "thermal equilibrium" extends naturally to special relativity only if a many-particle system is spatially confined. They make evident that "temperature" can be statistically defined and measured in an observer frame independent way.

  13. Scan Statistics

    CERN Document Server

    Glaz, Joseph

    2009-01-01

    Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.

  14. Statistics and data analysis for financial engineering with R examples

    CERN Document Server

    Ruppert, David

    2015-01-01

    The new edition of this influential textbook, geared towards graduate or advanced undergraduate students, teaches the statistics necessary for financial engineering. In doing so, it illustrates concepts using financial markets and economic data, R Labs with real-data exercises, and graphical and analytic methods for modeling and diagnosing modeling errors. Financial engineers now have access to enormous quantities of data. To make use of these data, the powerful methods in this book, particularly about volatility and risks, are essential. Strengths of this fully-revised edition include major additions to the R code and the advanced topics covered. Individual chapters cover, among other topics, multivariate distributions, copulas, Bayesian computations, risk management, multivariate volatility and cointegration. Suggested prerequisites are basic knowledge of statistics and probability, matrices and linear algebra, and calculus. There is an appendix on probability, statistics and linear algebra. Practicing fina...

  15. GALEX-SDSS CATALOGS FOR STATISTICAL STUDIES

    International Nuclear Information System (INIS)

    Budavari, Tamas; Heinis, Sebastien; Szalay, Alexander S.; Nieto-Santisteban, Maria; Bianchi, Luciana; Gupchup, Jayant; Shiao, Bernie; Smith, Myron; Chang Ruixiang; Kauffmann, Guinevere; Morrissey, Patrick; Wyder, Ted K.; Martin, D. Christopher; Barlow, Tom A.; Forster, Karl; Friedman, Peter G.; Schiminovich, David; Milliard, Bruno; Donas, Jose; Seibert, Mark

    2009-01-01

    We present a detailed study of the Galaxy Evolution Explorer's (GALEX) photometric catalogs with special focus on the statistical properties of the All-sky and Medium Imaging Surveys. We introduce the concept of primaries to resolve the issue of multiple detections and follow a geometric approach to define clean catalogs with well understood selection functions. We cross-identify the GALEX sources (GR2+3) with Sloan Digital Sky Survey (SDSS; DR6) observations, which indirectly provides an invaluable insight into the astrometric model of the UV sources and allows us to revise the band merging strategy. We derive the formal description of the GALEX footprints as well as their intersections with the SDSS coverage along with analytic calculations of their areal coverage. The crossmatch catalogs are made available for the public. We conclude by illustrating the implementation of typical selection criteria in SQL for catalog subsets geared toward statistical analyses, e.g., correlation and luminosity function studies.

  16. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  17. Utilizing a Simulation Exercise to Illustrate Critical Inventory Management Concepts

    Science.gov (United States)

    Umble, Elisabeth; Umble, Michael

    2013-01-01

    Most undergraduate business students simply do not appreciate the elegant mathematical beauty of inventory models. So how does an instructor capture students' interest and keep them engaged in the learning process when teaching inventory management concepts? This paper describes a competitive and energizing in-class simulation game that introduces…

  18. The Importance of Integrating Clinical Relevance and Statistical Significance in the Assessment of Quality of Care--Illustrated Using the Swedish Stroke Register.

    Directory of Open Access Journals (Sweden)

    Anita Lindmark

    Full Text Available When profiling hospital performance, quality indicators are commonly evaluated through hospital-specific adjusted means with confidence intervals. When identifying deviations from a norm, large hospitals can have statistically significant results even for clinically irrelevant deviations, while important deviations in small hospitals can remain undiscovered. We have used data from the Swedish Stroke Register (Riksstroke) to illustrate the properties of a benchmarking method that integrates considerations of both clinical relevance and level of statistical significance. The performance measure used was case-mix adjusted risk of death or dependency in activities of daily living within 3 months after stroke. A hospital was labeled as having outlying performance if its case-mix adjusted risk exceeded a benchmark value with a specified statistical confidence level. The benchmark was expressed relative to the population risk and should reflect the clinically relevant deviation that is to be detected. A simulation study based on Riksstroke patient data from 2008-2009 was performed to investigate the effect of the choice of the statistical confidence level and benchmark value on the diagnostic properties of the method. Simulations were based on 18,309 patients in 76 hospitals. The widely used setting, comparing 95% confidence intervals to the national average, resulted in low sensitivity (0.252) and high specificity (0.991). There were large variations in sensitivity and specificity for different requirements of statistical confidence. Lowering statistical confidence improved sensitivity with a relatively smaller loss of specificity. Variations due to different benchmark values were smaller, especially for sensitivity. This allows the choice of a clinically relevant benchmark to be driven by clinical factors without major concerns about sufficiently reliable evidence. The study emphasizes the importance of combining clinical relevance and level of statistical...

  19. The Importance of Integrating Clinical Relevance and Statistical Significance in the Assessment of Quality of Care--Illustrated Using the Swedish Stroke Register.

    Science.gov (United States)

    Lindmark, Anita; van Rompaye, Bart; Goetghebeur, Els; Glader, Eva-Lotta; Eriksson, Marie

    2016-01-01

    When profiling hospital performance, quality indicators are commonly evaluated through hospital-specific adjusted means with confidence intervals. When identifying deviations from a norm, large hospitals can have statistically significant results even for clinically irrelevant deviations while important deviations in small hospitals can remain undiscovered. We have used data from the Swedish Stroke Register (Riksstroke) to illustrate the properties of a benchmarking method that integrates considerations of both clinical relevance and level of statistical significance. The performance measure used was case-mix adjusted risk of death or dependency in activities of daily living within 3 months after stroke. A hospital was labeled as having outlying performance if its case-mix adjusted risk exceeded a benchmark value with a specified statistical confidence level. The benchmark was expressed relative to the population risk and should reflect the clinically relevant deviation that is to be detected. A simulation study based on Riksstroke patient data from 2008-2009 was performed to investigate the effect of the choice of the statistical confidence level and benchmark value on the diagnostic properties of the method. Simulations were based on 18,309 patients in 76 hospitals. The widely used setting, comparing 95% confidence intervals to the national average, resulted in low sensitivity (0.252) and high specificity (0.991). There were large variations in sensitivity and specificity for different requirements of statistical confidence. Lowering statistical confidence improved sensitivity with a relatively smaller loss of specificity. Variations due to different benchmark values were smaller, especially for sensitivity. This allows the choice of a clinically relevant benchmark to be driven by clinical factors without major concerns about sufficiently reliable evidence. The study emphasizes the importance of combining clinical relevance and level of statistical confidence when...
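A toy sketch of the kind of threshold rule studied in this paper: a hospital is flagged when its event rate exceeds a benchmark at a given confidence level. An exact one-sided binomial test stands in for the authors' case-mix-adjusted model, and all counts below are invented; the sketch merely reproduces the qualitative point that the same relative deviation reaches statistical significance in a large hospital but not in a small one.

```python
from math import comb

def flag_outlier(events, n, benchmark, confidence=0.95):
    """Flag a provider whose event rate significantly exceeds `benchmark`,
    using an exact one-sided binomial test (a deliberate simplification:
    no case-mix adjustment, unlike the method in the paper)."""
    # p-value: P(X >= events) when X ~ Binomial(n, benchmark)
    p_value = sum(comb(n, k) * benchmark ** k * (1 - benchmark) ** (n - k)
                  for k in range(events, n + 1))
    return p_value < 1 - confidence

# The same 25% event rate against a 20% benchmark, two hospital sizes:
print(flag_outlier(events=30, n=120, benchmark=0.20))   # False: small hospital
print(flag_outlier(events=120, n=480, benchmark=0.20))  # True: large hospital
```

Raising the benchmark above the population risk, as the paper proposes, would keep a large hospital from being flagged for a clinically irrelevant excess.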

  20. On the use of statistical concepts in grand unified theories

    International Nuclear Information System (INIS)

    Dresden, M.

    1982-01-01

    The study raises the question of whether the use of traditional statistical-mechanical concepts is legitimate in the early epochs of the development of the universe (from about 10^-40 s after the big bang until about 10^-30 s). Several current procedures are examined in detail: the use of the equilibrium notion, the use of Boltzmann-like rate equations, and the use of ideas from the theory of phase transitions. It is stressed that from the general viewpoint of statistical mechanics there is no convincing evidence that dynamical systems described by spontaneously broken gauge theories necessarily approach equilibrium. Techniques are suggested whereby this question might be approached. It is noted that the usual treatment, starting from the assumption of a homogeneous, isotropic universe, is in principle incapable of discussing local non-equilibrium features; it is very questionable whether this assumption is valid for the epochs considered. Attention is called to the circumstance that if the phase-transition picture is taken literally, the presence of both fermions and bosons indicates that a consistent treatment requires the existence of a critical line T_c(xi), rather than a single critical temperature, where xi is the ratio of the Fermi to Bose concentrations. This might well alter the qualitative picture of successive stages in the development of the universe. (orig.)

  1. The concept of normality through history: a didactic review of features related to philosophy, statistics and medicine.

    Science.gov (United States)

    Conti, A A; Conti, A; Gensini, G F

    2006-09-01

    In medicine, normality characterises any qualitative or quantitative situation whose absence implies an illness or a state of abnormality. The illness concept was first a philosophical one. But the use of mathematics in the study of biological events, which began with Galton (1822-1911) and with Pearson (1857-1936), changed the frame of reference. In the second part of the 19th century, mathematics was used to study the distribution of some biological characteristics in the evolution of the species. Around 1900, statistics became the basis for the study of the diffusion of illnesses. Half a century later, statistics made possible the transition from the description of single cases to groups of cases. Even more important is the concept of "normality" in laboratory medicine. In this field the search for the "perfect norm" was, and possibly still is, under way. The widespread use of statistics in the laboratory has allowed the definition, in a certain sense, of a new normality. This is the reason why the term "reference value" has been introduced. However, even the introduction of this new term has merely shifted the problem, not resolved it.

  2. A statistical view of uncertainty in expert systems

    International Nuclear Information System (INIS)

    Spiegelhalter, D.J.

    1986-01-01

    The constructors of expert systems interpret ''uncertainty'' in a wide sense and have suggested a variety of qualitative and quantitative techniques for handling the concept, such as the theory of ''endorsements,'' fuzzy reasoning, and belief functions. After a brief selective review of procedures that do not adhere to the laws of probability, it is argued that a subjectivist Bayesian view of uncertainty, if flexibly applied, can provide many of the features demanded by expert systems. This claim is illustrated with a number of examples of probabilistic reasoning, and a connection drawn with statistical work on the graphical representation of multivariate distributions. Possible areas of future research are outlined

  3. Illustrating Mathematics using 3D Printers

    OpenAIRE

    Knill, Oliver; Slavkovsky, Elizabeth

    2013-01-01

    3D printing technology can help to visualize proofs in mathematics. In this document we aim to illustrate how 3D printing can help to visualize concepts and mathematical proofs. As already known to educators in ancient Greece, models allow one to bring mathematics closer to the public. The new 3D printing technology makes the realization of such tools more accessible than ever. This is an updated version of a paper included in the book Low-Cost 3D Printing for Science, Education and Sustainable Devel...

  4. Enhancing Motivation and Acquisition of Coordinate Concepts by Using Concept Trees.

    Science.gov (United States)

    Hirumi, Atsusi; Bowers, Dennis R.

    1991-01-01

    Examines the effects of providing undergraduate learners with graphic illustrations of coordinate concept relationships to supplement text-based instruction. Half of those reading a specific passage received a graphic concept tree. That group outperformed those who did not, reporting significantly higher amounts of attention, confidence, and…

  5. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters covering: the basic concepts and meaning of statistical thermodynamics; Maxwell-Boltzmann statistics; ensembles; thermodynamic functions and fluctuations; statistical dynamics of independent-particle systems; ideal molecular systems; chemical equilibrium and chemical reaction rates in ideal gas mixtures; classical statistical thermodynamics; the ideal lattice model; lattice statistics and non-ideal lattice models; imperfect-gas theory of liquids; the theory of solutions; the statistical thermodynamics of interfaces; the statistical thermodynamics of macromolecular systems; and quantum statistics.

  6. Generalized correlation integral vectors: A distance concept for chaotic dynamical systems

    Energy Technology Data Exchange (ETDEWEB)

    Haario, Heikki, E-mail: heikki.haario@lut.fi [School of Engineering Science, Lappeenranta University of Technology, Lappeenranta (Finland); Kalachev, Leonid, E-mail: KalachevL@mso.umt.edu [Department of Mathematical Sciences, University of Montana, Missoula, Montana 59812-0864 (United States); Hakkarainen, Janne [Earth Observation Unit, Finnish Meteorological Institute, Helsinki (Finland)

    2015-06-15

    Several concepts of fractal dimension have been developed to characterise properties of attractors of chaotic dynamical systems. Numerical approximations of them must be calculated from finite samples of simulated trajectories. In principle, the quantities should not depend on the choice of the trajectory, as long as it provides properly distributed samples of the underlying attractor. In practice, however, the trajectories are sensitive to varying initial values, small changes of the model parameters, the choice of solver, numerical tolerances, etc. The purpose of this paper is to present a statistically sound approach to quantify this variability. We modify the concept of the correlation integral to produce a vector that summarises the variability at all selected scales. The distribution of this stochastic vector can be estimated, and it provides a statistical distance concept between trajectories. Here, we demonstrate the use of the distance for the purpose of estimating model parameters of a chaotic dynamic model. The methodology is illustrated using computational examples for the Lorenz 63 and Lorenz 95 systems, together with a framework for Markov chain Monte Carlo sampling to produce posterior distributions of model parameters.
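A minimal sketch of the correlation integral the paper builds on (the classical Grassberger-Procaccia estimator), evaluated at several radii to form a vector. The data here are random stand-in points rather than a chaotic trajectory, and the paper's statistical machinery on top of this quantity is omitted:

```python
import numpy as np

def correlation_integral_vector(points, radii):
    """Grassberger-Procaccia correlation integral C(r), evaluated at a
    vector of radii r: the fraction of distinct point pairs closer than r."""
    x = np.asarray(points, dtype=float)
    # All pairwise Euclidean distances, keeping each pair (i < j) once
    diff = x[:, None, :] - x[None, :, :]
    dists = np.sqrt((diff ** 2).sum(axis=-1))
    pair_d = dists[np.triu_indices(len(x), k=1)]
    return np.array([(pair_d < r).mean() for r in radii])

rng = np.random.default_rng(0)
points = rng.normal(size=(200, 3))      # stand-in for a sampled trajectory
radii = np.array([0.5, 1.0, 2.0, 4.0])
c = correlation_integral_vector(points, radii)
print(c)  # non-decreasing in r, approaching 1 for large r
```

Computing this vector for many perturbed trajectories gives a sample of the stochastic vector whose distribution the authors estimate to obtain their distance.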

  7. New approaches to the formation of concepts in the school course «Socio-economic geography of the world»

    Directory of Open Access Journals (Sweden)

    Микола Чишкала

    2016-10-01

    Full Text Available The article analyzes the problem of concept formation in the course «Socio-economic geography of the world». For the formation of economic-geographical concepts in the 10th form we propose various learning tools and specific approaches. The 1st approach is the formation of concepts with the help of maps. Maps are models that reflect economic-geographical facts in space. Statistical data plotted on a map are an effective tool in the formation of economic-geographical concepts and in the analysis of the concrete situation in a country or the world. With the help of maps we propose to form the concepts of megalopolis, urban agglomeration, population reproduction, demographic transition, migration, urbanization, and the like. The 2nd approach is the formation of concepts on the basis of charts and diagrams. A diagram is evidence of the dominance of one process or object over another. We propose to use diagrams for the formation of concepts about the occupational structure of the population, the sex-age composition of the population, exports, imports, employment, etc. A graph serves to interpret digital data, to compare and group objects and processes, and to illustrate the dynamics of indicators. With the help of graphs we propose to reflect the dynamics of population, population explosion, migration, fertility, mortality, population density, etc. The 3rd approach is the formation of concepts on the basis of illustrations and schemes. With illustrations we propose to form the concepts of suburbanization, pseudo-urbanization, agriculture, and so on. Schematic images are effective in constructing concepts about the geographical division of labour, depopulation, the surrounding geo-environment, monitoring, etc. The 4th approach is the formation of concepts with the help of tables. Tables are a transitional link between text and graphic images. Using tables we can form such concepts as: republic, monarchy, life...

  8. Selecting the most appropriate inferential statistical test for your quantitative research study.

    Science.gov (United States)

    Bettany-Saltikov, Josette; Whittaker, Victoria Jane

    2014-06-01

    To discuss the issues and processes relating to the selection of the most appropriate statistical test. A review of the basic research concepts together with a number of clinical scenarios is used to illustrate this. Quantitative nursing research generally features the use of empirical data which necessitates the selection of both descriptive and statistical tests. Different types of research questions can be answered by different types of research designs, which in turn need to be matched to a specific statistical test(s). Discursive paper. This paper discusses the issues relating to the selection of the most appropriate statistical test and makes some recommendations as to how these might be dealt with. When conducting empirical quantitative studies, a number of key issues need to be considered. Considerations for selecting the most appropriate statistical tests are discussed and flow charts provided to facilitate this process. When nursing clinicians and researchers conduct quantitative research studies, it is crucial that the most appropriate statistical test is selected to enable valid conclusions to be made. © 2013 John Wiley & Sons Ltd.
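The flow charts in this paper can be caricatured as a coarse lookup from research design to test. The function below is an invented illustration of textbook defaults only; real test selection must also verify distributional assumptions, as the paper stresses:

```python
def suggest_test(outcome, groups, paired=False):
    """Coarse textbook-default lookup in the spirit of the paper's flow
    charts. Real selection must also check assumptions (e.g. normality,
    homogeneity of variance) before trusting these defaults."""
    if outcome == "continuous":
        if groups == 2:
            return "paired t-test" if paired else "independent t-test"
        return "repeated-measures ANOVA" if paired else "one-way ANOVA"
    if outcome == "categorical":
        if groups == 2 and paired:
            return "McNemar's test"
        return "chi-squared test"
    raise ValueError("outcome must be 'continuous' or 'categorical'")

print(suggest_test("continuous", groups=2))   # independent t-test
print(suggest_test("categorical", groups=3))  # chi-squared test
```

When assumptions fail, the usual move down such a chart is to a non-parametric counterpart (e.g. Mann-Whitney U in place of the independent t-test).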

  9. Validating an Air Traffic Management Concept of Operation Using Statistical Modeling

    Science.gov (United States)

    He, Yuning; Davies, Misty Dawn

    2013-01-01

    Validating a concept of operation for a complex, safety-critical system (like the National Airspace System) is challenging because of the high dimensionality of the controllable parameters and the infinite number of states of the system. In this paper, we use statistical modeling techniques to explore the behavior of a conflict detection and resolution algorithm designed for the terminal airspace. These techniques predict the robustness of the system simulation to both nominal and off-nominal behaviors within the overall airspace. They can also be used to evaluate the output of the simulation against recorded airspace data. Additionally, the techniques carry with them a mathematical value of the worth of each prediction: a statistical uncertainty for any robustness estimate. Uncertainty Quantification (UQ) is the process of quantitative characterization and ultimately reduction of uncertainties in complex systems. UQ is important for understanding the influence of uncertainties on the behavior of a system and therefore is valuable for design, analysis, and verification and validation. In this paper, we apply advanced statistical modeling methodologies and techniques to an advanced air traffic management system, namely the Terminal Tactical Separation Assured Flight Environment (T-TSAFE). We show initial results for a parameter analysis and safety boundary (envelope) detection in the high-dimensional parameter space. For our boundary analysis, we developed a new sequential approach based upon the design of computer experiments, allowing us to incorporate knowledge from domain experts into our modeling and to determine the most likely boundary shapes and their parameters. We carried out the analysis on system parameters and describe an initial approach that will allow us to include time-series inputs, such as the radar track data, into the analysis.

  10. ON STATISTICALLY CONVERGENT IN FINITE DIMENSIONAL SPACES

    OpenAIRE

    GÜNCAN, Ayşe Nur

    2009-01-01

    Abstract: In this paper, the notion of statistical convergence, which was introduced by Steinhaus (1951), is studied in R^m, and some concepts and theorems whose statistical counterparts for real number sequences were previously given are carried over to R^m. In addition, the concepts of the statistical limit point and the statistical cluster point are given, and it is noted that these two concepts were shown not to be equivalent in Fridy's study of 1993. These concepts are given in R^m and the i...

  11. Visuanimation in statistics

    KAUST Repository

    Genton, Marc G.

    2015-04-14

    This paper explores the use of visualization through animations, coined visuanimation, in the field of statistics. In particular, it illustrates the embedding of animations in the paper itself and the storage of larger movies in the online supplemental material. We present results from statistics research projects using a variety of visuanimations, ranging from exploratory data analysis of image data sets to spatio-temporal extreme event modelling; these include a multiscale analysis of classification methods, the study of the effects of a simulated explosive volcanic eruption and an emulation of climate model output. This paper serves as an illustration of visuanimation for future publications in Stat. Copyright © 2015 John Wiley & Sons, Ltd.

  12. A Simple Inexpensive Procedure for Illustrating Some Principles of Tomography

    Science.gov (United States)

    Darvey, Ivan G.

    2013-01-01

    The experiment proposed here illustrates some concepts of tomography via a qualitative determination of the relative concentration of various dilutions of food dye without "a priori" knowledge of the concentration of each dye mixture. This is performed in a manner analogous to computed tomography (CT) scans. In order to determine the…

  13. The Extended Enterprise concept

    DEFF Research Database (Denmark)

    Larsen, Lars Bjørn; Vesterager, Johan; Gobbi, Chiara

    1999-01-01

    This paper provides an overview of the work that has been done regarding the Extended Enterprise concept in the Common Concept team of Globeman 21, including references to deliverables concerning the development of the Extended Enterprise concept. The first section presents the basic concept picture from Globeman21, which illustrates the Globeman21 way of realising the Extended Enterprise concept. The second section presents the Globeman21 EE concept in a life cycle perspective, which to a large extent is based on the thoughts and ideas behind GERAM (ISO/DIS 15704).

  14. Characterisation of contaminated metals using an advanced statistical toolbox - Geostatistical characterisation of contaminated metals: methodology and illustrations

    International Nuclear Information System (INIS)

    Larsson, Arne; Lidar, Per; Desnoyers, Yvon

    2014-01-01

    Radiological characterisation plays an important role in the process of recycling contaminated or potentially contaminated metals. It is a platform for planning, identification of the extent and nature of contamination, assessing potential risk impacts, cost estimation, radiation protection, management of material arising from decommissioning, and for the release of materials as well as the disposal of the generated secondary waste as radioactive waste. Key issues in radiological characterisation are the identification of objectives, development of a measurement and sampling strategy (probabilistic, judgmental or a combination thereof), knowledge management, traceability, and the recording and processing of obtained information. By applying an advanced combination of statistical and geostatistical methods in the concept, better performance can be achieved at a lower cost. This paper describes the benefits of using the available methods in the different stages of the characterisation, treatment and clearance processes, aiming for reliable results in line with the data quality objectives. (authors)

  15. Advancing Empirical Approaches to the Concept of Resilience: A Critical Examination of Panarchy, Ecological Information, and Statistical Evidence

    Directory of Open Access Journals (Sweden)

    Ali Kharrazi

    2016-09-01

    Despite its ambiguities, the concept of resilience is of critical importance to researchers, practitioners, and policy-makers in dealing with dynamic socio-ecological systems. In this paper, we critically examine three empirical approaches, (i) panarchy, (ii) ecological information-based network analysis, and (iii) statistical evidence of resilience, against three criteria determined for achieving a comprehensive understanding and application of this concept. These criteria are the ability: (1) to reflect a system's adaptability to shocks; (2) to integrate social and environmental dimensions; and (3) to evaluate system-level trade-offs. Our findings show that none of the three currently applied approaches is strong in handling all three criteria. Panarchy is strong in the first two criteria but has difficulty with normative trade-offs. The ecological information-based approach is strongest in evaluating trade-offs but relies on common dimensions that lead to over-simplifications in integrating the social and environmental dimensions. Statistical evidence provides suggestions that are simplest and easiest to act upon but are generally weak in all three criteria. This analysis confirms the value of these approaches in specific instances but also the need for further research in advancing empirical approaches to the concept of resilience.

  16. Computational statistics handbook with Matlab

    CERN Document Server

    Martinez, Wendy L

    2007-01-01

    Prefaces Introduction What Is Computational Statistics? An Overview of the Book Probability Concepts Introduction Probability Conditional Probability and Independence Expectation Common Distributions Sampling Concepts Introduction Sampling Terminology and Concepts Sampling Distributions Parameter Estimation Empirical Distribution Function Generating Random Variables Introduction General Techniques for Generating Random Variables Generating Continuous Random Variables Generating Discrete Random Variables Exploratory Data Analysis Introduction Exploring Univariate Data Exploring Bivariate and Trivariate Data Exploring Multidimensional Data Finding Structure Introduction Projecting Data Principal Component Analysis Projection Pursuit EDA Independent Component Analysis Grand Tour Nonlinear Dimensionality Reduction Monte Carlo Methods for Inferential Statistics Introduction Classical Inferential Statistics Monte Carlo Methods for Inferential Statist...

  17. The derivation and application of a risk related value of the spend for saving a statistical life

    International Nuclear Information System (INIS)

    Jackson, D; Stone, D; Butler, G G; McGlynn, G

    2004-01-01

    The concept of a risk related value of the spend for saving a statistical life (VSSSL) is advanced for use in cost-benefit studies across the power generation sector, and the nuclear industry in particular. For illustrative purposes, a best estimate VSSSL is set, based on HSE guidance, at £2M. Above a risk of 10^-3 y^-1 it is assumed that the VSSSL may approach this maximum sustainable value. As the risk reduces, so does the VSSSL. At a risk level of 10^-6 y^-1 a VSSSL of £0.5M is applied. For risks below 10^-9 y^-1 the value of further risk reduction approaches zero, although a nominal VSSSL of £10k is applied as a pragmatic way forward in this study. The implications of adopting this concept as an aid to decision making in determining the spend on radiological dose reduction measures are illustrated through a worked example with a banded approach to estimating collective dose
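
    As a rough sketch of how the banded VSSSL scale described above could be applied in a cost-benefit calculation, the following maps an annual risk level onto a VSSSL using the anchor values quoted in the abstract. The function name, the log-linear interpolation between bands, and the treatment of the band edges are illustrative assumptions, not the authors' method.

```python
import math

# Anchor points taken from the abstract; the log-linear interpolation
# between bands is an illustrative assumption, not the authors' method.
BANDS = [
    (1e-9, 10_000),     # nominal floor below 10^-9 per year
    (1e-6, 500_000),
    (1e-3, 2_000_000),  # maximum sustainable value
]

def vsssl(risk_per_year):
    """Illustrative risk-related value of the spend for saving a
    statistical life (GBP) for a given annual risk."""
    if risk_per_year >= BANDS[-1][0]:
        return BANDS[-1][1]
    if risk_per_year <= BANDS[0][0]:
        return BANDS[0][1]
    for (r_lo, v_lo), (r_hi, v_hi) in zip(BANDS, BANDS[1:]):
        if r_lo <= risk_per_year <= r_hi:
            # interpolate linearly in log10(risk) within the band
            t = (math.log10(risk_per_year) - math.log10(r_lo)) / (
                math.log10(r_hi) - math.log10(r_lo)
            )
            return v_lo + t * (v_hi - v_lo)
```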

  18. An introduction to medical statistics; Einfuehrung in die Medizinische Statistik

    Energy Technology Data Exchange (ETDEWEB)

    Hilgers, R.D. [Technische Hochschule Aachen (Germany). Inst. fuer Medizinische Statistik; Bauer, P.; Scheiber, V. [Wien Univ. (Austria). Inst. fuer Medizinische Statistik; Heitmann, K.U. [Koeln Univ. (Germany). Inst. fuer Medizinische Statistik, Informatik und Epidemiologie

    2002-07-01

    This textbook teaches all aspects and methods of biometrics as a field of concentration in medical education. Instrumental interpretations of the theory, concepts and terminology of medical statistics are enhanced by numerous illustrations and examples. With problems, questions and answers. (orig./CB) [German] The book provides a systematic and comprehensive introduction to the statistical methods commonly used in medicine and to their terminology. It covers both the current and the forthcoming subject catalogue for biometrics in medical education. The presentation of the theoretical concepts is illustrated by numerous figures and medical examples. Multiple-choice exercises with solutions help readers consolidate what they have learned. (orig.)

  19. Vital statistics

    CERN Document Server

    MacKenzie, Dana

    2004-01-01

    The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue expanding our knowledge of the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, physicists need to identify the brand of statistics that will be suitable for them, and make a choice between the Bayesian and the frequentist approaches. (Edited abstract).

  20. Statistical Learning Theory: Models, Concepts, and Results

    OpenAIRE

    von Luxburg, Ulrike; Schoelkopf, Bernhard

    2008-01-01

    Statistical learning theory provides the theoretical basis for many of today's machine learning algorithms. In this article we attempt to give a gentle, non-technical overview of the key ideas and insights of statistical learning theory. We target a broad audience, not necessarily machine learning researchers. This paper can serve as a starting point for people who want to get an overview of the field before diving into technical details.

  1. On Understanding: Maxwell on the Methods of Illustration and Scientific Metaphor

    Science.gov (United States)

    Cat, Jordi

    In this paper I examine the notion and role of metaphors and illustrations in Maxwell's works in exact science as a pathway into a broader and richer philosophical conception of a scientist and scientific practice. While some of these notions and methods are still at work in current scientific research, from economics and biology to quantum computation and quantum field theory, here I have chosen to attest to their entrenchment and complexity in actual science by attempting to make some conceptual sense of Maxwell's own usage; this endeavour includes situating Maxwell's conceptions and applications in his own culture of Victorian science and philosophy. I trace Maxwell's notions to the formulation of the problem of understanding, or interpreting, abstract representations such as potential functions and Lagrangian equations. I articulate the solution in terms of abstract-concrete relations, where the concrete, in tune with Victorian British psychology and engineering, includes the muscular as well as the pictorial. This sets the basis for a conception of understanding in terms of unification and concrete modelling, or representation. I examine the relation of illustration to analogies and metaphors on which this account rests. Lastly, I stress and explain the importance of context-dependence, its consequences for realism-instrumentalism debates, and Maxwell's own emphasis on method.

  2. Fundamental concepts of geometry

    CERN Document Server

    Meserve, Bruce E

    1983-01-01

    Demonstrates relationships between different types of geometry. Provides excellent overview of the foundations and historical evolution of geometrical concepts. Exercises (no solutions). Includes 98 illustrations.

  3. The Incoming Statistical Knowledge of Undergraduate Majors in a Department of Mathematics and Statistics

    Science.gov (United States)

    Cook, Samuel A.; Fukawa-Connelly, Timothy

    2016-01-01

    Studies have shown that at the end of an introductory statistics course, students struggle with building block concepts, such as mean and standard deviation, and rely on procedural understandings of the concepts. This study aims to investigate the understandings of entering freshmen of a department of mathematics and statistics (including mathematics…

  4. Statistics for library and information services a primer for using open source R software for accessibility and visualization

    CERN Document Server

    Friedman, Alon

    2016-01-01

    Statistics for Library and Information Services, written for non-statisticians, provides logical, user-friendly, and step-by-step instructions to make statistics more accessible for students and professionals in the field of Information Science. It emphasizes concepts of statistical theory and data collection methodologies, but also extends to the topics of visualization creation and display, so that the reader will be able to better conduct statistical analysis and communicate his/her findings. The book is tailored for information science students and professionals. It has specific examples of datasets, scripts, design modules, data repositories, homework assignments, and a glossary lexicon that matches the field of Information Science. The textbook provides a visual road map that is customized specifically for Information Science instructors, students, and professionals regarding statistics and visualization. Each chapter in the book includes full-color illustrations on how to use R for the statistical ...

  5. Periodic Virtual Cell Manufacturing (P-VCM) - Concept, Design and Operation

    NARCIS (Netherlands)

    Slomp, Jannes; Krushinsky, Dimitry; Caprihan, Rahul

    2011-01-01

    This paper presents and discusses the concept of Periodic Virtual Cell Manufacturing (P-VCM). After giving an illustrative example of the operation and design complexity of a P-VCM system, we present an industrial case to study the applicability of the concept. The illustrative example and the

  6. Cell illustrator 4.0: a computational platform for systems biology.

    Science.gov (United States)

    Nagasaki, Masao; Saito, Ayumu; Jeong, Euna; Li, Chen; Kojima, Kaname; Ikeda, Emi; Miyano, Satoru

    2011-01-01

    Cell Illustrator is a software platform for Systems Biology that uses the concept of the Petri net for modeling and simulating biopathways. It is intended for biological scientists working at the bench. The latest version, Cell Illustrator 4.0, uses Java Web Start technology and is enhanced with new capabilities, including: automatic graph grid layout algorithms using ontology information; tools using Cell System Markup Language (CSML) 3.0 and Cell System Ontology 3.0; a parameter search module; a high-performance simulation module; a CSML database management system; conversion from CSML models to programming languages (FORTRAN, C, C++, Java, Python and Perl); import from SBML, CellML, and BioPAX; and export to SVG and HTML. Cell Illustrator employs an extension of the hybrid Petri net in an object-oriented style, so that biopathway models can include objects such as DNA sequences, molecular density, 3D localization information, transcription with frame-shift, translation with codon table, as well as biochemical reactions.

  7. Introductory remote sensing principles and concepts

    CERN Document Server

    Gibson, Paul

    2013-01-01

    Introduction to Remote Sensing Principles and Concepts provides a comprehensive student introduction to both the theory and application of remote sensing. This textbook:
    * introduces the field of remote sensing and traces its historical development and evolution;
    * presents detailed explanations of core remote sensing principles and concepts, providing the theory required for a clear understanding of remotely sensed images;
    * describes important remote sensing platforms, including Landsat, SPOT and NOAA;
    * examines and illustrates many of the applications of remotely sensed images in various fields.

  8. Detection and validation of unscalable item score patterns using item response theory: an illustration with Harter's Self-Perception Profile for Children.

    Science.gov (United States)

    Meijer, Rob R; Egberink, Iris J L; Emons, Wilco H M; Sijtsma, Klaas

    2008-05-01

    We illustrate the usefulness of person-fit methodology for personality assessment. For this purpose, we use person-fit methods from item response theory. First, we give a nontechnical introduction to existing person-fit statistics. Second, we analyze data from Harter's (1985) Self-Perception Profile for Children (Harter, 1985) in a sample of children ranging from 8 to 12 years of age (N = 611) and argue that for some children, the scale scores should be interpreted with care and caution. Combined information from person-fit indexes and from observation, interviews, and self-concept theory showed that similar score profiles may have a different interpretation. For some children in the sample, item scores did not adequately reflect their trait level. Based on teacher interviews, this was found to be due most likely to a less developed self-concept and/or problems understanding the meaning of the questions. We recommend investigating the scalability of score patterns when using self-report inventories to help the researcher interpret respondents' behavior correctly.
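
    A minimal, generic example of the kind of person-fit reasoning described above is the count of Guttman errors: a respondent who passes a harder (less popular) item while failing an easier one departs from the expected response pattern. This sketch is illustrative only; it is not one of the specific person-fit statistics analysed in the article.

```python
def guttman_errors(item_scores):
    """Count Guttman errors in a 0/1 response pattern whose items are
    ordered from easiest (most often endorsed) to hardest.
    An error is a pair (easier item missed, harder item passed)."""
    errors = 0
    for i, easy in enumerate(item_scores):
        if easy == 0:
            # every correct harder item paired with this miss is one error
            errors += sum(item_scores[i + 1:])
    return errors

# A perfectly scalable pattern has no errors; a reversed one has many.
```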

  9. Visuanimation in statistics

    KAUST Repository

    Genton, Marc G.; Castruccio, Stefano; Crippa, Paola; Dutta, Subhajit; Huser, Raphaël; Sun, Ying; Vettori, Sabrina

    2015-01-01

    This paper explores the use of visualization through animations, coined visuanimation, in the field of statistics. In particular, it illustrates the embedding of animations in the paper itself and the storage of larger movies in the online

  10. Development of an application simulating radioactive sources; Conception d'une application de simulation de sources radioactives

    Energy Technology Data Exchange (ETDEWEB)

    Riffault, V.; Locoge, N. [Ecole des Mines de Douai, Dept. Chimie et Environnement, 59 - Douai (France); Leblanc, E.; Vermeulen, M. [Ecole des Mines de Douai, 59 (France)

    2011-05-15

    This paper presents an application simulating radioactive gamma sources developed at the 'Ecole des Mines' of Douai (France). It generates raw counting data as an XML file which can then be statistically exploited to illustrate various concepts of radioactivity (the exponential decay law, isotropy of the radiation, attenuation of radiation in matter). The application, with a spreadsheet for data analysis and lab procedures, has been released under a free licence. (authors)
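
    As a hedged sketch of the kind of data such a simulator could generate (the function names and parameters here are illustrative, not those of the Douai application), the following produces Poisson-distributed counts per time interval from a source whose activity follows the exponential decay law A(t) = A0 · 2^(-t/half-life):

```python
import math
import random

def sample_poisson(lam, rng):
    """Draw a Poisson variate via Knuth's multiplication method
    (adequate for the modest rates used here)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def simulate_counts(a0, half_life, dt, n_intervals, seed=2024):
    """Counts per interval for a source whose activity decays as
    A(t) = a0 * 2 ** (-t / half_life)."""
    rng = random.Random(seed)
    counts = []
    for i in range(n_intervals):
        # expected counts in interval i: activity at interval start * dt
        expected = a0 * 2 ** (-(i * dt) / half_life) * dt
        counts.append(sample_poisson(expected, rng))
    return counts

counts = simulate_counts(a0=50.0, half_life=2.0, dt=1.0, n_intervals=10)
```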

  11. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
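
    The core idea, basing inference on samples drawn from a posterior whose shape is known only up to a normalising constant, can be sketched with a toy random-walk Metropolis sampler (a generic illustration, not code from the book):

```python
import math
import random

def metropolis(log_unnorm, x0=0.0, step=1.0, n=5000, seed=1):
    """Random-walk Metropolis sampler: draws from a density known
    only up to a constant through its log unnormalised form."""
    rng = random.Random(seed)
    x = x0
    lp = log_unnorm(x)
    samples = []
    for _ in range(n):
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_unnorm(prop)
        # accept with probability min(1, ratio of unnormalised densities)
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Toy target: a standard normal with the normalising constant omitted.
draws = metropolis(lambda x: -0.5 * x * x)
```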

  12. From evidence to understanding: a commentary on Fisher (1922) 'On the mathematical foundations of theoretical statistics'.

    Science.gov (United States)

    Hand, David J

    2015-04-13

    The nature of statistics has changed over time. It was originally concerned with descriptive 'matters of state', with summarizing population numbers, economic strength and social conditions. But during the course of the twentieth century its aim broadened to include inference: how to use data to shed light on underlying mechanisms, on what might happen in the future, and on what would happen if certain actions were taken. Central to this development was Ronald Fisher. Over the course of his life he was responsible for many of the major conceptual advances in statistics. This is particularly illustrated by his 1922 paper, in which he introduced many of the concepts which remain fundamental to our understanding of how to extract meaning from data, right to the present day. It is no exaggeration to say that Fisher's work, as illustrated by the ideas he described and developed in this paper, underlies all modern science, and much more besides. This commentary was written to celebrate the 350th anniversary of the journal Philosophical Transactions of the Royal Society.

  13. Strong Statistical Convergence in Probabilistic Metric Spaces

    OpenAIRE

    Şençimen, Celaleddin; Pehlivan, Serpil

    2008-01-01

    In this article, we introduce the concepts of strongly statistically convergent sequence and strong statistically Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong statistical limit points and the strong statistical cluster points of a sequence in this space and investigate the relations between these concepts.
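
    For orientation, the classical metric-space notion that these concepts generalise can be stated as follows; the PM-space versions studied in the article replace the metric with the strong topology of the probabilistic metric space:

```latex
% Statistical convergence in an ordinary metric space (X, d):
% x_k -> L statistically if the indices where x_k stays away from L
% have natural density zero.
x_k \xrightarrow{\;\mathrm{st}\;} L
\iff
\forall \varepsilon > 0:\quad
\lim_{n \to \infty} \frac{1}{n}\,
\bigl|\{\, k \le n : d(x_k, L) \ge \varepsilon \,\}\bigr| = 0 .
```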

  14. Basic statistics an introduction with R

    CERN Document Server

    Raykov, Tenko

    2012-01-01

    Basic Statistics provides an accessible and comprehensive introduction to statistics using the free, state-of-the-art, powerful software program R. This book is designed to both introduce students to key concepts in statistics and to provide simple instructions for using R. It:
    * teaches essential concepts in statistics, assuming little background knowledge on the part of the reader;
    * introduces students to R with as few sub-commands as possible for ease of use;
    * provides practical examples from the educational, behavioral, and social sciences.
    Basic Statistics will appeal to students and professionals acros

  15. Introductory statistics and analytics a resampling perspective

    CERN Document Server

    Bruce, Peter C

    2014-01-01

    A concise, thoroughly class-tested primer that features basic statistical concepts in the context of analytics, resampling, and the bootstrap. A uniquely developed presentation of key statistical topics, Introductory Statistics and Analytics: A Resampling Perspective provides an accessible approach to statistical analytics, resampling, and the bootstrap for readers with various levels of exposure to basic probability and statistics. Originally class-tested at one of the first online learning companies in the discipline, www.statistics.com, the book primarily focuses on application
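
    As a minimal, generic illustration of the resampling perspective the book adopts (not code from the book), a percentile bootstrap confidence interval can be computed like this:

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for a statistic of the data."""
    rng = random.Random(seed)
    n = len(data)
    # resample the data with replacement and recompute the statistic
    reps = sorted(
        stat([rng.choice(data) for _ in range(n)]) for _ in range(n_boot)
    )
    lo = reps[int((alpha / 2) * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```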

  16. Illustrator CC digital classroom

    CERN Document Server

    Smith, Jennifer

    2013-01-01

    A complete training package lets you learn Adobe Illustrator CC at your own speed Adobe Illustrator is the leading drawing and illustration software used to create artwork for a variety of media. This book-and-DVD package provides 13 self-paced lessons that get you up to speed on the latest version of Illustrator (Creative Cloud). Step-by-step instructions in the full-color book are supported by video tutorials on the DVD. Together, these tools will help you learn Adobe Illustrator basics, essential skills, and all the new capabilities in Illustrator CC-in no time.  Includes step-by-step in

  17. Basic statistics for social research

    CERN Document Server

    Hanneman, Robert A; Riddle, Mark D

    2012-01-01

    A core statistics text that emphasizes logical inquiry, not math. Basic Statistics for Social Research teaches core general statistical concepts and methods that all social science majors must master to understand (and do) social research. Its use of mathematics and theory is deliberately limited, as the authors focus on the use of concepts and tools of statistics in the analysis of social science data, rather than on the mathematical and computational aspects. Research questions and applications are taken from a wide variety of subfields in sociology, and each chapter is organized arou

  18. Using Concept Mapping to Build the Competence Concept of School Principals

    Directory of Open Access Journals (Sweden)

    Mustamin Mustamin

    2012-08-01

    The growing number of competence concepts for school principals has an impact in two ways: (1) the developed concepts can complement and support each other; and (2) the developed concepts can contradict each other, giving rise to different interpretations. This therefore becomes the main issue for researchers: to identify the competence concept of school principals by adapting the Jackson-Trochim method, which is capable of illustrating the concept of competencies. The results of adapting the Jackson-Trochim method show that school principals should have three types of competencies to lead a school effectively and efficiently: school leadership, instructional leadership, and operational leadership. Based on these results, the competence concept obtained by adapting the Jackson-Trochim method may serve as a reference for school principals to continue building their competencies in the future

  19. The emergent demand chain management: key features and illustration from the beef business

    NARCIS (Netherlands)

    Canever, M.D.; Trijp, van J.C.M.; Beers, G.

    2008-01-01

    Abstract Purpose - The paper seeks to delineate the emergence of demand chain management (DCM) from a theoretical perspective and to illustrate its occurrence in practice. Design/methodology/approach - The DCM concept is examined empirically through a case study with retailers involved in the beef

  20. Development of a Research Methods and Statistics Concept Inventory

    Science.gov (United States)

    Veilleux, Jennifer C.; Chapman, Kate M.

    2017-01-01

    Research methods and statistics are core courses in the undergraduate psychology major. To assess learning outcomes, it would be useful to have a measure that assesses research methods and statistical literacy beyond course grades. In two studies, we developed and provided initial validation results for a research methods and statistical knowledge…

  1. Statistics For Dummies

    CERN Document Server

    Rumsey, Deborah

    2011-01-01

    The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou

  2. Illustrations enhance older colorectal cancer patients' website satisfaction and recall of online cancer information.

    Science.gov (United States)

    Bol, N; Smets, E M A; Eddes, E H; de Haes, J C J M; Loos, E F; van Weert, J C M

    2015-03-01

    This study aims to investigate the effects of illustrations in online cancer information on older cancer patients' website satisfaction (i.e. satisfaction with the attractiveness, comprehensibility and emotional support from the website) and recall of information. In an online experiment, 174 younger (text-only information, text with two cognitive illustrations or text with two affective illustrations. In general, adding cognitive illustrations compared with text-only information improved the satisfaction with the attractiveness of the website in both younger and older patients. For older patients in particular, cognitive illustrations facilitated recall of cancer information: whereas older patients recalled less information overall compared with younger patients (39% vs. 50%), no statistically significant differences in age on recall were observed when cognitive illustrations were added to text. Furthermore, older patients were more satisfied with the emotional support from the website than younger patients, especially when affective illustrations were present. Our results suggest that effective online cancer communication for ageing populations involves considering both cognitive and affective illustrations to enhance website satisfaction and recall of cancer information. © 2015 John Wiley & Sons Ltd.

  3. Research design and statistical analysis

    CERN Document Server

    Myers, Jerome L; Lorch Jr, Robert F

    2013-01-01

    Research Design and Statistical Analysis provides comprehensive coverage of the design principles and statistical concepts necessary to make sense of real data.  The book's goal is to provide a strong conceptual foundation to enable readers to generalize concepts to new research situations.  Emphasis is placed on the underlying logic and assumptions of the analysis and what it tells the researcher, the limitations of the analysis, and the consequences of violating assumptions.  Sampling, design efficiency, and statistical models are emphasized throughout. As per APA recommendations

  4. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  5. Recent Advances in System Reliability Signatures, Multi-state Systems and Statistical Inference

    CERN Document Server

    Frenkel, Ilia

    2012-01-01

    Recent Advances in System Reliability discusses developments in modern reliability theory such as signatures, multi-state systems and statistical inference. It describes the latest achievements in these fields, and covers the application of these achievements to reliability engineering practice. The chapters cover a wide range of new theoretical subjects and have been written by leading experts in reliability theory and its applications. The topics include: concepts and different definitions of signatures (D-spectra), their properties and applications to reliability of coherent systems and network-type structures; the Lz-transform of Markov stochastic processes and its application to multi-state system reliability analysis; methods for cost-reliability and cost-availability analysis of multi-state systems; optimal replacement and protection strategy; and statistical inference. Recent Advances in System Reliability presents many examples to illustrate the theoretical results. Real world multi-state systems...

  6. Statistical analysis and interpretation of prenatal diagnostic imaging studies, Part 2: descriptive and inferential statistical methods.

    Science.gov (United States)

    Tuuli, Methodius G; Odibo, Anthony O

    2011-08-01

    The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.

  7. Statistical deception at work

    CERN Document Server

    Mauro, John

    2013-01-01

    Written to reveal statistical deceptions often thrust upon unsuspecting journalists, this book views the use of numbers from a public perspective. Illustrating how the statistical naivete of journalists often nourishes quantitative misinformation, the author's intent is to make journalists more critical appraisers of numerical data so that in reporting them they do not deceive the public. The book frequently uses actual reported examples of misused statistical data reported by mass media and describes how journalists can avoid being taken in by them. Because reports of survey findings seldom g

  8. Statistical data analysis using SAS intermediate statistical methods

    CERN Document Server

    Marasinghe, Mervyn G

    2018-01-01

    The aim of this textbook (previously titled SAS for Data Analytics) is to teach the use of SAS for statistical analysis of data for advanced undergraduate and graduate students in statistics, data science, and disciplines involving analyzing data. The book begins with an introduction beyond the basics of SAS, illustrated with non-trivial, real-world, worked examples. It proceeds to SAS programming and applications, SAS graphics, statistical analysis of regression models, analysis of variance models, analysis of variance with random and mixed effects models, and then takes the discussion beyond regression and analysis of variance to conclude. Pedagogically, the authors introduce theory and methodological basis topic by topic, present a problem as an application, followed by a SAS analysis of the data provided and a discussion of results. The text focuses on applied statistical problems and methods. Key features include: end of chapter exercises, downloadable SAS code and data sets, and advanced material suitab...

  9. Applied statistics in the pharmaceutical industry with case studies using S-PLUS

    CERN Document Server

    Krause, Andreas

    2001-01-01

    The purpose of this book is to provide a general guide to statistical methods used in the pharmaceutical industry, and to illustrate how to use S-PLUS to implement these methods. Specifically, the goals are to:
    * illustrate statistical applications in the pharmaceutical industry;
    * illustrate how the statistical applications can be carried out using S-PLUS;
    * illustrate why S-PLUS is a useful software package for carrying out these applications;
    * discuss the results and implications of a particular application.
    The target audience for this book is very broad, including:
    * graduate students in biostatistics;
    * statisticians who are involved in the industry as research scientists, regulators, academics, and/or consultants who want to know more about how to use S-PLUS and learn about other sub-fields within the industry that they may not be familiar with;
    * statisticians in other fields who want to know more about statistical applications in the pharmaceutical industry.

  10. Measurement and statistics for teachers

    CERN Document Server

    Van Blerkom, Malcolm

    2008-01-01

    Written in a student-friendly style, Measurement and Statistics for Teachers shows teachers how to use measurement and statistics wisely in their classes. Although there is some discussion of theory, emphasis is given to the practical, everyday uses of measurement and statistics. The second part of the text provides more complete coverage of basic descriptive statistics and their use in the classroom than in any text now available. Comprehensive and accessible, Measurement and Statistics for Teachers includes:
    * short vignettes showing concepts in action;
    * numerous classroom examples;
    * highlighted vocabulary;
    * boxes summarizing related concepts;
    * end-of-chapter exercises and problems;
    * six full chapters devoted to the essential topic of Classroom Tests;
    * instruction on how to carry out informal assessments, performance assessments, and portfolio assessments, and how to use and interpret standardized tests;
    * a five-chapter section on Descriptive Statistics, giving instructors the option of more thoroughly teaching basic measur

  11. Scientific and technical reports how to write and illustrate

    CERN Document Server

    Sharma, B C

    2014-01-01

    Scientific and technical reports: How to Write and Illustrate provides step-by-step advice on tackling various tasks associated with report writing like gathering information, analyzing information, preparing an outline, writing a rough draft and revising. Many examples illustrate the processes involved at various steps. A stepwise approach to computer-assisted preparation of tables and various types of figures like line drawings, bar charts, histograms, flowcharts, etc., is provided. Also presented are suggestions about how to use commonly available computer programs to give visual shape to ideas, concepts, processes and cause and effect relations described in the text. Use of readability tests is explained as a screening system for checking comprehensibility of language used. Readers are alerted to some of the common pitfalls in science writing like redundancy, overuse of nouns, noun chains, excessive use of passive voice, use of overlong sentences and ambiguity. Checklist at the end of each chapter sums up...

  12. Trends in Illustration: A digital approach to large format illustrations ...

    African Journals Online (AJOL)

    In view of this, this study uses Adobe Photoshop CS4, a digital draw and paint tool, to illustrate some of the notable folktales in Igbo oral tradition. Vector and Raster techniques of computer graphics will be employed and discussed. Basic information on digital illustration will also be enumerated in the project. This piece of ...

  13. A conceptual guide to statistics using SPSS

    CERN Document Server

    Berkman, Elliot T

    2011-01-01

    Bridging an understanding of Statistics and SPSS. This unique text helps students develop a conceptual understanding of a variety of statistical tests by linking the ideas learned in a statistics class from a traditional statistics textbook with the computational steps and output from SPSS. Each chapter begins with a student-friendly explanation of the concept behind each statistical test and how the test relates to that concept. The authors then walk through the steps to compute the test in SPSS and the output, clearly linking how the SPSS procedure and output connect back to the conceptual u

  14. A review of the statistical principles of geochronometry. II. Additional concepts pertinent to radiogenic U-Pb studies

    International Nuclear Information System (INIS)

    Eglington, B.M.; Harmer, R.E.

    1993-01-01

    A summary is provided of statistical regression techniques as applied to radiogenic uranium-lead data. The model-dependent nature of U-Pb regression calculations, both for isochrons and errorchrons, is emphasized throughout. Near-concordant U-Pb radiogenic data preserve better information about the original age of the samples than do more discordant data, yet most conventional regression techniques assign more importance to the discordant data than to those near concordia. The links between mathematical techniques for regression and conceptual models are highlighted and critically examined, and methods for dealing with the discordant data are illustrated. Comparison of dates from different laboratories or researchers requires that the techniques applied be statistically valid and, in most cases, that the model-dependent assumptions be compatible. This is particularly important for U-Pb radiogenic data, where model-dependent assumptions may have a greater influence than in the case of whole-rock techniques. A consistent approach is proposed for treating data at South African laboratories in order to facilitate comparison of results. Recommendations are presented as regards the minimum requirements to be met when reporting radiogenic U-Pb isotope data so that future geochronologists may benefit. 35 refs., 2 tabs., 6 figs

  15. Practical Statistics for LHC Physicists: Descriptive Statistics, Probability and Likelihood (1/3)

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    These lectures cover those principles and practices of statistics that are most relevant for work at the LHC. The first lecture discusses the basic ideas of descriptive statistics, probability and likelihood. The second lecture covers the key ideas in the frequentist approach, including confidence limits, profile likelihoods, p-values, and hypothesis testing. The third lecture covers inference in the Bayesian approach. Throughout, real-world examples will be used to illustrate the practical application of the ideas. No previous knowledge is assumed.

  16. Statistical Analysis and validation

    NARCIS (Netherlands)

    Hoefsloot, H.C.J.; Horvatovich, P.; Bischoff, R.

    2013-01-01

    In this chapter guidelines are given for the selection of a few biomarker candidates from a large number of compounds with a relative low number of samples. The main concepts concerning the statistical validation of the search for biomarkers are discussed. These complicated methods and concepts are

  17. The CAD concept for stellarator-type magnetic systems

    International Nuclear Information System (INIS)

    Vorobyova, V.P.; Martynov, S.A.; Khazhmuradov, M.A.

    2002-01-01

    The paper describes the computer-aided design (CAD) concept for stellarator-type magnetic systems. Consideration is given to the main peculiarities, principles, and dialog organization and design stages of the CAD. The practical realization of the concept is illustrated by specific examples

  18. Concepts and Relations in Neurally Inspired In Situ Concept-Based Computing.

    Science.gov (United States)

    van der Velde, Frank

    2016-01-01

    In situ concept-based computing is based on the notion that conceptual representations in the human brain are "in situ." In this way, they are grounded in perception and action. Examples are neuronal assemblies, whose connection structures develop over time and are distributed over different brain areas. In situ concept representations cannot be copied or duplicated because that would disrupt their connection structure, and thus the meaning of these concepts. Higher-level cognitive processes, as found in language and reasoning, can be performed with in situ concepts by embedding them in specialized neurally inspired "blackboards." The interactions between the in situ concepts and the blackboards form the basis for in situ concept computing architectures. In these architectures, memory (concepts) and processing are interwoven, in contrast with the separation between memory and processing found in Von Neumann architectures. Because the further development of Von Neumann computing (more, faster, yet power limited) is questionable, in situ concept computing might be an alternative for concept-based computing. In situ concept computing will be illustrated with a recently developed BABI reasoning task. Neurorobotics can play an important role in the development of in situ concept computing because of the development of in situ concept representations derived in scenarios as needed for reasoning tasks. Neurorobotics would also benefit from power-limited and in situ concept computing.

  19. Statistical methods for astronomical data analysis

    CERN Document Server

    Chattopadhyay, Asis Kumar

    2014-01-01

    This book introduces “Astrostatistics” as a subject in its own right with rewarding examples, including work by the authors with galaxy and Gamma Ray Burst data to engage the reader. This includes a comprehensive blending of Astrophysics and Statistics. The first chapter’s coverage of preliminary concepts and terminologies for astronomical phenomenon will appeal to both Statistics and Astrophysics readers as helpful context. Statistics concepts covered in the book provide a methodological framework. A unique feature is the inclusion of different possible sources of astronomical data, as well as software packages for converting the raw data into appropriate forms for data analysis. Readers can then use the appropriate statistical packages for their particular data analysis needs. The ideas of statistical inference discussed in the book help readers determine how to apply statistical tests. The authors cover different applications of statistical techniques already developed or specifically introduced for ...

  20. Teaching Statistics Online Using "Excel"

    Science.gov (United States)

    Jerome, Lawrence

    2011-01-01

    As anyone who has taught or taken a statistics course knows, statistical calculations can be tedious and error-prone, with the details of a calculation sometimes distracting students from understanding the larger concepts. Traditional statistics courses typically use scientific calculators, which can relieve some of the tedium and errors but…

  1. The Study of Turkish Illustrated Story Books Published Between 1974-1993, from the Viewpoint of Physical Aspects

    Directory of Open Access Journals (Sweden)

    Havise Güleç Çakmak

    1997-03-01

    The samples used in the research were chosen from among 411 books (translated and adapted) published between the years 1974-1993 and taken at random from various kindergartens, children's libraries, private collections and bookshops. The books chosen were studied and recorded on a specially prepared "Book Form" which includes name of book, name of the author and the illustrator, publishing place and publishing year, binding, quality of cover, size, quality of paper, illustration and colouring, relationship between text and illustration, and style of illustration. Then, tables were prepared to study the distributions and position of the illustration and physical features of the books. Tables were analyzed using the chi-square (X2) statistical test. The findings showed that there was generally an inadequacy in the binding, quality of cover, paper and colouring. But the size, illustration, position of illustration, and relationship between text and illustration were found adequate.

  2. Radiology illustrated. Spine

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Heung Sik; Lee, Joon Woo [Seoul National Univ. Bundang Hospital, Seongnam, Kyonggi-do (Korea, Republic of). Dept. of Radiology; Kwon, Jong Won [Samsung Medical Center, Seoul (Korea, Republic of). Dept. of Radiology

    2014-04-01

    Offers a practical approach to image interpretation for spinal disorders. Includes numerous high-quality radiographic images and schematic illustrations. Will serve as a self-learning book covering daily routine cases from the basic to the advanced. Radiology Illustrated: Spine is an up-to-date, superbly illustrated reference in the style of a teaching file that has been designed specifically to be of value in clinical practice. Common, critical, and rare but distinctive spinal disorders are described succinctly with the aid of images highlighting important features and informative schematic illustrations. The first part of the book, on common spinal disorders, is for radiology residents and other clinicians who are embarking on the interpretation of spinal images. A range of key disorders are then presented, including infectious spondylitis, cervical trauma, spinal cord disorders, spinal tumors, congenital disorders, uncommon degenerative disorders, inflammatory arthritides, and vascular malformations. The third part is devoted to rare but clinically significant spinal disorders with characteristic imaging features, and the book closes by presenting practical tips that will assist in the interpretation of confusing cases.

  3. Parallel auto-correlative statistics with VTK.

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2013-08-01

    This report summarizes existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10] which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by the means of C++ code snippets and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the autocorrelative statistics engine.
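
    The engine described above computes, in essence, sample autocorrelations of a series. A minimal Python sketch of that computation (the biased normalization is an assumption; VTK's C++ engine may differ in detail):

```python
def autocorrelation(x, lag):
    """Sample autocorrelation at a given lag, using the common biased estimator
    (divide by n rather than n - lag)."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    cov = sum((x[t] - mean) * (x[t + lag] - mean) for t in range(n - lag)) / n
    return cov / var

series = [1.0, 2.0, 3.0, 4.0, 3.0, 2.0, 1.0, 2.0, 3.0, 4.0]
print(round(autocorrelation(series, 0), 1))  # 1.0 by definition at lag 0
```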

  4. Actuarial statistics with generalized linear mixed models

    NARCIS (Netherlands)

    Antonio, K.; Beirlant, J.

    2007-01-01

    Over the last decade the use of generalized linear models (GLMs) in actuarial statistics has received a lot of attention, starting from the actuarial illustrations in the standard text by McCullagh and Nelder [McCullagh, P., Nelder, J.A., 1989. Generalized linear models. In: Monographs on Statistics

  5. Statistical analysis and interpolation of compositional data in materials science.

    Science.gov (United States)

    Pesenson, Misha Z; Suram, Santosh K; Gregoire, John M

    2015-02-09

    Compositional data are ubiquitous in chemistry and materials science: analysis of elements in multicomponent systems, combinatorial problems, etc., lead to data that are non-negative and sum to a constant (for example, atomic concentrations). The constant sum constraint restricts the sampling space to a simplex instead of the usual Euclidean space. Since statistical measures such as mean and standard deviation are defined for the Euclidean space, traditional correlation studies, multivariate analysis, and hypothesis testing may lead to erroneous dependencies and incorrect inferences when applied to compositional data. Furthermore, composition measurements that are used for data analytics may not include all of the elements contained in the material; that is, the measurements may be subcompositions of a higher-dimensional parent composition. Physically meaningful statistical analysis must yield results that are invariant under the number of composition elements, requiring the application of specialized statistical tools. We present specifics and subtleties of compositional data processing through discussion of illustrative examples. We introduce basic concepts, terminology, and methods required for the analysis of compositional data and utilize them for the spatial interpolation of composition in a sputtered thin film. The results demonstrate the importance of this mathematical framework for compositional data analysis (CDA) in the fields of materials science and chemistry.
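
    The simplex constraint discussed in this abstract is commonly handled with log-ratio transforms; a minimal sketch (the closure and clr helpers are illustrative, not taken from the paper):

```python
import math

def closure(parts):
    """Normalize non-negative parts so they sum to 1 (project onto the simplex)."""
    total = sum(parts)
    return [p / total for p in parts]

def clr(composition):
    """Centered log-ratio transform: maps a composition into ordinary Euclidean
    space, where means, distances, and correlations behave as usual."""
    logs = [math.log(p) for p in composition]
    mean_log = sum(logs) / len(logs)
    return [value - mean_log for value in logs]

comp = closure([60.0, 30.0, 10.0])  # e.g. atomic concentrations in percent
z = clr(comp)
# clr coordinates sum to zero, reflecting the lost degree of freedom of the simplex
print(sum(z))
```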

  6. Statistical shape analysis with applications in R

    CERN Document Server

    Dryden, Ian L

    2016-01-01

    A thoroughly revised and updated edition of this introduction to modern statistical methods for shape analysis. Shape analysis is an important tool in the many disciplines where objects are compared using geometrical features. Examples include comparing brain shape in schizophrenia; investigating protein molecules in bioinformatics; and describing growth of organisms in biology. This book is a significant update of the highly-regarded 'Statistical Shape Analysis' by the same authors. The new edition lays the foundations of landmark shape analysis, including geometrical concepts and statistical techniques, and extends to include analysis of curves, surfaces, images and other types of object data. Key definitions and concepts are discussed throughout, and the relative merits of different approaches are presented. The authors have included substantial new material on recent statistical developments and offer numerous examples throughout the text. Concepts are introduced in an accessible manner, while reta...

  7. Retention of Statistical Concepts in a Preliminary Randomization-Based Introductory Statistics Curriculum

    Science.gov (United States)

    Tintle, Nathan; Topliff, Kylie; VanderStoep, Jill; Holmes, Vicki-Lynn; Swanson, Todd

    2012-01-01

    Previous research suggests that a randomization-based introductory statistics course may improve student learning compared to the consensus curriculum. However, it is unclear whether these gains are retained by students post-course. We compared the conceptual understanding of a cohort of students who took a randomization-based curriculum (n = 76)…

  8. Statistics 101 for Radiologists.

    Science.gov (United States)

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
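
    The test-efficacy measures listed in this review follow directly from a 2x2 confusion matrix; a small sketch with invented counts:

```python
def diagnostic_measures(tp, fp, fn, tn):
    """Basic diagnostic-test statistics from a 2x2 table."""
    sensitivity = tp / (tp + fn)               # P(test positive | disease)
    specificity = tn / (tn + fp)               # P(test negative | no disease)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    lr_pos = sensitivity / (1 - specificity)   # positive likelihood ratio
    lr_neg = (1 - sensitivity) / specificity   # negative likelihood ratio
    return sensitivity, specificity, accuracy, lr_pos, lr_neg

# hypothetical study: 90 true positives, 20 false positives,
# 10 false negatives, 80 true negatives
sens, spec, acc, lrp, lrn = diagnostic_measures(tp=90, fp=20, fn=10, tn=80)
print(sens, spec, acc)  # 0.9 0.8 0.85
```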

  9. USING STATISTICAL SURVEY IN ECONOMICS

    Directory of Open Access Journals (Sweden)

    Delia TESELIOS

    2012-01-01

    Full Text Available A statistical survey is an effective method of statistical investigation that involves gathering quantitative data; it is often preferred in statistical reports because of the information that can be obtained about an entire population by observing only a part of it. Because of the information they provide, surveys are used in many research areas. In economics, they support decision making in choosing competitive strategies, in the analysis of certain economic phenomena, and in the formulation of forecasts. The economic study presented in this paper illustrates how simple random sampling is used to analyze the existing parking space situation in a given locality.
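
    The sampling idea in this abstract can be sketched in a few lines; the population and counts below are invented for illustration:

```python
import random

random.seed(1)  # fixed seed so the illustration is reproducible

# hypothetical population of 1000 parking spaces: 1 = occupied, 0 = free
population = [1] * 700 + [0] * 300

# simple random sampling without replacement
sample = random.sample(population, 100)
p_hat = sum(sample) / len(sample)

# approximate 95% confidence interval for the occupancy proportion
se = (p_hat * (1 - p_hat) / len(sample)) ** 0.5
print(p_hat, round(p_hat - 1.96 * se, 3), round(p_hat + 1.96 * se, 3))
```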

  10. Descriptive statistics.

    Science.gov (United States)

    Nick, Todd G

    2007-01-01

    Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
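
    The measures of location and spread described here take one line each in Python's standard library; a quick sketch with invented data:

```python
import statistics

data = [4, 8, 15, 16, 23, 42]  # a small hypothetical sample

# measures of location
mean = statistics.mean(data)      # 18
median = statistics.median(data)  # 15.5

# measure of spread: sample standard deviation
sd = statistics.stdev(data)

print(mean, median, round(sd, 2))  # 18 15.5 13.49
```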

  11. Statistical Engine Knock Control

    DEFF Research Database (Denmark)

    Stotsky, Alexander A.

    2008-01-01

    A new statistical concept of the knock control of a spark ignition automotive engine is proposed. The control aim is associated with the statistical hypothesis test which compares the threshold value to the average value of the maximal amplitude of the knock sensor signal at a given frequency. A control algorithm which is used for minimization of the regulation error realizes a simple count-up-count-down logic. A new adaptation algorithm for the knock detection threshold is also developed. The confidence interval method is used as the basis for adaptation. A simple statistical model which includes generation of the amplitude signals, a threshold value determination and a knock sound model is developed for evaluation of the control concept.

  12. Alternative Conceptions of Wisdom: An Onion-Peeling Exercise.

    Science.gov (United States)

    Blanchard-Fields, Fredda; And Others

    1987-01-01

    Discusses contextualistic and integrative approaches to the concept of wisdom, and the evolution of the concept from an independent construct of intelligence to a component of intelligence, i.e., practical intelligence. Suggests operationalization of wisdom as the ability to integrate cognition and affect. Illustrates the integrative approach with…

  13. Social sustainability in supply chains: A framework and a Latin America illustrative case

    Directory of Open Access Journals (Sweden)

    Dafne Oliveira Carlos de Morais

    2017-12-01

    Full Text Available Social issues are under-represented in sustainability research, given the historical predominance of economic and environmental issues. This also applies to Sustainable Supply Chain Management: even with its definition clarified with respect to the Triple Bottom Line, research still advances disproportionately in the environmental and economic dimensions relative to the social dimension. This research aims to analyze how social sustainability is addressed in focal firms and managed in their supply chains. The study explores the concepts of social issues and governance mechanisms, presenting elements discussed in the literature. A framework for managing social sustainability in supply chains is presented, followed by a case that illustrates the discussed concepts in a Latin American context.

  14. [Big data in official statistics].

    Science.gov (United States)

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  15. Statistics for scientists and engineers

    CERN Document Server

    Shanmugam, Ramalingam

    2015-01-01

    This book provides the theoretical framework needed to build, analyze and interpret various statistical models. It helps readers choose the correct model, distinguish among various choices that best captures the data, or solve the problem at hand. This is an introductory textbook on probability and statistics. The authors explain theoretical concepts in a step-by-step manner and provide practical examples. The introductory chapter in this book presents the basic concepts. Next, the authors discuss the measures of location, popular measures of spread, and measures of skewness and kurtosis. Prob

  16. An Exercise for Illustrating the Logic of Hypothesis Testing

    Science.gov (United States)

    Lawton, Leigh

    2009-01-01

    Hypothesis testing is one of the more difficult concepts for students to master in a basic, undergraduate statistics course. Students often are puzzled as to why statisticians simply don't calculate the probability that a hypothesis is true. This article presents an exercise that forces students to lay out on their own a procedure for testing a…

  17. Statistical methods for ranking data

    CERN Document Server

    Alvo, Mayer

    2014-01-01

    This book introduces advanced undergraduate, graduate students and practitioners to statistical methods for ranking data. An important aspect of nonparametric statistics is oriented towards the use of ranking data. Rank correlation is defined through the notion of distance functions and the notion of compatibility is introduced to deal with incomplete data. Ranking data are also modeled using a variety of modern tools such as CART, MCMC, EM algorithm and factor analysis. This book deals with statistical methods used for analyzing such data and provides a novel and unifying approach for hypotheses testing. The techniques described in the book are illustrated with examples and the statistical software is provided on the authors’ website.
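
    Rank correlation via pairwise comparisons, as described above, can be sketched compactly; a minimal Kendall tau for complete rankings without ties (an illustration, not the book's software):

```python
from itertools import combinations

def kendall_tau(r1, r2):
    """Kendall rank correlation: concordant minus discordant pairs, divided by
    the total number of pairs. Assumes complete rankings with no ties."""
    concordant = discordant = 0
    for (a1, a2), (b1, b2) in zip(combinations(r1, 2), combinations(r2, 2)):
        s = (a1 - a2) * (b1 - b2)
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    n = len(r1)
    return (concordant - discordant) / (n * (n - 1) / 2)

# two judges ranking the same four items
print(round(kendall_tau([1, 2, 3, 4], [1, 3, 2, 4]), 3))  # 0.667
```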

  18. Lies, damn lies and statistics

    International Nuclear Information System (INIS)

    Jones, M.D.

    2001-01-01

    Statistics are widely employed within archaeological research, increasingly so as user-friendly statistical packages make ever more sophisticated analyses available to non-statisticians. However, all statistical techniques are based on underlying assumptions of which the end user may be unaware. If statistical analyses are applied in ignorance of those assumptions, there is the potential for highly erroneous inferences to be drawn. This does happen within archaeology, and here it is illustrated with the example of 'date pooling', a technique that has been widely misused in archaeological research. This misuse may have given rise to an inevitable and predictable misinterpretation of New Zealand's archaeological record. (author). 10 refs., 6 figs., 1 tab

  19. Density by Moduli and Lacunary Statistical Convergence

    Directory of Open Access Journals (Sweden)

    Vinod K. Bhardwaj

    2016-01-01

    Full Text Available We have introduced and studied a new concept of f-lacunary statistical convergence, where f is an unbounded modulus. It is shown that, under certain conditions on a modulus f, the concepts of lacunary strong convergence with respect to a modulus f and f-lacunary statistical convergence are equivalent on bounded sequences. We further characterize those θ for which S_θ^f = S^f, where S_θ^f and S^f denote the sets of all f-lacunary statistically convergent sequences and f-statistically convergent sequences, respectively. A general description of inclusion between two arbitrary lacunary methods of f-statistical convergence is given. Finally, we give an S_θ^f-analog of the Cauchy criterion for convergence, and a Tauberian theorem for S_θ^f-convergence is also proved.
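
    For orientation, the central definition behind this abstract is usually stated as follows (a sketch from the standard definitions in this literature, not quoted from the paper): for a lacunary sequence θ = (k_r) with intervals I_r = (k_{r-1}, k_r], lengths h_r = k_r - k_{r-1}, and an unbounded modulus f, a sequence x = (x_k) is f-lacunary statistically convergent to L when

```latex
\lim_{r \to \infty} \frac{1}{f(h_r)}\,
  f\bigl(\lvert \{\, k \in I_r : \lvert x_k - L \rvert \ge \varepsilon \,\} \rvert\bigr) = 0
\quad \text{for every } \varepsilon > 0 .
```

    Taking f(x) = x recovers ordinary lacunary statistical convergence.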

  20. Learning Binomial Probability Concepts with Simulation, Random Numbers and a Spreadsheet

    Science.gov (United States)

    Rochowicz, John A., Jr.

    2005-01-01

    This paper introduces the reader to the concepts of binomial probability and simulation. A spreadsheet is used to illustrate these concepts. Random number generators are great technological tools for demonstrating the concepts of probability. Ideas of approximation, estimation, and mathematical usefulness provide numerous ways of learning…
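
    The spreadsheet simulation the paper describes translates directly into code; a hedged Python equivalent (the parameters below are invented for illustration):

```python
import random

random.seed(42)  # fixed seed so the estimate is reproducible

def simulate_binomial(n, p, trials=100_000):
    """Estimate P(X = k) for X ~ Binomial(n, p) by simulating n Bernoulli
    trials per replication with uniform random numbers."""
    counts = [0] * (n + 1)
    for _ in range(trials):
        successes = sum(1 for _ in range(n) if random.random() < p)
        counts[successes] += 1
    return [c / trials for c in counts]

est = simulate_binomial(n=10, p=0.5)
# exact P(X = 5) = C(10,5) / 2**10 ~ 0.2461; the estimate should be close
print(round(est[5], 2))
```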

  1. Practical Statistics for the LHC

    CERN Document Server

    Cranmer, Kyle

    2015-05-22

    This document is a pedagogical introduction to statistics for particle physics. Emphasis is placed on the terminology, concepts, and methods being used at the Large Hadron Collider. The document addresses both the statistical tests applied to a model of the data and the modeling itself.

  2. Waiting time distribution for the first conception leading to a live birth

    International Nuclear Information System (INIS)

    Shrestha, G.; Biswas, S.

    1985-01-01

    An attempt has been made in this paper to obtain a probability model describing the distribution of the waiting time from marriage to first conception, based on data from marriage to first live birth. The specialty of the present approach lies in assuming the marital exposure to be finite, whereas most earlier investigators assumed it to be infinite for mathematical simplicity. The applicability of the model is illustrated on data pertaining to first-order conceptions and the monthly probability of conception for women married at different age groups. (author)
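
    A textbook starting point for such waiting-time models is the geometric distribution, which assumes a constant monthly probability of conception (a simplification for illustration; the paper's finite-exposure model is more elaborate):

```python
def waiting_time_pmf(p, months):
    """P(first conception occurs in month k), k = 1..months, under a constant
    monthly fecundability p -- the geometric waiting-time model."""
    return [(1 - p) ** (k - 1) * p for k in range(1, months + 1)]

pmf = waiting_time_pmf(p=0.2, months=5)
# probability of conceiving within 5 months: 1 - 0.8**5
print(round(sum(pmf), 4))  # 0.6723
```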

  3. Creating a digital medical illustration.

    Science.gov (United States)

    Culley, Joanna

    2016-01-01

    This paper covers the steps required to complete a medical illustration in a digital format using Adobe Illustrator and Photoshop. The project example is the surgical procedure for the release of the glenohumeral joint for the condition known as 'frozen shoulder'. The purpose is to demonstrate one method which an artist can use within digital media to create a colour illustration such as the release of the glenohumeral joint. Included is a general overview as how to deal with the administration of a medical illustration commission through the experience of a professional freelance artist.

  4. The Concise Encyclopedia of Statistics

    CERN Document Server

    Dodge, Yadolah

    2008-01-01

    The Concise Encyclopedia of Statistics presents the essential information about statistical tests, concepts, and analytical methods in language that is accessible to practitioners and students of the vast community using statistics in medicine, engineering, physical science, life science, social science, and business/economics. The reference is alphabetically arranged to provide quick access to the fundamental tools of statistical methodology and biographies of famous statisticians. The more than 500 entries include definitions, history, mathematical details, limitations, examples, references,

  5. Availability statistics for thermal power plants

    International Nuclear Information System (INIS)

    1989-01-01

    Denmark, Finland and Sweden have adopted almost the same methods of recording and calculating availability data. For a number of years, comparable availability and outage data for thermal power have been summarized and published in one report. The purpose of the report now presented for 1989, which contains general statistical data, is to provide basic information on the existing kinds of thermal power in the countries concerned. With this information as a basis, additional and more detailed information can be exchanged in direct contacts between bodies in the above-mentioned countries, according to forms established for that purpose. The report covers fossil steam power, nuclear power and gas turbines. The information is presented in separate diagrams for each country and, for plants burning fossil fuel, also in joint NORDEL statistics with data grouped according to the type of fuel used. The grouping of units into classes of capacity has been made in accordance with the classification adopted by UNIPEDE/WEC. Values based on energy have been adopted as the basic availability data, in line with the preference expressed in the definitions outlined by UNIPEDE and UNIPEDE/WEC. Some time-based data have been included to enable comparisons with certain international values and to further illustrate performance. For the values given in the report, the definitions in the NORDEL document ''Concepts of Availability for Thermal Power, September 1977'' have been applied. (author)

  6. Conformity and statistical tolerancing

    Science.gov (United States)

    Leblond, Laurent; Pillet, Maurice

    2018-02-01

    Statistical tolerancing was first proposed by Shewhart (Economic Control of Quality of Manufactured Product, 1931; reprinted 1980 by ASQC). In spite of this long history, its use remains moderate. One probable reason for this low utilization is the difficulty for designers of anticipating the risks of this approach. Arithmetic (worst-case) tolerancing allows a simple interpretation: conformity is defined by the presence of the characteristic in an interval. Statistical tolerancing is more complex in its definition: an interval is not sufficient to define conformance. To justify the statistical tolerancing formula used by designers, a tolerance interval should be interpreted as the interval where most of the parts produced should probably be located. This tolerance is justified by considering a conformity criterion for the parts that guarantees low offsets on the latter characteristics. Unlike traditional arithmetic tolerancing, statistical tolerancing requires a sustained exchange of information between design and manufacture to be used safely. This paper proposes a formal definition of conformity, which we apply successively to quadratic and arithmetic tolerancing. We introduce a concept of concavity, which helps us to demonstrate the link between the tolerancing approach and conformity. We use this concept to demonstrate the various acceptable propositions of statistical tolerancing (in the space of decentring and dispersion).
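
    The contrast between arithmetic (worst-case) and quadratic (statistical) tolerancing described in this record can be sketched numerically. This is an illustrative stack-up calculation with made-up tolerance values, not code from the paper; the root-sum-square formula assumes independent, centred component variations.

```python
import math

# Symmetric +/- tolerances of three components in a stack (hypothetical values, mm).
tolerances = [0.10, 0.20, 0.15]

# Arithmetic (worst-case) stack-up: individual tolerances simply add.
worst_case = sum(tolerances)

# Quadratic (statistical, root-sum-square) stack-up: assumes independent,
# centred variations, so the standard deviations add in quadrature.
statistical = math.sqrt(sum(t ** 2 for t in tolerances))

print(f"worst case:  +/- {worst_case:.3f} mm")
print(f"statistical: +/- {statistical:.3f} mm")
```

    The statistical stack-up is markedly tighter than the worst case, which is exactly why it requires the design/manufacture dialogue the paper emphasizes: the quadratic formula is only safe if production really keeps each component centred.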

  7. Illustrated Guide to Astronomical Wonders From Novice to Master Observer

    CERN Document Server

    Thompson, Robert

    2011-01-01

    With the advent of inexpensive, high-power telescopes priced at under 250, amateur astronomy is now within the reach of anyone, and this is the ideal book to get you started. The Illustrated Guide to Astronomical Wonders offers you a guide to the equipment you need, and shows you how and where to find hundreds of spectacular objects in the deep sky -- double and multiple stars as well as spectacular star clusters, nebulae, and galaxies. You get a solid grounding in the fundamental concepts and terminology of astronomy, and specific advice about choosing, buying, using, and maintaining the eq

  8. Medical Illustration

    Science.gov (United States)

    ... as medical books, journals, magazines, pharma or biotech marketing, films, online video, exhibits, posters, wall charts, educational ... of the health career profession with strong communication skills, medical illustrators work closely with clients to interpret ...

  9. Spatial Foundations of Science Education: The Illustrative Case of Instruction on Introductory Geological Concepts

    Science.gov (United States)

    Liben, Lynn S.; Kastens, Kim A.; Christensen, Adam E.

    2011-01-01

    To study the role of spatial concepts in science learning, 125 college students with high, medium, or low scores on a horizontality (water-level) spatial task were given information about geological strike and dip using existing educational materials. Participants mapped an outcrop's strike and dip, a rod's orientation, pointed to a distant…

  10. Early Illustrations of Geste Antagoniste in Cervical and Generalized Dystonia

    Science.gov (United States)

    Broussolle, Emmanuel; Laurencin, Chloé; Bernard, Emilien; Thobois, Stéphane; Danaila, Teodor; Krack, Paul

    2015-01-01

    Background Geste antagoniste, or sensory trick, is a voluntary maneuver that temporarily reduces the severity of dystonic postures or movements. We present a historical review of early reports and illustrations of geste antagoniste. Results In 1894, Brissaud described this phenomenon in Paris in patients with torticollis. He noted that a violent muscular contraction could be reversed by a minor voluntary action. He considered that the improvement obtained by what he called “simple mannerisms, childish behaviour or fake pathological movements” was proof of the psychogenic origin of what he named mental torticollis. This concept was supported by photographic illustrations of the patients. The term geste antagoniste was used by Brissaud’s pupils, Meige and Feindel, in their 1902 monograph on movement disorders. Other reports and illustrations of this sign were published in Europe between 1894 and 1906. Although not mentioned explicitly, geste antagoniste was also illustrated in a case report of generalized dystonia in Oppenheim’s 1911 seminal description of dystonia musculorum deformans in Berlin. Discussion Brissaud-Meige’s misinterpretation of the geste antagoniste unfortunately anchored the psychogenic origin of dystonia for decades. In New York, Herz brought dystonia back into the realm of organic neurology in 1944. Thereafter, it was given prominence by other authors, notably Fahn and Marsden in the 1970s–1980s. Nowadays, neurologists routinely investigate for geste antagoniste when a dystonic syndrome is suspected, because it provides a further argument in favor of dystonia. The term alleviating maneuver was proposed in 2014 to replace sensory trick or geste antagoniste. This major sign is now part of the motor phenomenology of the 2013 Movement Disorder Society’s classification of dystonia. PMID:26417535

  11. Statistics for economics

    CERN Document Server

    Naghshpour, Shahdad

    2012-01-01

    Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...

  12. Statistical correlations in an ideal gas of particles obeying fractional exclusion statistics.

    Science.gov (United States)

    Pellegrino, F M D; Angilella, G G N; March, N H; Pucci, R

    2007-12-01

    After a brief discussion of the concepts of fractional exchange and fractional exclusion statistics, we report partly analytical and partly numerical results on thermodynamic properties of assemblies of particles obeying fractional exclusion statistics. The effect of dimensionality is one focal point, the ratio μ/k_B T of chemical potential to thermal energy being obtained numerically as a function of a scaled particle density. Pair correlation functions are also presented as a function of the statistical parameter, with Friedel oscillations developing close to the fermion limit, for sufficiently large density.

  13. Illustrating answers: an evaluation of automatically retrieved illustrations of answers to medical questions

    NARCIS (Netherlands)

    Bosma, W.E.; Theune, Mariet; van Hooijdonk, C.M.J.; Krahmer, E.; Maes, F.

    In this paper we discuss and evaluate a method for automatic text illustration, applied to answers to medical questions. Our method for selecting illustrations is based on the idea that similarities between the answers and picture-related text (the picture’s caption or the section/paragraph that

  14. Statistics Using Just One Formula

    Science.gov (United States)

    Rosenthal, Jeffrey S.

    2018-01-01

    This article advocates that introductory statistics be taught by basing all calculations on a single, simple margin-of-error formula and deriving all of the standard introductory statistical concepts (confidence intervals, significance tests, comparisons of means and proportions, etc.) from that one formula. It is argued that this approach will…
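
    The single-formula approach can be illustrated with a common classroom margin-of-error formula, MOE = z · s / √n with z ≈ 2 for roughly 95% confidence. The article's exact formula is not reproduced in this record, so the version below is an assumption, and the sample data are made up.

```python
import math

def margin_of_error(sd, n, z=2.0):
    """Classroom margin-of-error formula: z * sd / sqrt(n).
    (A common simple version; the article's exact formula may differ.)"""
    return z * sd / math.sqrt(n)

# Approximate 95% confidence interval for a sample mean (hypothetical data).
sample = [4.8, 5.1, 5.0, 4.7, 5.3, 4.9, 5.2, 5.0]
n = len(sample)
mean = sum(sample) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))
moe = margin_of_error(sd, n)
print(f"mean = {mean:.3f} +/- {moe:.3f}")
```

    A significance test falls out of the same formula: a hypothesized mean lying outside mean ± MOE would be rejected at the corresponding level.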

  15. Bayesian Statistics: Concepts and Applications in Animal Breeding – A Review

    Directory of Open Access Journals (Sweden)

    Lsxmikant-Sambhaji Kokate

    2011-07-01

    Statistics uses two major approaches: conventional (or frequentist) and Bayesian. The Bayesian approach provides a complete paradigm for both statistical inference and decision making under uncertainty. Bayesian methods solve many of the difficulties faced by conventional statistical methods and extend the applicability of statistical methods. They exploit the use of probabilistic models to formulate scientific problems. Using Bayesian statistics presents a computational difficulty and, secondly, Bayesian methods require specifying prior probability distributions. Markov Chain Monte Carlo (MCMC) methods were applied to overcome the computational difficulty, and interest in Bayesian methods was renewed. In Bayesian statistics, the Bayesian structural equation model (SEM) is used. It provides a powerful and flexible approach for studying quantitative traits across a wide spectrum of problems and thus has no operational difficulties, with the exception of some complex cases. With this method, problems are solved with ease, and statisticians find it comfortable for expressing results and for employing the available software to analyze a large variety of problems.
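
    The MCMC idea mentioned in this record can be sketched with a minimal Metropolis sampler. This is a generic illustration (a random-walk sampler targeting a standard normal posterior), not the animal-breeding SEM machinery discussed in the review.

```python
import math
import random

def metropolis(log_post, x0, steps=5000, scale=1.0, seed=42):
    """Minimal Metropolis sampler: random-walk proposals, accepted with
    probability min(1, posterior ratio), computed on the log scale."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(steps):
        proposal = x + rng.gauss(0.0, scale)
        if math.log(rng.random()) < log_post(proposal) - log_post(x):
            x = proposal
        samples.append(x)
    return samples

# Toy posterior: standard normal, log-density -x^2/2 (up to a constant).
# The sample mean of the chain should settle near 0.
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0)
print(sum(draws) / len(draws))
```

    Real applications replace the toy log-posterior with the log of prior times likelihood; the sampler itself never needs the normalizing constant, which is what makes MCMC practical for models like Bayesian SEM.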

  16. Playing at Statistical Mechanics

    Science.gov (United States)

    Clark, Paul M.; And Others

    1974-01-01

    Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
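
    The distributions named in this record have simple closed forms for the mean occupation of an energy level; the sketch below evaluates them directly (standard textbook formulas, not the sorting-game derivation itself).

```python
import math

def fermi_dirac(e, mu, kT):
    """Mean occupation of a level at energy e for fermions."""
    return 1.0 / (math.exp((e - mu) / kT) + 1.0)

def bose_einstein(e, mu, kT):
    """Mean occupation of a level at energy e for bosons (requires e > mu)."""
    return 1.0 / (math.exp((e - mu) / kT) - 1.0)

# At e = mu the Fermi-Dirac occupation is exactly 1/2, independent of kT.
print(fermi_dirac(1.0, 1.0, 0.1))  # 0.5
```
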

  17. Statistics: a Bayesian perspective

    National Research Council Canada - National Science Library

    Berry, Donald A

    1996-01-01

    ...: it is the only introductory textbook based on Bayesian ideas, it combines concepts and methods, it presents statistics as a means of integrating data into the significant process, it develops ideas...

  18. Evolution of illustrations in anatomy: a study from the classical period in Europe to modern times.

    Science.gov (United States)

    Ghosh, Sanjib Kumar

    2015-01-01

    Illustrations constitute an essential element of learning anatomy in modern times. However, it required a significant evolutionary process, spread over centuries, for illustrations to achieve their present status in the subject of anatomy. This review article attempts to outline that evolutionary process by highlighting the works of esteemed anatomists in chronological order. Available literature suggests that illustrations were not used in anatomy during the classical period, when the subject was dominated by the descriptive text of Galen. Guido da Vigevano was the first to use illustrations in anatomy, during the Late Middle Ages, and this concept developed further during the Renaissance, when Andreas Vesalius made illustrations an indispensable tool for conveying anatomical detail. Toward the later stages of the Renaissance, Fabricius ab Aquapendente endeavored to restrict the dramatization of anatomical illustrations, a prevalent trend in the early Renaissance. During the 18th century, anatomical artwork was characterized by the individual styles of prominent anatomists, leading to suppression of anatomical detail. In the 19th century, Henry Gray used illustrations in his anatomical masterpiece that focused on depicting anatomical structures and were free from any artistic style. From the early part of the 20th century, medical images and photographs started to complement traditional handmade anatomical illustrations. Computer technology and advanced software systems played a key role in the evolution of anatomical illustrations during the late 20th century, resulting in new-generation 3D image datasets that are being used in the 21st century in innovative formats for teaching and learning anatomy. © 2014 American Association of Anatomists.

  19. Difficult cases for chromosomal dosimetry: Statistical considerations

    Energy Technology Data Exchange (ETDEWEB)

    Vinnikov, Volodymyr A., E-mail: vlad.vinnikov@mail.ru [Grigoriev Institute for Medical Radiology of the National Academy of Medical Science of Ukraine, Pushkinskaya Street 82, Kharkiv 61024 (Ukraine); Ainsbury, Elizabeth A., E-mail: liz.ainsbury@hpa.org.uk [Health Protection Agency, Centre for Radiation, Chemical and Environmental Hazards, Chilton, Didcot, Oxon OX11 0RQ (United Kingdom); Lloyd, David C., E-mail: david.lloyd@hpa.org.uk [Health Protection Agency, Centre for Radiation, Chemical and Environmental Hazards, Chilton, Didcot, Oxon OX11 0RQ (United Kingdom); Maznyk, Nataliya A., E-mail: maznik.cytogen@mail.ru [Grigoriev Institute for Medical Radiology of the National Academy of Medical Science of Ukraine, Pushkinskaya Street 82, Kharkiv 61024 (Ukraine); Rothkamm, Kai, E-mail: kai.rothkamm@hpa.org.uk [Health Protection Agency, Centre for Radiation, Chemical and Environmental Hazards, Chilton, Didcot, Oxon OX11 0RQ (United Kingdom)

    2011-09-15

    Several examples are selected from the literature in order to illustrate combinations of complicating factors, which may occur in real-life radiation exposure scenarios that affect the accuracy of cytogenetic dose estimates. An analysis of limitations in the current statistical methods used in biodosimetry was carried out. Possible directions for further improvement of the statistical basis of chromosomal dosimetry by specific mathematical procedures are outlined.

  20. Radiology illustrated. Gastrointestinal tract

    International Nuclear Information System (INIS)

    Choi, Byung Ihn

    2015-01-01

    Radiology Illustrated: Gastrointestinal Tract is the second of two volumes designed to provide clear and practical guidance on the diagnostic imaging of abdominal diseases. The book presents approximately 300 cases with 1500 carefully selected and categorized illustrations of gastrointestinal tract diseases, along with key text messages and tables that will help the reader easily to recall the relevant images as an aid to differential diagnosis. Essential points are summarized at the end of each text message to facilitate rapid review and learning. Additionally, brief descriptions of each clinical problem are provided, followed by case studies of both common and uncommon pathologies that illustrate the roles of the different imaging modalities, including ultrasound, radiography, computed tomography, and magnetic resonance imaging.

  1. Radiology illustrated. Gastrointestinal tract

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Byung Ihn (ed.) [Seoul National University Hospital (Korea, Republic of). Dept. of Radiology]

    2015-02-01

    Radiology Illustrated: Gastrointestinal Tract is the second of two volumes designed to provide clear and practical guidance on the diagnostic imaging of abdominal diseases. The book presents approximately 300 cases with 1500 carefully selected and categorized illustrations of gastrointestinal tract diseases, along with key text messages and tables that will help the reader easily to recall the relevant images as an aid to differential diagnosis. Essential points are summarized at the end of each text message to facilitate rapid review and learning. Additionally, brief descriptions of each clinical problem are provided, followed by case studies of both common and uncommon pathologies that illustrate the roles of the different imaging modalities, including ultrasound, radiography, computed tomography, and magnetic resonance imaging.

  2. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  3. Monthly bulletin of statistics. May 1995

    International Nuclear Information System (INIS)

    1995-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  4. Monthly bulletin of statistics. January 1996

    International Nuclear Information System (INIS)

    1996-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  5. Monthly bulletin of statistics. July 1997

    International Nuclear Information System (INIS)

    1997-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  6. Monthly bulletin of statistics. June 2001

    International Nuclear Information System (INIS)

    2001-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  7. Monthly bulletin of statistics. September 1995

    International Nuclear Information System (INIS)

    1995-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  8. Monthly bulletin of statistics. December 1994

    International Nuclear Information System (INIS)

    1994-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  9. Monthly bulletin of statistics. July 1995

    International Nuclear Information System (INIS)

    1995-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  10. Monthly bulletin of statistics. September 1994

    International Nuclear Information System (INIS)

    1994-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  11. Monthly bulletin of statistics. March 1995

    International Nuclear Information System (INIS)

    1994-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  12. Monthly bulletin of statistics. October 2007

    International Nuclear Information System (INIS)

    2007-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  13. Monthly bulletin of statistics. October 1993

    International Nuclear Information System (INIS)

    1993-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  14. Monthly Bulletin of Statistics. July 1993

    International Nuclear Information System (INIS)

    1993-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  15. Monthly bulletin of statistics. March 1994

    International Nuclear Information System (INIS)

    1994-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  16. Monthly bulletin of statistics. February 1994

    International Nuclear Information System (INIS)

    1994-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  17. Monthly bulletin of statistics. June 1995

    International Nuclear Information System (INIS)

    1995-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  18. Monthly bulletin of statistics. November 2008

    International Nuclear Information System (INIS)

    2008-01-01

    The purpose of this publication is to present current monthly economic statistics for most of the countries and territories of the world. In addition, each month a different selection of special tables is presented showing annual and/or quarterly data on a variety of subjects illustrating important economic long-term trends and developments. Most of these special tables are also reproduced in the United Nations Statistical Yearbook. It is, however, considered to be useful to publish these data in the Bulletin as soon as they become available so that readers may have immediate access to the most current international statistical information

  19. Development of a concept for radiation patients exposure assessment during dental X-ray examinations and statistical data acquisition for the determination of a diagnostic reference value

    International Nuclear Information System (INIS)

    Kueppers, C.; Sering, M.; Poppe, B.; Poplawski, A.; Looe, H.K.; Beyer, D.; Pfaffenberger, A.; Chofor, N.; Eenboom, F.

    2012-01-01

    The research project on the development of a concept for assessing patients' radiation exposure during dental X-ray examinations, and on the statistical data acquisition for the determination of a diagnostic reference value, covers the following issues. Fundamental facts: dental X-ray examination techniques, dose-relevant factors and characteristics during X-ray examinations, organs exposed to radiation during dental X-ray examinations, dose assessment based on phantoms. Materials and methodologies of the project: TLD measurements using the phantom, calculation of the effective dose during dental X-ray examinations, properties and settings of the reference facilities for the determination of radiation exposure, selection of dental offices, dosimetric measurements, data acquisition and statistical evaluation. Results of the dosimetric examinations: results of dosimetric measurements at the reference facilities, results of dosimetric measurements in dental offices. Discussion of the concept for the determination of radiation exposure during dental X-ray examinations.

  20. Stupid statistics!

    Science.gov (United States)

    Tellinghuisen, Joel

    2008-01-01

    The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares is presented using matrix notation, and the significance of the key probability distributions (Gaussian, chi-square, and t) is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
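
    As a minimal concrete instance of linear least squares, the closed-form normal-equation solution for a straight-line fit can be written in a few lines. This is an illustrative sketch only; the work described above goes much further, covering weighted, correlated, constrained, and nonlinear fits.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit of y = a + b*x via the normal equations."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    a = (sy - b * sx) / n                          # intercept
    return a, b

# Noise-free data on the line y = 1 + 2x are recovered exactly.
a, b = linear_fit([0, 1, 2, 3], [1, 3, 5, 7])
print(a, b)  # 1.0 2.0
```
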

  1. A DIY guide to statistical strategy

    International Nuclear Information System (INIS)

    Pregibon, D.

    1986-01-01

    This chapter is a do-it-yourself (DIY) guide to developing statistical strategy for a particular data analysis task. The primary audience of the chapter is research statisticians. The material should also be of interest to nonstatisticians, especially knowledge engineers, who are interested in using statistical data analysis as an application domain for Artificial Intelligence techniques. The do's and don'ts of strategy development are outlined. The "linear regression" task is used to illustrate many of the ideas.

  2. From curve fitting to machine learning an illustrative guide to scientific data analysis and computational intelligence

    CERN Document Server

    Zielesny, Achim

    2016-01-01

    This successful book provides in its second edition an interactive and illustrative guide from two-dimensional curve fitting to multidimensional clustering and machine learning with neural networks or support vector machines. Along the way topics like mathematical optimization or evolutionary algorithms are touched. All concepts and ideas are outlined in a clear cut manner with graphically depicted plausibility arguments and a little elementary mathematics. The major topics are extensively outlined with exploratory examples and applications. The primary goal is to be as illustrative as possible without hiding problems and pitfalls but to address them. The character of an illustrative cookbook is complemented with specific sections that address more fundamental questions like the relation between machine learning and human intelligence. All topics are completely demonstrated with the computing platform Mathematica and the Computational Intelligence Packages (CIP), a high-level function library developed with M...

  3. Lectures on algebraic statistics

    CERN Document Server

    Drton, Mathias; Sullivant, Seth

    2009-01-01

    How does an algebraic geometer studying secant varieties further the understanding of hypothesis tests in statistics? Why would a statistician working on factor analysis raise open problems about determinantal varieties? Connections of this type are at the heart of the new field of "algebraic statistics". In this field, mathematicians and statisticians come together to solve statistical inference problems using concepts from algebraic geometry as well as related computational and combinatorial techniques. The goal of these lectures is to introduce newcomers from the different camps to algebraic statistics. The introduction will be centered around the following three observations: many important statistical models correspond to algebraic or semi-algebraic sets of parameters; the geometry of these parameter spaces determines the behaviour of widely used statistical inference procedures; computational algebraic geometry can be used to study parameter spaces and other features of statistical models.

  4. Structural reliability in context of statistical uncertainties and modelling discrepancies

    International Nuclear Information System (INIS)

    Pendola, Maurice

    2000-01-01

    Structural reliability methods have been largely improved during the last years and have showed their ability to deal with uncertainties during the design stage or to optimize the functioning and the maintenance of industrial installations. They are based on a mechanical modeling of the structural behavior according to the considered failure modes and on a probabilistic representation of input parameters of this modeling. In practice, only limited statistical information is available to build the probabilistic representation and different sophistication levels of the mechanical modeling may be introduced. Thus, besides the physical randomness, other uncertainties occur in such analyses. The aim of this work is triple: 1. at first, to propose a methodology able to characterize the statistical uncertainties due to the limited number of data in order to take them into account in the reliability analyses. The obtained reliability index measures the confidence in the structure considering the statistical information available. 2. Then, to show a methodology leading to reliability results evaluated from a particular mechanical modeling but by using a less sophisticated one. The objective is then to decrease the computational efforts required by the reference modeling. 3. Finally, to propose partial safety factors that are evolving as a function of the number of statistical data available and as a function of the sophistication level of the mechanical modeling that is used. The concepts are illustrated in the case of a welded pipe and in the case of a natural draught cooling tower. The results show the interest of the methodologies in an industrial context. [fr
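
    The reliability-index idea underlying this work can be sketched for the simplest possible limit state (a toy example of our own, not the thesis's welded-pipe or cooling-tower models): for g = R - S with independent normal resistance R and load S, the index has a closed form, and a crude Monte Carlo run can check the implied failure probability.

```python
import math
import numpy as np

# Linear limit state g = R - S with independent normal R and S:
# beta = (mu_R - mu_S) / sqrt(sig_R^2 + sig_S^2), Pf = Phi(-beta).
mu_R, sig_R = 10.0, 1.0   # resistance (invented numbers)
mu_S, sig_S = 6.0, 1.5    # load effect (invented numbers)

beta = (mu_R - mu_S) / math.sqrt(sig_R**2 + sig_S**2)

# Crude Monte Carlo estimate of the failure probability P(g < 0)
rng = np.random.default_rng(1)
n = 200_000
g = rng.normal(mu_R, sig_R, n) - rng.normal(mu_S, sig_S, n)
pf_mc = np.mean(g < 0.0)

# Exact value from the standard normal CDF
pf_exact = 0.5 * math.erfc(beta / math.sqrt(2.0))

print("beta       = %.3f" % beta)
print("Pf (MC)    = %.5f" % pf_mc)
print("Pf (exact) = %.5f" % pf_exact)
```

    Real analyses replace the closed form with a mechanical model of the failure mode, which is exactly where the thesis's questions about statistical uncertainty and model sophistication arise.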

  5. Illustrations and supporting texts for sound standing waves of air columns in pipes in introductory physics textbooks

    Directory of Open Access Journals (Sweden)

    Liang Zeng

    2014-07-01

    Full Text Available In our pilot studies, we found that many introductory physics textbook illustrations with supporting text for sound standing waves of air columns in open-open, open-closed, and closed-closed pipes inhibit student understanding of sound standing wave phenomena due to student misunderstanding of how air molecules move within these pipes. Based on the construct of meaningful learning from cognitive psychology and semiotics, a quasiexperimental study was conducted to investigate the comparative effectiveness of two alternative approaches to student understanding: a traditional textbook illustration approach versus a newly designed air molecule motion illustration approach. Thirty volunteer students from introductory physics classes were randomly assigned to two groups of 15 each. Both groups were administered a presurvey. Then, group A read the air molecule motion illustration handout, and group B read a traditional textbook illustration handout; both groups were administered postsurveys. Subsequently, the procedure was reversed: group B read the air molecule motion illustration handout and group A read the traditional textbook illustration handout. This was followed by a second postsurvey along with an exit research questionnaire. The study found that the majority of students experienced meaningful learning and stated that they understood sound standing wave phenomena significantly better using the air molecule motion illustration approach. This finding provides a method for physics education researchers to design illustrations for abstract sound standing wave concepts, for publishers to improve their illustrations with supporting text, and for instructors to facilitate deeper learning in their students on sound standing waves.

  6. A statistical manual for chemists

    CERN Document Server

    Bauer, Edward

    1971-01-01

    A Statistical Manual for Chemists, Second Edition presents simple and fast statistical tools for data analysis of working chemists. This edition is organized into nine chapters and begins with an overview of the fundamental principles of the statistical techniques used in experimental data analysis. The subsequent chapters deal with the concept of statistical average, experimental design, and analysis of variance. The discussion then shifts to control charts, with particular emphasis on variable charts that are more useful to chemists and chemical engineers. A chapter focuses on the effect

  7. Causality Statistical Perspectives and Applications

    CERN Document Server

    Berzuini, Carlo; Bernardinell, Luisa

    2012-01-01

    A state of the art volume on statistical causality Causality: Statistical Perspectives and Applications presents a wide-ranging collection of seminal contributions by renowned experts in the field, providing a thorough treatment of all aspects of statistical causality. It covers the various formalisms in current use, methods for applying them to specific problems, and the special requirements of a range of examples from medicine, biology and economics to political science. This book:Provides a clear account and comparison of formal languages, concepts and models for statistical causality. Addr

  8. The conceptual basis of mathematics in cardiology IV: statistics and model fitting.

    Science.gov (United States)

    Bates, Jason H T; Sobel, Burton E

    2003-06-01

    This is the fourth in a series of four articles developed for the readers of Coronary Artery Disease. Without language, ideas cannot be articulated. What may not be so immediately obvious is that they cannot be formulated either. One of the essential languages of cardiology is mathematics. Unfortunately, medical education does not emphasize, and in fact often neglects, empowering physicians to think mathematically. Reference to statistics, conditional probability, multicompartmental modeling, algebra, calculus and transforms is common but often without provision of genuine conceptual understanding. At the University of Vermont College of Medicine, Professor Bates developed a course designed to address these deficiencies. The course covered mathematical principles pertinent to clinical cardiovascular and pulmonary medicine and research. It focused on fundamental concepts to facilitate formulation and grasp of ideas. This series of four articles was developed to make the material available for a wider audience. The articles will be published sequentially in Coronary Artery Disease. Beginning with fundamental axioms and basic algebraic manipulations, they address algebra, function and graph theory, real and complex numbers, calculus and differential equations, mathematical modeling, linear system theory and integral transforms, and statistical theory. The principles and concepts they address provide the foundation needed for in-depth study of any of these topics. Perhaps of even more importance, they should empower cardiologists and cardiovascular researchers to utilize the language of mathematics in assessing the phenomena of immediate pertinence to diagnosis, pathophysiology and therapeutics. The presentations are interposed with queries (by Coronary Artery Disease, abbreviated as CAD) simulating the nature of interactions that occurred during the course itself. Each article concludes with one or more examples illustrating application of the concepts covered to

  9. Statistical model with two order parameters for ductile and soft fiber bundles in nanoscience and biomaterials.

    Science.gov (United States)

    Rinaldi, Antonio

    2011-04-01

    Traditional fiber bundle models (FBMs) have been an effective tool to understand brittle heterogeneous systems. However, fiber bundles in modern nano- and bioapplications demand a new generation of FBMs capturing more complex deformation processes in addition to damage. In the context of loose bundle systems and with reference to time-independent plasticity and soft biomaterials, we formulate a generalized statistical model for ductile fracture and nonlinear elastic problems capable of handling more simultaneous deformation mechanisms by means of two order parameters (as opposed to one). As the first rational FBM for coupled damage problems, it may be the cornerstone for advanced statistical models of heterogeneous systems in nanoscience and materials design, especially to explore hierarchical and bio-inspired concepts in the arena of nanobiotechnology. Finally, applicative examples are provided for illustrative purposes, discussing issues in inverse analysis (i.e., nonlinear elastic polymer fibers and ductile Cu submicron bar arrays) and direct design (i.e., strength prediction).
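
    For orientation, the classical brittle equal-load-sharing fiber bundle that such generalized models extend can be simulated in a few lines (a sketch of the standard one-order-parameter FBM, not the article's two-order-parameter model; the uniform threshold distribution is our choice):

```python
import numpy as np

# Equal-load-sharing fiber bundle: N fibers with random strength thresholds.
# When the k weakest fibers have failed, the surviving N-k fibers share the
# load equally, so the bundle strength is max over k of
# sigma_(k) * (N - k) / N, with sigma_(k) the sorted thresholds.
rng = np.random.default_rng(42)
N = 10_000
thresholds = np.sort(rng.uniform(0.0, 1.0, N))

surviving = (N - np.arange(N)) / N       # fraction of fibers still carrying load
bundle_stress = thresholds * surviving   # stress just before each failure
strength = bundle_stress.max()

# For uniform thresholds on [0, 1] the N -> infinity strength is
# max_x x*(1 - x) = 1/4, attained at x = 1/2.
print("simulated bundle strength: %.4f (theory: 0.25)" % strength)
```

    The generalized model in the article adds a second order parameter to capture plastic or nonlinear-elastic deformation on top of this damage variable.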

  10. Medical/Scientific Illustration And Production Of Otological Health Awareness Materials

    Science.gov (United States)

    Hawes, Nicholas E.

    2004-01-01

    Over the past year, I have worked for my mentor, Beth Cooper, on a large variety of projects. Beth is the Manager of the Acoustical Testing Laboratory, which tests the acoustical emissions of payloads destined for the International Space Station. She is also responsible for educating, and developing new methods of educating, people of all occupational and educational backgrounds in hearing conservation. Beth spends much of her time developing new materials and strategies with which to train people and teach other people to train people in hearing conservation and noise emissions control. I have been helping Beth develop and market these materials by way of graphic design and scientific illustration. Last summer, I spent much of my time creating educational illustrations that visually explained particular concepts in Beth's presentations. Sometimes these illustrations were small "comics" while, at other times, they were an instructional series of illustrations. Since then, Beth and her lab have been developing and updating some materials which will be distributed free to hearing conservation and noise control professionals and others in related fields. I have helped with these projects by designing their packaging. In each instance, it was my responsibility to develop an aesthetically appealing package that would also, through its imagery, describe or summarize the contents of the product. I did this for 3 CDs (Auditory Demonstrations 11, MACSUG, and JeopEARdy) and saw them through their actual production and distribution. In addition to working with Beth, I work with the Imaging Technology Center on various imaging projects. Some of my activities include photo retouching and manipulation for videos and print. This summer, I also had the opportunity to develop a screen saver that would show off some of the photography contained on the soon-to-be-released "Highlights of the GRC Image Archives, vol. 2".
I was also able to utilize my medical training to help several of

  11. Concept research on general passive system

    International Nuclear Information System (INIS)

    Han Xu; Yang Yanhua; Zheng Mingguang

    2009-01-01

    This paper summarized the current passive techniques used in nuclear power plants. Through classification and analysis, the functional characteristics and inherent features of passive systems were elucidated. By improving and extending the concept of a passive system, the general passive concept was proposed; its relativity in space and time was discussed, and the assumptions of the general passive system were illustrated. The function of the idealized general passive system is equivalent to that of current passive systems, but its design is more flexible. (authors)

  12. On the Distinction Between the Motivating Operation and Setting Event Concepts.

    Science.gov (United States)

    Nosik, Melissa R; Carr, James E

    2015-10-01

    In recent decades, behavior analysts have generally used two different concepts to speak about motivational influences on operant contingencies: setting event and motivating operation. Although both concepts still appear in the contemporary behavior-analytic literature and were designed to address the same antecedent phenomena, the concepts are quite different. The purpose of the present article is to describe and distinguish the concepts and to illustrate their current usage.

  13. Teaching Statistics in Middle School Mathematics Classrooms: Making Links with Mathematics but Avoiding Statistical Reasoning

    Science.gov (United States)

    Savard, Annie; Manuel, Dominic

    2015-01-01

    Statistics is a domain that is taught in Mathematics at all school levels. We see potential in using an interdisciplinary approach to this concept. Thus, developing an understanding of a situation might mean using both mathematical and statistical reasoning. In this paper, we present two case studies where two middle school…

  14. Health Habit: A Concept Analysis.

    Science.gov (United States)

    Opalinski, Andra S; Weglicki, Linda S; Gropper, Sareen S

    2018-01-01

    The aim of this article is to provide clarity of the concept of health habit. Using Walker and Avant's (1983; 2010) method for conducting a concept analysis, the authors identify the attributes and characteristics of health habit, its theoretical and practical application to nursing, and sample cases to further illustrate the concept. Empirical and conceptual literature was used to inform this concept analysis. Articles and one book from 1977 to 2014 were reviewed from PsycINFO, Medline, Cumulative Index to Nursing Health Literature (CINAHL), Science Direct, EBSCOhost and Web of Science. Offering a clear definition and conceptual model of health habit provides the foundation to identify or develop appropriate measures of the concept and to guide further investigation into the development and sustainability of healthy habits. Additional research is needed to test the conceptual relationships between health habits and outcome variables as they apply to different groups across the age continuum. © 2017 Wiley Periodicals, Inc.

  15. Learning Psychological Research and Statistical Concepts using Retrieval-based Practice

    Directory of Open Access Journals (Sweden)

    Stephen Wee Hun eLim

    2015-10-01

    Full Text Available Research methods and statistics are an indispensable subject in the undergraduate psychology curriculum, but there are challenges associated with teaching it, such as making learning durable. Here we hypothesized that retrieval-based learning promotes long-term retention of statistical knowledge in psychology. Participants either studied the educational material in four consecutive periods, or studied it just once and practised retrieving the information in the subsequent three periods, and then took a final test through which their learning was assessed. Whereas repeated studying yielded better test performance when the final test was administered immediately, retrieval practice yielded better performance when the test was administered a week later. The data suggest that retrieval practice enhanced the learning of statistical knowledge in psychology, producing better long-term retention than repeated studying did.

  16. Learning Psychological Research and Statistical Concepts using Retrieval-based Practice

    OpenAIRE

    Stephen Wee Hun eLim; Gavin Jun Peng eNg; Gabriel Qi Hao eWong

    2015-01-01

    Research methods and statistics are an indispensable subject in the undergraduate psychology curriculum, but there are challenges associated with engaging students in it, such as making learning durable. Here we hypothesized that retrieval-based learning promotes long-term retention of statistical knowledge in psychology. Participants either studied the educational material in four consecutive periods, or studied it just once and practiced retrieving the information in the subsequent three pe...

  17. An easy and low cost option for economic statistical process control ...

    African Journals Online (AJOL)

    An easy and low cost option for economic statistical process control using Excel. ... in both economic and economic statistical designs of the X-control chart. ... in this paper and the numerical examples illustrated are executed on this program.

  18. Weighted A-statistical convergence for sequences of positive linear operators.

    Science.gov (United States)

    Mohiuddine, S A; Alotaibi, Abdullah; Hazarika, Bipan

    2014-01-01

    We introduce the notion of weighted A-statistical convergence of a sequence, where A represents a nonnegative regular matrix. We also prove the Korovkin approximation theorem by using the notion of weighted A-statistical convergence. Further, we give a rate of weighted A-statistical convergence and apply the classical Bernstein polynomial to construct an illustrative example in support of our result.
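
    The Bernstein-polynomial example mentioned above rests on the classical operator B_n(f; x) = Σ_k f(k/n) C(n,k) x^k (1-x)^(n-k), whose uniform convergence to f on [0, 1] is the prototype of Korovkin-type results. A small numerical sketch (our own, with an invented test function) shows the convergence at work:

```python
import math

def bernstein(f, n, x):
    """Bernstein polynomial B_n(f; x) on [0, 1]."""
    return sum(
        f(k / n) * math.comb(n, k) * x**k * (1.0 - x) ** (n - k)
        for k in range(n + 1)
    )

f = lambda t: t * t   # test function f(t) = t^2
x = 0.3
for n in (10, 100, 1000):
    approx = bernstein(f, n, x)
    # Known closed form: B_n(t^2; x) = x^2 + x(1 - x)/n, so the error is
    # exactly x(1 - x)/n and shrinks like 1/n.
    print("n=%4d  B_n = %.6f  error = %.2e" % (n, approx, abs(approx - f(x))))
```

    The O(1/n) error visible here is what "rate of convergence" results, such as the weighted A-statistical rates in the paper, make precise in more general settings.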

  19. Basic Concepts of CNS Development.

    Science.gov (United States)

    Nowakowski, R. S.

    1987-01-01

    The goals of this review are to: (1) provide a set of concepts to aid in the understanding of complex processes which occur during central nervous system (CNS) development; (2) illustrate how they contribute to our knowledge of adult brain anatomy; and (3) delineate how modifications of normal developmental processes may affect the structure and…

  20. A Palatable Introduction to and Demonstration of Statistical Main Effects and Interactions

    Science.gov (United States)

    Christopher, Andrew N.; Marek, Pam

    2009-01-01

    Because concrete explanations in a familiar context facilitate understanding, we illustrate the concept of an interaction via a baking analogy to provide students with food for thought. The demonstration initially introduces the concepts of independent and dependent variables using a chocolate chip cookie recipe. The demonstration provides an…

  1. Naïve conceptions about multimedia learning: a study on primary school textbooks.

    Science.gov (United States)

    Colombo, Barbara; Antonietti, Alessandro

    2013-01-01

    Highlights: This interview study explores beliefs about the instructional role of illustrations. We compared illustrators', teachers', students', and common people's ideas. Participants' responses were internally coherent and close to multimedia learning theory. We propose and discuss an integrated multimedia learning model. An interview study, based on specific pictures taken from textbooks used in primary schools, was carried out to investigate illustrators', teachers', students', and common people's beliefs about the role that illustrations play in facilitating learning. Participants' responses were internally coherent, indicating a systematic nature of the underlying naïve conceptions. Findings disprove Mayer's pessimistic claim that laypersons' conceptions of multimedia learning fail to match experimentally supported principles and theories. On the contrary, interviewees spontaneously came very close to the multimedia learning theory, which states that students learn better from pictures which fit specific cognitive principles. Implications for school instruction are highlighted.

  2. Prospective Elementary Teachers' Conceptions of Unitizing with Whole Numbers and Fractions

    Science.gov (United States)

    Tobias, Jennifer M.; Roy, George J.; Safi, Farshid

    2015-01-01

    This article examines prospective elementary teachers' conceptions of unitizing with whole numbers and fraction concepts and operations throughout a semester-long mathematics content course. Student work samples and classroom conversations are used to illustrate the types of unitizing understandings that prospective teachers bring to teacher…

  3. Radiation epidemiology: concept, methodology and statistical resources

    International Nuclear Information System (INIS)

    Vasques, Monica H. Braga; Carneiro, Janete C. Gaburo; Sordi, Gian M.A.

    2008-01-01

    As radiation exposure is the main point of interest in radiation epidemiology, epidemiologists try to relate the risk of diseases (mainly the risk of cancer) to the different levels and patterns of human exposure to radiation. Statistics, as a branch of mathematics, is able to prove associations and infer causality. As many studies are subject to methodological limitations, mainly those related to insufficient sample size and descriptive analysis as well as to the choice of methods and variables, this paper aims first at describing the main kinds of epidemiological studies. Secondly, it presents the distributions and summary measures (measures of central tendency, measures of dispersion, and the normal distribution) and the hypothesis tests required for each study. It also discusses the statistical resource most appropriate to the epidemiological evaluation. Finally, the main aim of this study is to elaborate a systematic review of the research already done in Brazil since 2000, focusing on the effects caused by occupational exposure to ionizing radiation, in order to establish positive associations and to analyze the risk to workers' health. This paper is based on the Reports in Public Health (Public Health Books-CSP), from which several studies about the effects of exposure to ionizing radiation and related kinds of cancer (e.g., leukemia, skin cancer, thyroid cancer and bone cancer) have been taken as objects of analysis. The relevance of this study lies in the methods most applied to establish positive risk associations with ionizing radiation and in the relation between workers' workplaces and their health. (author)

  4. 45 CFR 1170.13 - Illustrative examples.

    Science.gov (United States)

    2010-10-01

    45 Public Welfare 3 (2010-10-01). ... ASSISTED PROGRAMS OR ACTIVITIES, Discrimination Prohibited, § 1170.13 Illustrative examples. (a) The following examples will illustrate the application of the foregoing provisions to some of the activities...

  5. Using Statistical Process Control to Make Data-Based Clinical Decisions.

    Science.gov (United States)

    Pfadt, Al; Wheeler, Donald J.

    1995-01-01

    Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…
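
    A minimal sketch of the control-chart calculation at the heart of SPC (our own example with synthetic in-control data, not the article's behavioral data): subgroup means are plotted against 3-sigma limits built from the average subgroup range with the standard A2 factor.

```python
import numpy as np

# Shewhart X-bar chart limits from subgroup means and the average range.
rng = np.random.default_rng(7)
A2 = 0.577  # standard X-bar chart constant for subgroups of size 5

# 25 subgroups of 5 observations from an in-control process
data = rng.normal(loc=50.0, scale=2.0, size=(25, 5))
xbar = data.mean(axis=1)            # subgroup means
rbar = np.ptp(data, axis=1).mean()  # average subgroup range

center = xbar.mean()
ucl = center + A2 * rbar   # upper control limit
lcl = center - A2 * rbar   # lower control limit

# Points outside the limits would signal special-cause variation
out_of_control = np.sum((xbar > ucl) | (xbar < lcl))
print("center=%.2f  LCL=%.2f  UCL=%.2f  signals=%d"
      % (center, lcl, ucl, out_of_control))
```

    In a clinical application, each subgroup would be a block of behavioral observations, and an out-of-limits point would prompt a data-based treatment decision rather than a reaction to ordinary noise.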

  6. Artificial intelligence approaches in statistics

    International Nuclear Information System (INIS)

    Phelps, R.I.; Musgrove, P.B.

    1986-01-01

    The role of pattern recognition and knowledge representation methods from Artificial Intelligence within statistics is considered. Two areas of potential use are identified and one, data exploration, is used to illustrate the possibilities. A method is presented to identify and separate overlapping groups within cluster analysis, using an AI approach. The potential of such "intelligent" approaches is stressed.

  7. Some properties of point processes in statistical optics

    International Nuclear Information System (INIS)

    Picinbono, B.; Bendjaballah, C.

    2010-01-01

    The analysis of the statistical properties of the point process (PP) of photon detection times can be used to determine whether or not an optical field is classical, in the sense that its statistical description does not require the methods of quantum optics. This determination is, however, more difficult than ordinarily admitted and the first aim of this paper is to illustrate this point by using some results of the PP theory. For example, it is well known that the analysis of the photodetection of classical fields exhibits the so-called bunching effect. But this property alone cannot be used to decide the nature of a given optical field. Indeed, we have presented examples of point processes for which a bunching effect appears and yet they cannot be obtained from a classical field. These examples are illustrated by computer simulations. Similarly, it is often admitted that for fields with very low light intensity the bunching or antibunching can be described by using the statistical properties of the distance between successive events of the point process, which simplifies the experimental procedure. We have shown that, while this property is valid for classical PPs, it has no reason to be true for nonclassical PPs, and we have presented some examples of this situation also illustrated by computer simulations.
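
    The counting statistics discussed above can be illustrated with a small simulation (our own construction in the spirit of the paper's computer simulations, not its actual examples): a homogeneous Poisson process has a Fano factor var(N)/mean(N) of 1, while a doubly stochastic (Cox) process, whose intensity itself fluctuates as in a thermal-like field, shows bunched counts with a Fano factor above 1.

```python
import numpy as np

rng = np.random.default_rng(3)
T = 1.0           # counting window
trials = 50_000   # number of independent windows

# Homogeneous Poisson process with fixed rate 10
n_poisson = rng.poisson(10.0 * T, trials)

# Cox process: the rate fluctuates (exponential, mean 10) from window
# to window, producing excess count variance ("bunching")
rates = rng.exponential(10.0, trials)
n_cox = rng.poisson(rates * T)

fano = lambda n: n.var() / n.mean()
# For the exponential-rate Cox process, Fano = 1 + mean(rate)*T = 11
print("Fano (Poisson) = %.3f" % fano(n_poisson))
print("Fano (Cox)     = %.3f" % fano(n_cox))
```

    As the paper stresses, bunching alone does not establish a classical field; the point here is only how count statistics separate the two constructed processes.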

  8. Statistical inference a short course

    CERN Document Server

    Panik, Michael J

    2012-01-01

    A concise, easily accessible introduction to descriptive and inferential techniques Statistical Inference: A Short Course offers a concise presentation of the essentials of basic statistics for readers seeking to acquire a working knowledge of statistical concepts, measures, and procedures. The author conducts tests on the assumption of randomness and normality, provides nonparametric methods when parametric approaches might not work. The book also explores how to determine a confidence interval for a population median while also providing coverage of ratio estimation, randomness, and causal

  9. Quantum-statistical kinetic equations

    International Nuclear Information System (INIS)

    Loss, D.; Schoeller, H.

    1989-01-01

    Considering a homogeneous normal quantum fluid consisting of identical interacting fermions or bosons, the authors derive an exact quantum-statistical generalized kinetic equation with a collision operator given as an explicit cluster series where exchange effects are included through renormalized Liouville operators. This new result is obtained by applying a recently developed superoperator formalism (Liouville operators, cluster expansions, symmetrized projectors, P_q-rule, etc.) to nonequilibrium systems described by a density operator ρ(t) which obeys the von Neumann equation. By means of this formalism a factorization theorem is proven (being essential for obtaining closed equations), and partial resummations (leading to renormalized quantities) are performed. As an illustrative application, the quantum-statistical versions (including exchange effects due to Fermi-Dirac or Bose-Einstein statistics) of the homogeneous Boltzmann (binary collisions) and Choh-Uhlenbeck (triple collisions) equations are derived.

  10. Identifying overrepresented concepts in gene lists from literature: a statistical approach based on Poisson mixture model

    Directory of Open Access Journals (Sweden)

    Zhai Chengxiang

    2010-05-01

    Full Text Available Abstract Background Large-scale genomic studies often identify large gene lists, for example, the genes sharing the same expression patterns. The interpretation of these gene lists is generally achieved by extracting concepts overrepresented in the gene lists. This analysis often depends on manual annotation of genes based on controlled vocabularies, in particular, Gene Ontology (GO. However, the annotation of genes is a labor-intensive process; and the vocabularies are generally incomplete, leaving some important biological domains inadequately covered. Results We propose a statistical method that uses the primary literature, i.e. free-text, as the source to perform overrepresentation analysis. The method is based on a statistical framework of mixture model and addresses the methodological flaws in several existing programs. We implemented this method within a literature mining system, BeeSpace, taking advantage of its analysis environment and added features that facilitate the interactive analysis of gene sets. Through experimentation with several datasets, we showed that our program can effectively summarize the important conceptual themes of large gene sets, even when traditional GO-based analysis does not yield informative results. Conclusions We conclude that the current work will provide biologists with a tool that effectively complements the existing ones for overrepresentation analysis from genomic experiments. Our program, Genelist Analyzer, is freely available at: http://workerbee.igb.uiuc.edu:8080/BeeSpace/Search.jsp
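
    The elementary overrepresentation calculation that such tools refine can be sketched as follows (a simplification of our own using a plain Poisson tail, not BeeSpace's Poisson mixture model; all counts are invented): given a concept annotated to K of M background genes, the chance of seeing k or more annotated genes in a list of n is approximated by a Poisson tail with mean n*K/M.

```python
from math import exp, factorial

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam)."""
    return 1.0 - sum(exp(-lam) * lam**i / factorial(i) for i in range(k))

M, K = 20_000, 100   # background genes; genes annotated with the concept
n, k = 200, 8        # gene-list size; annotated genes observed in the list

lam = n * K / M      # expected annotated hits by chance
p = poisson_sf(k, lam)
print("expected hits = %.2f, observed = %d, p ~ %.2e" % (lam, k, p))
```

    A tiny tail probability flags the concept as overrepresented; the mixture model in the paper improves on this baseline by modeling the count distribution across concepts rather than assuming a single Poisson.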

  11. Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference

    Energy Technology Data Exchange (ETDEWEB)

    Beggs, W.J.

    1981-02-01

    This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.

  12. Accounting for uncertainty in ecological analysis: the strengths and limitations of hierarchical statistical modeling.

    Science.gov (United States)

    Cressie, Noel; Calder, Catherine A; Clark, James S; Ver Hoef, Jay M; Wikle, Christopher K

    2009-04-01

    Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.

  13. Statistical analysis of management data

    CERN Document Server

    Gatignon, Hubert

    2013-01-01

    This book offers a comprehensive approach to multivariate statistical analyses. It provides theoretical knowledge of the concepts underlying the most important multivariate techniques and an overview of actual applications.

  14. Representative volume size: A comparison of statistical continuum mechanics and statistical physics

    Energy Technology Data Exchange (ETDEWEB)

    AIDUN,JOHN B.; TRUCANO,TIMOTHY G.; LO,CHI S.; FYE,RICHARD M.

    1999-05-01

    In this combination background and position paper, the authors argue that careful work is needed to develop accurate methods for relating the results of fine-scale numerical simulations of material processes to meaningful values of macroscopic properties for use in constitutive models suitable for finite element solid mechanics simulations. To provide a definite context for this discussion, the problem is couched in terms of the lack of general objective criteria for identifying the size of the representative volume (RV) of a material. The objective of this report is to lay out at least the beginnings of an approach for applying results and methods from statistical physics to develop concepts and tools necessary for determining the RV size, as well as alternatives to RV volume-averaging for situations in which the RV is unmanageably large. The background necessary to understand the pertinent issues and statistical physics concepts is presented.

  15. Explorations in Statistics: Hypothesis Tests and P Values

    Science.gov (United States)

    Curran-Everett, Douglas

    2009-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This second installment of "Explorations in Statistics" delves into test statistics and P values, two concepts fundamental to the test of a scientific null hypothesis. The essence of a test statistic is that it compares what…
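
    The logic this record describes — a test statistic compared against its null distribution to give a P value — can be sketched with a small numerical example. The data below are hypothetical, and a sign-flip permutation (exact under a symmetric null) stands in for the tabulated t distribution:

```python
import math

# Hypothetical sample; H0: the population mean is 0.
sample = [1.2, 0.4, -0.3, 0.8, 1.5, 0.9, -0.1, 0.7]
n = len(sample)
mean = sum(sample) / n
var = sum((x - mean) ** 2 for x in sample) / (n - 1)
t = mean / math.sqrt(var / n)  # one-sample t statistic

# Two-sided P value by enumerating all 2^n sign flips of the data
# (each flip is equally likely under a symmetric null hypothesis).
count, total = 0, 2 ** n
for mask in range(total):
    flipped = [x if (mask >> i) & 1 else -x for i, x in enumerate(sample)]
    m = sum(flipped) / n
    v = sum((x - m) ** 2 for x in flipped) / (n - 1)
    t_star = m / math.sqrt(v / n) if v > 0 else 0.0
    count += abs(t_star) >= abs(t) - 1e-12
p_value = count / total
print(round(t, 3), p_value)  # → 2.937 0.0390625
```

Here the observed statistic is extreme enough that the null hypothesis would be rejected at the conventional 5% level.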

  16. Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures

    Energy Technology Data Exchange (ETDEWEB)

    Udey, Ruth Norma [Michigan State Univ., East Lansing, MI (United States)

    2013-01-01

    Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.

  17. Weighted A-Statistical Convergence for Sequences of Positive Linear Operators

    Directory of Open Access Journals (Sweden)

    S. A. Mohiuddine

    2014-01-01

    Full Text Available We introduce the notion of weighted A-statistical convergence of a sequence, where A is a nonnegative regular matrix. We also prove the Korovkin approximation theorem using the notion of weighted A-statistical convergence. Further, we give a rate of weighted A-statistical convergence and apply the classical Bernstein polynomial to construct an illustrative example in support of our result.
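
    The classical Bernstein polynomial the record applies can be sketched directly (this is the ordinary operator, not the weighted A-statistical version studied in the paper, and the test function f(x) = x² is assumed only for illustration). For f(x) = x² the operator has the closed form B_n(f; x) = x² + x(1 − x)/n, so the 1/n convergence is visible numerically:

```python
from math import comb

def bernstein(f, n, x):
    """Evaluate the degree-n Bernstein polynomial of f at x in [0, 1]."""
    return sum(f(k / n) * comb(n, k) * x**k * (1 - x)**(n - k) for k in range(n + 1))

f = lambda x: x * x
for n in (10, 100, 1000):
    print(n, round(bernstein(f, n, 0.5), 6))
# → 10 0.275
#   100 0.2525
#   1000 0.25025   (approaching f(0.5) = 0.25 at rate 1/n)
```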

  18. An introduction to statistical mechanics and thermodynamics

    CERN Document Server

    Swendsen, Robert H

    2012-01-01

    This text presents the two complementary aspects of thermal physics as an integrated theory of the properties of matter. Conceptual understanding is promoted by thorough development of basic concepts. In contrast to many texts, statistical mechanics, including discussion of the required probability theory, is presented first. This provides a statistical foundation for the concept of entropy, which is central to thermal physics. A unique feature of the book is the development of entropy based on Boltzmann's 1877 definition; this avoids contradictions or ad hoc corrections found in other texts.

  19. Structural concepts and details for seismic design

    International Nuclear Information System (INIS)

    Johnson, M.W.; Smietana, E.A.; Murray, R.C.

    1991-01-01

    As a part of the DOE Natural Phenomena Hazards Program, a new manual has been developed, entitled UCRL-CR-106554, "Structural Concepts and Details for Seismic Design." This manual describes and illustrates good practice for seismic-resistant design

  20. Applied statistical thermodynamics

    CERN Document Server

    Lucas, Klaus

    1991-01-01

    The book guides the reader from the foundations of statistical thermodynamics including the theory of intermolecular forces to modern computer-aided applications in chemical engineering and physical chemistry. The approach is new. The foundations of quantum and statistical mechanics are presented in a simple way and their applications to the prediction of fluid phase behavior of real systems are demonstrated. A particular effort is made to introduce the reader to explicit formulations of intermolecular interaction models and to show how these models influence the properties of fluid systems. The established methods of statistical mechanics - computer simulation, perturbation theory, and numerical integration - are discussed in a style appropriate for newcomers and are extensively applied. Numerous worked examples illustrate how practical calculations should be carried out.

  1. Using pattern structures to support information retrieval with Formal Concept Analysis

    OpenAIRE

    Codocedo , Victor; Lykourentzou , Ioanna; Astudillo , Hernan; Napoli , Amedeo

    2013-01-01

    International audience; In this paper we introduce a novel approach to information retrieval (IR) based on Formal Concept Analysis (FCA). The use of concept lattices to support the task of document retrieval in IR has proven effective since they allow querying in the space of terms modelled by concept intents and navigation in the space of documents modelled by concept extents. However, current approaches use binary representations to illustrate the relations between documents and terms (''do...

  2. Statistical aspects of nuclear structure

    International Nuclear Information System (INIS)

    Parikh, J.C.

    1977-01-01

    The statistical properties of energy levels and a statistical approach to transition strengths are discussed in relation to nuclear structure studies at high excitation energies. It is shown that the calculations can be extended to the ground state domain also. The discussion is based on the study of random matrix theory of level density and level spacings, using the Gaussian Orthogonal Ensemble (GOE) concept. The short range and long range correlations are also studied statistically. The polynomial expansion method is used to obtain excitation strengths. (A.K.)

  3. Statistical electromagnetics: Complex cavities

    NARCIS (Netherlands)

    Naus, H.W.L.

    2008-01-01

    A selection of the literature on the statistical description of electromagnetic fields and complex cavities is concisely reviewed. Some essential concepts, for example, the application of the central limit theorem and the maximum entropy principle, are scrutinized. Implicit assumptions, biased

  4. Modern applied U-statistics

    CERN Document Server

    Kowalski, Jeanne

    2008-01-01

    A timely and applied approach to the newly discovered methods and applications of U-statistics. Built on years of collaborative research and academic experience, Modern Applied U-Statistics successfully presents a thorough introduction to the theory of U-statistics using in-depth examples and applications that address contemporary areas of study including biomedical and psychosocial research. Utilizing a "learn by example" approach, this book provides an accessible, yet in-depth, treatment of U-statistics, as well as addresses key concepts in asymptotic theory by integrating translational and cross-disciplinary research. The authors begin with an introduction of the essential and theoretical foundations of U-statistics such as the notion of convergence in probability and distribution, basic convergence results, stochastic Os, inference theory, generalized estimating equations, as well as the definition and asymptotic properties of U-statistics. With an emphasis on nonparametric applications when and where applic...

  5. Statistics for Finance

    DEFF Research Database (Denmark)

    Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard

    Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...... that rarely connect concepts to data and books on econometrics and time series analysis that do not cover specific problems related to option valuation. The book discusses applications of financial derivatives pertaining to risk assessment and elimination. The authors cover various statistical...... and mathematical techniques, including linear and nonlinear time series analysis, stochastic calculus models, stochastic differential equations, Itō’s formula, the Black–Scholes model, the generalized method-of-moments, and the Kalman filter. They explain how these tools are used to price financial derivatives...

  6. Towards a concept of sensible drinking and an illustration of measure.

    Science.gov (United States)

    Harburg, E; Gleiberman, L; Difranceisco, W; Peele, S

    1994-07-01

    The major focus of research on alcohol is not on the majority who drink without problems, but on the small minority who have extreme problems. Difficulty in conceiving, measuring, and analyzing non-problem drinking lies in the exclusively problem-drinking orientation of most drinking measures. Drawing on conventionally used scales (e.g. Short Michigan Alcoholism Screening Test) and other established concepts in the alcohol literature (e.g. craving, hangover), a set of 24 items was selected to classify all persons in a sample from Tecumseh, Michigan, as to their alcohol-related behaviors (N = 1266). A Sensible-Problem Drinking Classification (SPDC) was developed with five categories: very sensible, sensible, borderline, problem, and impaired. A variety of known alcohol and psychosocial variables were related monotonically across these categories in expected directions. Ethanol ounces per week was only modestly related to SPDC groups: R2 = 0.09 for women, R2 = 0.21 for men. The positive relationship of problem and non-problem SPDC groups to high and low blood pressure was P = 0.07, while ethanol (oz/week) was uncorrelated to blood pressure (mm Hg) in this subsample (N = 453). The development of SPDC requires additional items measuring self and group regulatory alcohol behavior. However, this initial analysis of no-problem subgroups has direct import for public health regulation of alcohol use by providing a model of a sensible view of alcohol use.

  7. Statistical mechanics

    CERN Document Server

    Schwabl, Franz

    2006-01-01

    The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...

  8. Fulfilling the needs for statistical expertise at Aalborg Hospital

    DEFF Research Database (Denmark)

    Dethlefsen, Claus

    In 2005, the first statistician was employed at Aalborg Hospital due to expanding research activities as part of Aarhus University Hospital. Since then, there has been an increased demand for statistical expertise at all levels. In the talk, I will give an overview of the current staff...... of statisticians and the organisation. I will give examples from our statistical consultancy and illustrate some of the challenges that have led to research projects with heavy statistical involvement....

  9. Bayesian approach to inverse statistical mechanics

    Science.gov (United States)

    Habeck, Michael

    2014-05-01

    Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.

  10. Isotopic safeguards statistics

    International Nuclear Information System (INIS)

    Timmerman, C.L.; Stewart, K.B.

    1978-06-01

    The methods and results of our statistical analysis of isotopic data using isotopic safeguards techniques are illustrated using example data from the Yankee Rowe reactor. The statistical methods used in this analysis are the paired comparison and the regression analyses. A paired comparison results when a sample from a batch is analyzed by two different laboratories. Paired comparison techniques can be used with regression analysis to detect and identify outlier batches. The second analysis tool, linear regression, involves comparing various regression approaches. These approaches use two basic types of models: the intercept model (y = α + βx) and the initial point model [y − y₀ = β(x − x₀)]. The intercept model fits strictly the exposure or burnup values of isotopic functions, while the initial point model utilizes the exposure values plus the initial or fabricator's data values in the regression analysis. Two fitting methods are applied to each of these models: (1) the usual least-squares approach, in which x is measured without error, and (2) Deming's approach, which uses the variance estimates obtained from the paired comparison results and considers both x and y to be measured with error. The Yankee Rowe data were first measured by Nuclear Fuel Services (NFS) and remeasured by Nuclear Audit and Testing Company (NATCO). The isotopic function illustrated is the ratio of Pu/U versus ²³⁵D (in which ²³⁵D is the amount of depleted ²³⁵U expressed in weight percent). Statistical results using the Yankee Rowe data indicate the attractiveness of Deming's regression model over the usual approach by simple comparison of the given regression variances with the random variance from the paired comparison results
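
    The two regression forms named in this record can be sketched in a few lines. The data below are hypothetical stand-ins for (exposure, Pu/U) pairs, and only the ordinary least-squares fitting method is shown (Deming's errors-in-variables fit needs the paired-comparison variance estimates and is omitted):

```python
def fit_intercept(xs, ys):
    """Ordinary least squares for the intercept model y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def fit_initial_point(xs, ys, x0, y0):
    """Least squares for the initial point model y - y0 = b*(x - x0)."""
    return sum((x - x0) * (y - y0) for x, y in zip(xs, ys)) / \
           sum((x - x0) ** 2 for x in xs)

xs = [1.0, 2.0, 3.0, 4.0]   # exposure values (hypothetical)
ys = [2.1, 3.9, 6.2, 7.8]   # isotopic-function values (hypothetical)
a, b = fit_intercept(xs, ys)
b0 = fit_initial_point(xs, ys, 0.0, 0.0)  # initial point assumed at (0, 0)
print(round(a, 3), round(b, 3), round(b0, 3))  # → 0.15 1.94 1.99
```

The initial-point fit trades an estimated intercept for the fabricator's known starting value, which is the distinction the record draws between the two models.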

  11. Fractional statistics and quantum theory

    CERN Document Server

    Khare, Avinash

    1997-01-01

    This book explains the subtleties of quantum statistical mechanics in lower dimensions and their possible ramifications in quantum theory. The discussion is at a pedagogical level and is addressed to both graduate students and advanced research workers with a reasonable background in quantum and statistical mechanics. The main emphasis will be on explaining new concepts. Topics in the first part of the book includes the flux tube model of anyons, the braid group and quantum and statistical mechanics of noninteracting anyon gas. The second part of the book provides a detailed discussion about f

  12. Basics of modern mathematical statistics

    CERN Document Server

    Spokoiny, Vladimir

    2015-01-01

    This textbook provides a unified and self-contained presentation of the main approaches to and ideas of mathematical statistics. It collects the basic mathematical ideas and tools needed as a basis for more serious studies or even independent research in statistics. The majority of existing textbooks in mathematical statistics follow the classical asymptotic framework. Yet, as modern statistics has changed rapidly in recent years, new methods and approaches have appeared. The emphasis is on finite sample behavior, large parameter dimensions, and model misspecifications. The present book provides a fully self-contained introduction to the world of modern mathematical statistics, collecting the basic knowledge, concepts and findings needed for doing further research in the modern theoretical and applied statistics. This textbook is primarily intended for graduate and postdoc students and young researchers who are interested in modern statistical methods.

  13. Fluctuations and correlations in statistical models of hadron production

    International Nuclear Information System (INIS)

    Gorenstein, M. I.

    2012-01-01

    An extension of the standard concept of the statistical ensembles is suggested. Namely, the statistical ensembles with extensive quantities fluctuating according to an externally given distribution are introduced. Applications in the statistical models of multiple hadron production in high energy physics are discussed.

  14. Measuring Social Studies Concept Attainment: Boys and Girls. Report from the Project on A Structure of Concept Attainment Abilities.

    Science.gov (United States)

    Harris, Margaret L.; Tabachnick, B. Robert

    This paper describes test development efforts for measuring achievement of selected concepts in social studies. It includes descriptive item and test statistics for the tests developed. Twelve items were developed for each of 30 concepts. Subject specialists categorized the concepts into three major areas: Geographic Region, Man and Society, and…

  15. Statistical Information and Uncertainty: A Critique of Applications in Experimental Psychology

    Directory of Open Access Journals (Sweden)

    Donald Laming

    2010-04-01

    Full Text Available This paper presents, first, a formal exploration of the relationships between information (statistically defined), statistical hypothesis testing, the use of hypothesis testing in reverse as an investigative tool, channel capacity in a communication system, uncertainty, the concept of entropy in thermodynamics, and Bayes' theorem. This exercise brings out the close mathematical interrelationships between different applications of these ideas in diverse areas of psychology. Subsequent illustrative examples are grouped under (a) the human operator as an ideal communications channel, (b) the human operator as a purely physical system, and (c) Bayes' theorem as an algorithm for combining information from different sources. Some tentative conclusions are drawn about the usefulness of information theory within these different categories. (a) The idea of the human operator as an ideal communications channel has long been abandoned, though it provides some lessons that still need to be absorbed today. (b) Treating the human operator as a purely physical system provides a platform for the quantitative exploration of many aspects of human performance by analogy with the analysis of other physical systems. (c) The use of Bayes' theorem to calculate the effects of prior probabilities and stimulus frequencies on human performance is probably misconceived, but it is difficult to obtain results precise enough to resolve this question.
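
    Point (c) — Bayes' theorem as an algorithm for combining a prior with evidence — reduces to one line of arithmetic. The numbers below (a 10% prior and two response rates) are assumptions chosen only to illustrate the mechanics:

```python
def posterior(prior, p_resp_h1, p_resp_h0):
    """P(H1 | response) via Bayes' theorem from P(H1) and the two likelihoods."""
    num = prior * p_resp_h1
    return num / (num + (1.0 - prior) * p_resp_h0)

# Prior P(signal) = 0.1; the observer responds "yes" to 90% of signals
# and to 20% of non-signals (hypothetical rates).
p = posterior(prior=0.1, p_resp_h1=0.9, p_resp_h0=0.2)
print(round(p, 4))  # → 0.3333  (0.09 / (0.09 + 0.18))
```

Even with a strong hit rate, the low prior keeps the posterior probability of a signal at one third — exactly the prior-frequency effect the paper discusses.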

  16. Statistical process control for alpha spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Richardson, W; Majoras, R E [Oxford Instruments, Inc. P.O. Box 2560, Oak Ridge TN 37830 (United States); Joo, I O; Seymour, R S [Accu-Labs Research, Inc. 4663 Table Mountain Drive, Golden CO 80403 (United States)

    1995-10-01

    Statistical process control (SPC) allows for the identification of problems in alpha spectroscopy processes before they occur, unlike standard laboratory QC, which only identifies problems after a process fails. SPC tools that are directly applicable to alpha spectroscopy include individual X-charts and X-bar charts, process capability plots, and scatter plots. Most scientists are familiar with the concepts and methods employed by SPC. These tools allow analysis of process bias, precision, accuracy, and reproducibility, as well as process capability. Parameters affecting instrument performance are monitored and analyzed using SPC methods. These instrument parameters can also be compared to sampling, preparation, measurement, and analysis QC parameters, permitting the evaluation of cause-effect relationships. Three examples of SPC, as applied to alpha spectroscopy, are presented. The first example investigates background contamination using averaging to show trends quickly. A second example demonstrates how SPC can identify sample processing problems, analyzing both how and why this problem occurred. A third example illustrates how SPC can predict when an alpha spectroscopy process is going to fail. This allows for an orderly and timely shutdown of the process to perform preventative maintenance, avoiding the need to repeat costly sample analyses. 7 figs., 2 tabs.
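
    The individual X-chart mentioned here is simple to construct: the centerline is the running mean, and the control limits sit 2.66 average moving ranges on either side (2.66 = 3/d₂ with d₂ = 1.128 for ranges of two). The background counts below are hypothetical:

```python
def x_chart_limits(data):
    """Centerline and 3-sigma limits for an individuals (X) control chart."""
    xbar = sum(data) / len(data)
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    mrbar = sum(moving_ranges) / len(moving_ranges)
    half_width = 2.66 * mrbar  # 3/d2 with d2 = 1.128 for subgroups of 2
    return xbar - half_width, xbar, xbar + half_width

background = [4.1, 3.8, 4.5, 4.0, 3.9, 4.2, 4.4, 3.7]  # hypothetical counts/hour
lcl, cl, ucl = x_chart_limits(background)
flagged = [x for x in background if not lcl <= x <= ucl]
print(round(lcl, 3), round(cl, 3), round(ucl, 3), flagged)  # → 3.011 4.075 5.139 []
```

A background measurement drifting past the upper limit would signal contamination before routine QC could catch it, which is the early-warning behavior the record describes.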

  17. Statistical process control for alpha spectroscopy

    International Nuclear Information System (INIS)

    Richardson, W.; Majoras, R.E.; Joo, I.O.; Seymour, R.S.

    1995-01-01

    Statistical process control (SPC) allows for the identification of problems in alpha spectroscopy processes before they occur, unlike standard laboratory QC, which only identifies problems after a process fails. SPC tools that are directly applicable to alpha spectroscopy include individual X-charts and X-bar charts, process capability plots, and scatter plots. Most scientists are familiar with the concepts and methods employed by SPC. These tools allow analysis of process bias, precision, accuracy, and reproducibility, as well as process capability. Parameters affecting instrument performance are monitored and analyzed using SPC methods. These instrument parameters can also be compared to sampling, preparation, measurement, and analysis QC parameters, permitting the evaluation of cause-effect relationships. Three examples of SPC, as applied to alpha spectroscopy, are presented. The first example investigates background contamination using averaging to show trends quickly. A second example demonstrates how SPC can identify sample processing problems, analyzing both how and why this problem occurred. A third example illustrates how SPC can predict when an alpha spectroscopy process is going to fail. This allows for an orderly and timely shutdown of the process to perform preventative maintenance, avoiding the need to repeat costly sample analyses. 7 figs., 2 tabs

  18. Concept Mapping Using Cmap Tools to Enhance Meaningful Learning

    Science.gov (United States)

    Cañas, Alberto J.; Novak, Joseph D.

    Concept maps are graphical tools that have been used in all facets of education and training for organizing and representing knowledge. When learners build concept maps, meaningful learning is facilitated. Computer-based concept mapping software such as CmapTools have further extended the use of concept mapping and greatly enhanced the potential of the tool, facilitating the implementation of a concept map-centered learning environment. In this chapter, we briefly present concept mapping and its theoretical foundation, and illustrate how it can lead to an improved learning environment when it is combined with CmapTools and the Internet. We present the nationwide “Proyecto Conéctate al Conocimiento” in Panama as an example of how concept mapping, together with technology, can be adopted by hundreds of schools as a means to enhance meaningful learning.

  19. On statistical acceleration convergence of double sequences

    Directory of Open Access Journals (Sweden)

    Bipan Hazarika

    2017-04-01

    Full Text Available In this article the notion of statistical acceleration convergence of double sequences in Pringsheim's sense is introduced. We prove decomposition theorems for statistical acceleration convergence of double sequences, and some theorems related to that concept are established using four-dimensional matrix transformations. We provide some examples where the results of acceleration convergence fail to hold in the statistical cases.

  20. Agents with left and right dominant hemispheres and quantum statistics

    Science.gov (United States)

    Ezhov, Alexandr A.; Khrennikov, Andrei Yu.

    2005-01-01

    We present a multiagent model illustrating the emergence of two different quantum statistics: Bose-Einstein statistics in a friendly population of individuals with right-brain dominance, and Fermi-Dirac statistics in a competitive population of individuals with left-brain dominance. In doing so, we argue that Lefebvre's “algebra of conscience” can be used in a natural way to describe the decision-making strategies of agents simulating people with different brain dominance. The emergence of the two principal statistical distributions can illustrate different types of social organization and may also be used to simulate market phenomena and psychic disorders in which a switching of hemisphere dominance is involved.
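
    The two distributions whose emergence the model illustrates differ only by a sign in the denominator. A minimal sketch of the textbook occupancy formulas (energies and temperature in arbitrary units; the chemical potential is set to 0 purely for illustration):

```python
import math

def bose_einstein(e, mu, kT):
    """Mean occupancy of a level at energy e for bosons (requires e > mu)."""
    return 1.0 / (math.exp((e - mu) / kT) - 1.0)

def fermi_dirac(e, mu, kT):
    """Mean occupancy of a level at energy e for fermions (at most 1)."""
    return 1.0 / (math.exp((e - mu) / kT) + 1.0)

for e in (0.5, 1.0, 2.0):
    print(e, round(bose_einstein(e, 0.0, 1.0), 3), round(fermi_dirac(e, 0.0, 1.0), 3))
# Bosons pile up in low-energy states ("friendly" aggregation); fermion
# occupancy never exceeds 1 ("competitive" exclusion).
```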

  1. SEBREZ: an inertial-fusion-reactor concept

    International Nuclear Information System (INIS)

    Meier, W.R.

    1982-01-01

    The neutronic aspects of an inertial fusion reactor concept that relies on asymmetrical neutronic effects to enhance the tritium production in the breeding zones have been studied. We find that it is possible to obtain a tritium breeding ratio greater than 1.0 with a chamber configuration in which the breeding zones subtend only a fraction of the total solid angle. This is the origin of the name SEBREZ which stands for SEgregated BREeding Zones. It should be emphasized that this is not a reactor design study; rather this study illustrates certain neutronic effects in the context of a particular reactor concept. An understanding of these effects forms the basis of a design technique which has broader application than just the SEBREZ concept

  2. Statistics for Engineers

    International Nuclear Information System (INIS)

    Kim, Jin Gyeong; Park, Jin Ho; Park, Hyeon Jin; Lee, Jae Jun; Jun, Whong Seok; Whang, Jin Su

    2009-08-01

    This book explains statistics for engineers using MATLAB. Topics include the arrangement and summary of data, probability, probability distributions, sampling distributions, estimation, hypothesis testing, analysis of variance, regression analysis, categorical data analysis, quality assurance (including control-chart concepts, consecutive control charts, breakthrough strategy, and analysis using MATLAB), reliability analysis (measurement of reliability and analysis with MATLAB), and Markov chains.

  3. Conceptions of Conflict in Organizational Conflict Research

    DEFF Research Database (Denmark)

    Mikkelsen, Elisabeth Naima; Clegg, Stewart

    2017-01-01

    Diverse and often unacknowledged assumptions underlie organizational conflict research. In this essay, we identify distinct ways of conceptualizing conflict in the theoretical domain of organizational conflict with the aim of setting a new critical agenda for reflexivity in conflict research. In doing so, we first apply a genealogical approach to study conceptions of conflict, and we find that three distinct and essentially contested conceptions frame studies of conflict at work. Second, we employ two empirical examples of conflict to illustrate how organizational conflict research can benefit...

  4. A Study on Contingency Learning in Introductory Physics Concepts

    Science.gov (United States)

    Scaife, Thomas M.

    Instructors of physics often use examples to illustrate new or complex physical concepts to students. For any particular concept, there are an infinite number of examples, thus presenting instructors with a difficult question whenever they wish to use one in their teaching: which example will most effectively illustrate the concept so that student learning is maximized? The choice is typically made by an intuitive assumption about which exact example will result in the most lucid illustration and the greatest student improvement. By questioning 583 students in four experiments, I examined a more principled approach to example selection. By controlling the manner in which physical dimensions vary, the parameter space of each concept can be divided into a discrete number of example categories. The effects of training with members of each category were explored in two different physical contexts: projectile motion and torque. In the first context, students were shown two trajectories and asked to determine which represented the longer time of flight. Height, range, and time of flight were the physical dimensions that were used to categorize the examples. In the second context, students were shown a balance-scale with loads of differing masses placed at differing positions along either side of the balance-arm. Mass, lever-arm length, and torque were the physical dimensions used to categorize these examples. For both contexts, examples were chosen so that one or two independent dimensions were varied. After receiving training with examples from specific categories, students were tested with questions from all question categories. Successful training or instruction can be measured either as producing correct, expert-like behavior (as observed through answers to the questions) or as explicitly instilling an understanding of the underlying rule that governs a physical phenomenon. A student's behavior might not be consistent with their explicit rule, so following the

  5. Probing the statistical properties of unknown texts: application to the Voynich Manuscript.

    Science.gov (United States)

    Amancio, Diego R; Altmann, Eduardo G; Rybski, Diego; Oliveira, Osvaldo N; Costa, Luciano da F

    2013-01-01

    While the use of statistical physics methods to analyze large corpora has been useful for unveiling many patterns in texts, no comprehensive investigation has been performed on the interdependence between syntactic and semantic factors. In this study we propose a framework for determining whether a text (e.g., one written in an unknown alphabet) is compatible with a natural language and, if so, to which language it could belong. The approach is based on three types of statistical measurements: those obtained from first-order statistics of word properties in a text, from the topology of complex networks representing texts, and from intermittency concepts in which the text is treated as a time series. Comparative experiments were performed with the New Testament in 15 different languages and with distinct books in English and Portuguese in order to quantify the dependency of the different measurements on the language and on the story being told in the book. The metrics found to be informative in distinguishing real texts from their shuffled versions include assortativity, degree, and selectivity of words. As an illustration, we analyze an undeciphered medieval manuscript known as the Voynich Manuscript. We show that it is mostly compatible with natural languages and incompatible with random texts. We also obtain candidates for keywords of the Voynich Manuscript, which could be helpful in the effort of deciphering it. Because we were able to identify statistical measurements that are more dependent on the syntax than on the semantics, the framework may also serve for text analysis in language-dependent applications.
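One of the three measurement types above, intermittency, can be sketched with nothing but the standard library: treat the text as a sequence of token positions and compare the spread of a word's recurrence gaps to their mean. The toy text and function below are illustrative, not the authors' implementation.

```python
from statistics import mean, stdev

def intermittency(tokens, word):
    """Burstiness of a word's recurrence distances: sigma/mu of the gaps
    between successive occurrences (small = evenly spread, large = bursty)."""
    positions = [i for i, t in enumerate(tokens) if t == word]
    if len(positions) < 3:
        return None  # too few occurrences to estimate a spread
    gaps = [b - a for a, b in zip(positions, positions[1:])]
    return stdev(gaps) / mean(gaps)

text = ("the cat sat on the mat and the dog saw the cat "
        "then the cat ran and the dog ran after the cat").split()

# Function words like "the" recur regularly; content words cluster.
print(intermittency(text, "the"))
print(intermittency(text, "cat"))
```

On real corpora this statistic helps separate semantically loaded keywords (bursty) from grammatical filler (regular), which is one route to keyword candidates in an undeciphered text.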

  6. The Statistics of wood assays for preservative retention

    Science.gov (United States)

    Patricia K. Lebow; Scott W. Conklin

    2011-01-01

    This paper covers general statistical concepts that apply to interpreting wood assay retention values. In particular, since wood assays are typically obtained from a single composited sample, the statistical aspects, including advantages and disadvantages, of simple compositing are covered.

  7. A Statistical Approach to Optimizing Concrete Mixture Design

    OpenAIRE

    Ahmad, Shamsad; Alghamdi, Saeid A.

    2014-01-01

    A step-by-step statistical approach is proposed to obtain optimum proportioning of concrete mixtures using the data obtained through a statistically planned experimental program. The utility of the proposed approach for optimizing the design of concrete mixture is illustrated considering a typical case in which trial mixtures were considered according to a full factorial experiment design involving three factors and their three levels (33). A total of 27 concrete mixtures with three replicate...
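A full factorial design of the kind described (three factors, three levels, 3^3 = 27 trials) is straightforward to enumerate. A minimal sketch, with factor names and levels that are illustrative rather than the actual mixture variables of the study:

```python
import itertools

# Hypothetical factor levels for a 3^3 mixture experiment; the names
# and values are illustrative, not those used in the cited study.
factors = {
    "w_c_ratio": [0.35, 0.40, 0.45],   # water/cement ratio
    "cement_kg": [350, 400, 450],      # cement content (kg/m^3)
    "agg_ratio": [0.60, 0.65, 0.70],   # coarse/total aggregate fraction
}

# Cartesian product of the level lists gives every trial mixture once.
design = [dict(zip(factors, combo))
          for combo in itertools.product(*factors.values())]

print(len(design))   # 3^3 = 27 trial mixtures
print(design[0])
```

Each trial would then be replicated and the responses fed into a regression or ANOVA model to locate the optimum proportioning.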

  8. Mathematical concepts

    CERN Document Server

    Jost, Jürgen

    2015-01-01

    The main intention of this book is to describe and develop the conceptual, structural and abstract thinking of mathematics. Specific mathematical structures are used to illustrate the conceptual approach; providing a deeper insight into mutual relationships and abstract common features. These ideas are carefully motivated, explained and illustrated by examples so that many of the more technical proofs can be omitted. The book can therefore be used: ·         simply as an overview of the panorama of mathematical structures and the relations between them, to be supplemented by more detailed texts whenever you want to acquire a working knowledge of some structure ·         by itself as a first introduction to abstract mathematics ·         together with existing textbooks, to put their results into a more general perspective ·         to gain a new and hopefully deeper perspective after having studied such textbooks Mathematical Concepts has a broader scope and is less detaile...

  9. Kinetic concepts of thermally stimulated reactions in solids

    Science.gov (United States)

    Vyazovkin, Sergey

    Historical analysis suggests that the basic kinetic concepts of reactions in solids were inherited from homogeneous kinetics. These concepts rest upon the assumption of a single-step reaction that disagrees with the multiple-step nature of solid-state processes. The inadequate concepts inspire such unjustified anticipations of kinetic analysis as evaluating constant activation energy and/or deriving a single-step reaction mechanism for the overall process. A more adequate concept is that of the effective activation energy, which may vary with temperature and extent of conversion. The adequacy of this concept is illustrated by literature data as well as by experimental data on the thermal dehydration of calcium oxalate monohydrate and thermal decomposition of calcium carbonate, ammonium nitrate and 1,3,5,7- tetranitro-1,3,5,7-tetrazocine.

  10. Concept Modeling vs. Data modeling in Practice

    DEFF Research Database (Denmark)

    Madsen, Bodil Nistrup; Erdman Thomsen, Hanne

    2015-01-01

    This chapter shows the usefulness of terminological concept modeling as a first step in data modeling. First, we introduce terminological concept modeling with terminological ontologies, i.e. concept systems enriched with characteristics modeled as feature specifications. This enables a formal...... account of the inheritance of characteristics and allows us to introduce a number of principles and constraints which render concept modeling more coherent than earlier approaches. Second, we explain how terminological ontologies can be used as the basis for developing conceptual and logical data models....... We also show how to map from the various elements in the terminological ontology to elements in the data models, and explain the differences between the models. Finally the usefulness of terminological ontologies as a prerequisite for IT development and data modeling is illustrated with examples from...

  11. Delineating Concept Meanings: The Case of Terrorism.

    Science.gov (United States)

    Kleg, Milton; Mahlios, Marc

    1990-01-01

    Presents a teacher-initiated model for reaching class consensus on the meaning of confusing or interchangeable concepts in social studies classrooms. Illustrates the model by delineating terrorism. Shows procedural steps that involve students in self and small group interviews where definitions are clarified until consensus is reached. Suggests…

  12. Different Conceptions of Mental Illness: Consequences for the Association with Patients†

    OpenAIRE

    Helmchen, Hanfried

    2013-01-01

    Whenever partial knowledge is considered absolute and turned into ideological and dogmatic conceptions, the risk increases that the conditions for the people involved might become dangerous. This will be illustrated by casuistic examples of consequences of one-sided psychiatric conceptions such as social, biological, and psychological ideas about the treatment and care of the mentally ill. Present perspectives of an integrative model, i.e. the bio-psycho-social conception about specific inter...

  13. Coherent states for oscillators of non-conventional statistics

    International Nuclear Information System (INIS)

    Dao Vong Duc; Nguyen Ba An

    1998-12-01

    In this work we consider systematically the concept of coherent states for oscillators of non-conventional statistics - parabose oscillator, infinite statistics oscillator and generalised q-deformed oscillator. The expressions for the quadrature variances and particle number distribution are derived and displayed graphically. The obtained results show drastic changes when going from one statistics to another. (author)

  14. Professional representation and the free-lance medical illustrator.

    Science.gov (United States)

    Mount, K N; Daugherty, J

    1994-01-01

    We researched factors related to the success or failure in working relationships between free-lance medical illustrators and artist's representatives. In the fall of 1992, surveys were mailed to 230 medical illustrators; 105 (46%) completed surveys were returned. Respondents were divided into three categories: 1) medical illustrators currently represented, 2) medical illustrators previously represented, and 3) medical illustrators who had never been represented. Comparisons made among illustrators from the three groups included business practices, clientele, experience, and self-promotion techniques. These comparisons revealed notable differences and similarities between the three groups and were subsequently analyzed to identify the characteristics of medical illustrators who would benefit from professional representation.

  15. Science Academies' Refresher Course in Statistical Physics

    Indian Academy of Sciences (India)

    The Course is aimed at college teachers of statistical physics at BSc/MSc level. It will cover basic principles and techniques, in a pedagogical manner, through lectures and tutorials, with illustrative problems. Some advanced topics, and common difficulties faced by students will also be discussed. College/University ...

  16. Statistical process control in nursing research.

    Science.gov (United States)

    Polit, Denise F; Chaboyer, Wendy

    2012-02-01

    In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation), and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.
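A Shewhart-style individuals chart of the kind described can be sketched in a few lines. The 3-sigma limits and the data below are illustrative; real SPC work would typically use moving-range-based limits and additional run rules.

```python
from statistics import mean, stdev

def control_limits(baseline):
    """Shewhart-style limits: mean +/- 3 sigma, estimated from a
    baseline (pre-intervention) period of common-cause variation."""
    m, s = mean(baseline), stdev(baseline)
    return m - 3 * s, m + 3 * s

def special_cause(series, lcl, ucl):
    """Indices of points outside the control limits (one of several
    standard special-cause signals)."""
    return [i for i, x in enumerate(series) if x < lcl or x > ucl]

baseline = [12, 14, 13, 15, 12, 13, 14, 13, 12, 14]   # e.g. weekly event rates
after    = [13, 12, 14, 13, 8, 7, 9, 8]               # post-intervention period

lcl, ucl = control_limits(baseline)
print(special_cause(after, lcl, ucl))   # → [4, 5, 6, 7]
```

The sustained run below the lower limit is exactly the kind of special-cause signal that would be read as evidence of an intervention effect.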

  17. Teaching Statistics Using Classic Psychology Research: An Activities-Based Approach

    Science.gov (United States)

    Holmes, Karen Y.; Dodd, Brett A.

    2012-01-01

    In this article, we discuss a collection of active learning activities derived from classic psychology studies that illustrate the appropriate use of descriptive and inferential statistics. (Contains 2 tables.)

  18. Designing an image retrieval interface for abstract concepts within the domain of journalism

    NARCIS (Netherlands)

    R. Besseling (Ron)

    2011-01-01

    Research has shown that users have difficulties finding images which illustrate abstract concepts. We carried out a user study that confirms the finding that the selection of search terms is perceived as difficult and that users find the subjectivity of abstract concepts problematic. In

  19. Concepts of electrodynamics

    CERN Document Server

    Kumar, Vinay

    2016-01-01

    The present book, Concepts of Electrodynamics, meets the demands of students of all engineering, graduate, honours, and postgraduate courses in a single volume. This book covers all the topics on electrodynamics as per the new syllabus prescribed by UGC and AICTE, and we hope that it will revive interest in the study of the various topics of electrodynamics and carry the reader to a high level of understanding. The text is enriched with a large number of solved examples, along with appropriate illustrations, in each chapter.

  20. Analysis of statistical misconception in terms of statistical reasoning

    Science.gov (United States)

    Maryati, I.; Priatna, N.

    2018-05-01

    Reasoning skill is needed by everyone in the globalization era, because every person has to be able to manage and use information that can now be obtained easily from all over the world. Statistical reasoning skill is the ability to collect, group, process, and interpret information and to draw conclusions from it. It can be developed at various levels of education. However, the skill remains low, because many people, students included, assume that statistics is merely the ability to count and apply formulas. Students also still have a negative attitude toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course in relation to statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by examining the effect of students' misconceptions on statistical reasoning skill. The sample was 32 mathematics education students who had taken the descriptive statistics course. The mean of the misconception test was 49.7 (standard deviation 10.6), whereas the mean of the statistical reasoning skill test was 51.8 (standard deviation 8.5). If 65 is taken as the minimum score for standard achievement of course competence, the students' mean scores fall below the standard. The results of the misconception study highlight which subtopics need attention. Based on the assessment, students' misconceptions occur in: 1) writing mathematical sentences and symbols correctly, 2) understanding basic definitions, and 3) determining which concept to use in solving a problem. For statistical reasoning skill, the assessment measured reasoning about: 1) data, 2) representation, 3) statistical formats, 4) probability, 5) samples, and 6) association.

  1. R for statistics

    CERN Document Server

    Cornillon, Pierre-Andre; Husson, Francois; Jegou, Nicolas; Josse, Julie; Kloareg, Maela; Matzner-Lober, Eric; Rouviere, Laurent

    2012-01-01

    An Overview of R; Main Concepts; Installing R; Work Session; Help; R Objects; Functions; Packages; Exercises; Preparing Data; Reading Data from File; Exporting Results; Manipulating Variables; Manipulating Individuals; Concatenating Data Tables; Cross-Tabulation; Exercises; R Graphics; Conventional Graphical Functions; Graphical Functions with lattice; Exercises; Making Programs with R; Control Flows; Predefined Functions; Creating a Function; Exercises; Statistical Methods; Introduction to the Statistical Methods; A Quick Start with R; Installing R; Opening and Closing R; The Command Prompt; Attribution, Objects, and Function; Selection; Other Rcmdr Package; Importing (or Inputting) Data; Graphs; Statistical Analysis; Hypothesis Test; Confidence Intervals for a Mean; Chi-Square Test of Independence; Comparison of Two Means; Testing Conformity of a Proportion; Comparing Several Proportions; The Power of a Test; Regression; Simple Linear Regression; Multiple Linear Regression; Partial Least Squares (PLS) Regression; Analysis of Variance and Covariance; One-Way Analysis of Variance; Multi-Way Analysis of Varian...

  2. Toward Global Comparability of Sexual Orientation Data in Official Statistics: A Conceptual Framework of Sexual Orientation for Health Data Collection in New Zealand's Official Statistics System

    Science.gov (United States)

    Gray, Alistair; Veale, Jaimie F.; Binson, Diane; Sell, Randell L.

    2013-01-01

    Objective. Effectively addressing health disparities experienced by sexual minority populations requires high-quality official data on sexual orientation. We developed a conceptual framework of sexual orientation to improve the quality of sexual orientation data in New Zealand's Official Statistics System. Methods. We reviewed conceptual and methodological literature, culminating in a draft framework. To improve the framework, we held focus groups and key-informant interviews with sexual minority stakeholders and producers and consumers of official statistics. An advisory board of experts provided additional guidance. Results. The framework proposes working definitions of the sexual orientation topic and measurement concepts, describes dimensions of the measurement concepts, discusses variables framing the measurement concepts, and outlines conceptual grey areas. Conclusion. The framework proposes standard definitions and concepts for the collection of official sexual orientation data in New Zealand. It presents a model for producers of official statistics in other countries, who wish to improve the quality of health data on their citizens. PMID:23840231

  3. Basic elements of computational statistics

    CERN Document Server

    Härdle, Wolfgang Karl; Okhrin, Yarema

    2017-01-01

    This textbook on computational statistics presents tools and concepts of univariate and multivariate statistical data analysis with a strong focus on applications and implementations in the statistical software R. It covers mathematical, statistical as well as programming problems in computational statistics and contains a wide variety of practical examples. In addition to the numerous R sniplets presented in the text, all computer programs (quantlets) and data sets to the book are available on GitHub and referred to in the book. This enables the reader to fully reproduce as well as modify and adjust all examples to their needs. The book is intended for advanced undergraduate and first-year graduate students as well as for data analysts new to the job who would like a tour of the various statistical tools in a data analysis workshop. The experienced reader with a good knowledge of statistics and programming might skip some sections on univariate models and enjoy the various mathematical roots of multivariate ...

  4. New concepts for the recovery and isotopic separation of tritium in fusion reactors

    International Nuclear Information System (INIS)

    Dombra, A.H.; Holtslander, W.J.; Miller, A.I.; Canadian Fusion Fuels Technology Project, Toronto, Ontario)

    1986-01-01

    New concepts for the recovery of tritium from light water coolant of LiPb blankets, and high-pressure helium coolant of Li-ceramic blankets are introduced. Application of these concepts to fusion reactors is illustrated with conceptual system designs for the anticipated NET blanket requirements. (author)

  5. 31 CFR 411.1 - Color illustrations authorized.

    Science.gov (United States)

    2010-07-01

    ... SERVICE, DEPARTMENT OF THE TREASURY COLOR ILLUSTRATIONS OF UNITED STATES CURRENCY § 411.1 Color... necessary plates or items for such printing or publishing, of color illustrations of U.S. currency provided... storage devices, and any other thing used in the making of the illustration that contain an image of the...

  6. Statistical properties of several models of fractional random point processes

    Science.gov (United States)

    Bendjaballah, C.

    2011-08-01

    Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.
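The reduced-variance criterion mentioned above compares the variance of the counts to their mean (the Fano factor): F = 1 for a Poisson process, F < 1 for sub-Poissonian, nonclassical counting statistics. A minimal sketch with toy count records (illustrative, not the paper's fractional process models):

```python
from statistics import mean, pvariance

def fano(counts):
    """Reduced variance (Fano factor) F = Var(N) / <N>.
    F = 1 for Poisson counting; F < 1 signals sub-Poissonian statistics."""
    return pvariance(counts) / mean(counts)

# Two toy count records: the nearly deterministic stream has the
# smaller reduced variance.
noisy   = [3, 5, 4, 6, 2, 4, 5, 3, 4, 4]
regular = [4, 4, 4, 5, 4, 4, 4, 4, 5, 4]

print(fano(noisy))
print(fano(regular))
```

For real point-process data the same statistic is computed over counting windows of varying length, which is how window-size-dependent (fractional) behavior shows up.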

  7. A modern course in statistical physics

    CERN Document Server

    Reichl, Linda E

    2016-01-01

    "A Modern Course in Statistical Physics" is a textbook that illustrates the foundations of equilibrium and non-equilibrium statistical physics, and the universal nature of thermodynamic processes, from the point of view of contemporary research problems. The book treats such diverse topics as the microscopic theory of critical phenomena, superfluid dynamics, quantum conductance, light scattering, transport processes, and dissipative structures, all in the framework of the foundations of statistical physics and thermodynamics. It shows the quantum origins of problems in classical statistical physics. One focus of the book is fluctuations that occur due to the discrete nature of matter, a topic of growing importance for nanometer scale physics and biophysics. Another focus concerns classical and quantum phase transitions, in both monatomic and mixed particle systems. This fourth edition extends the range of topics considered to include, for example, entropic forces, electrochemical processes in biological syste...

  8. Error calculations statistics in radioactive measurements

    International Nuclear Information System (INIS)

    Verdera, Silvia

    1994-01-01

    Basic approaches and procedures frequently used in the practice of radioactive measurements. The statistical principles applied are part of Good Radiopharmaceutical Practices and quality assurance. Concept of error; classification into systematic and random errors. Statistical fundamentals: probability theory, population distributions (Bernoulli, Poisson, Gauss, Student's t), the χ² test, and error propagation based on analysis of variance. Bibliography. z table, t-test table, Poisson index, χ² test.
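The Poisson part of such error calculations reduces to the familiar square-root rule: for a count N, the standard deviation is √N and the relative error 1/√N. A minimal sketch with illustrative figures:

```python
import math

def counting_error(counts):
    """For a Poisson-distributed count N, sigma = sqrt(N),
    so the relative (1-sigma) error is 1/sqrt(N)."""
    sigma = math.sqrt(counts)
    return sigma, sigma / counts

# Quadrupling the counts (e.g. counting four times longer) halves
# the relative error.
for n in (100, 10_000, 1_000_000):
    sigma, rel = counting_error(n)
    print(f"N={n}: sigma={sigma:.0f}, relative error={rel:.2%}")
```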

  9. Statistics of Electron Avalanches and Streamers

    Directory of Open Access Journals (Sweden)

    T. Ficker

    2007-01-01

    Full Text Available We have studied the severe systematic deviations of populations of electron avalanches from the Furry distribution, which has been held to be the statistical law corresponding to them, and a possible explanation has been sought. A new theoretical concept based on fractal avalanche multiplication has been proposed and is shown to be a convenient candidate for explaining these deviations from Furry statistics.
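For reference, the Furry distribution is the geometric law P(n) = (1/n̄)(1 − 1/n̄)^(n−1) for avalanche sizes n = 1, 2, …, with mean n̄. A quick stdlib check of its normalization and mean (an illustrative sketch, not the authors' fractal multiplication model):

```python
def furry_pmf(n, nbar):
    """Furry (geometric) avalanche-size distribution with mean nbar:
    P(n) = (1/nbar) * (1 - 1/nbar)**(n - 1), for n = 1, 2, ..."""
    return (1 / nbar) * (1 - 1 / nbar) ** (n - 1)

nbar = 5.0
ns = range(1, 2001)                       # the tail beyond is negligible
probs = [furry_pmf(n, nbar) for n in ns]

total = sum(probs)                        # should be ~1 (normalization)
mean_n = sum(n * p for n, p in zip(ns, probs))  # should be ~nbar

print(round(total, 6), round(mean_n, 3))
```

Systematic deviations of measured avalanche-size histograms from this pmf are what motivate the fractal multiplication concept in the paper.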

  10. Statistical learning and selective inference.

    Science.gov (United States)

    Taylor, Jonathan; Tibshirani, Robert J

    2015-06-23

    We describe the problem of "selective inference." This addresses the following challenge: Having mined a set of data to find potential associations, how do we properly assess the strength of these associations? The fact that we have "cherry-picked"--searched for the strongest associations--means that we must set a higher bar for declaring significant the associations that we see. This challenge becomes more important in the era of big data and complex statistical modeling. The cherry tree (dataset) can be very large and the tools for cherry picking (statistical learning methods) are now very sophisticated. We describe some recent new developments in selective inference and illustrate their use in forward stepwise regression, the lasso, and principal components analysis.
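The cherry-picking effect is easy to reproduce: among many pure-noise features, the one most correlated with the response in a given sample looks far stronger in that sample than against independent data. A toy sketch (sizes and names are illustrative; this only exposes the bias, whereas the authors' selective-inference methods adjust the inference itself):

```python
import random
random.seed(1)

def corr(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

n, p = 50, 200
X = [[random.gauss(0, 1) for _ in range(n)] for _ in range(p)]
y_train = [random.gauss(0, 1) for _ in range(n)]   # pure noise response
y_fresh = [random.gauss(0, 1) for _ in range(n)]   # independent replicate

# "Cherry-pick" the feature with the strongest in-sample association...
train_corrs = [abs(corr(x, y_train)) for x in X]
best = max(range(p), key=lambda j: train_corrs[j])

# ...and compare its apparent strength with its strength on fresh data.
print(f"selected feature {best}: |r_train|={train_corrs[best]:.2f}, "
      f"|r_fresh|={abs(corr(X[best], y_fresh)):.2f}")
```

The maximum over 200 noise features is biased high by construction, which is why a higher bar (or an adjusted null distribution) is needed before declaring the selected association significant.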

  11. Mathematical statistics essays on history and methodology

    CERN Document Server

    Pfanzagl, Johann

    2017-01-01

    This book presents a detailed description of the development of statistical theory. In the mid twentieth century, the development of mathematical statistics underwent an enduring change, due to the advent of more refined mathematical tools. New concepts like sufficiency, superefficiency, adaptivity etc. motivated scholars to reflect upon the interpretation of mathematical concepts in terms of their real-world relevance. Questions concerning the optimality of estimators, for instance, had remained unanswered for decades, because a meaningful concept of optimality (based on the regularity of the estimators, the representation of their limit distribution and assertions about their concentration by means of Anderson’s Theorem) was not yet available. The rapidly developing asymptotic theory provided approximate answers to questions for which non-asymptotic theory had found no satisfying solutions. In four engaging essays, this book presents a detailed description of how the use of mathematical methods stimulated...

  12. Thermodynamic properties of particles with intermediate statistics

    International Nuclear Information System (INIS)

    Joyce, G.S.; Sarkar, S.; Spałek, J.; Byczuk, K.

    1996-01-01

    Analytic expressions for the distribution function of an ideal gas of particles (exclusons) which have statistics intermediate between Fermi-Dirac and Bose-Einstein are obtained for all values of the Haldane statistics parameter α ∈ [0,1]. The analytic structure of the distribution function is investigated and found to have no singularities in the physical region when the parameter α lies in the range 0 V of the D-dimensional excluson gas. The low-temperature series for the thermodynamic properties illustrate the pseudofermion nature of exclusons. copyright 1996 The American Physical Society

  13. Understanding the statistics of small risks

    International Nuclear Information System (INIS)

    Siddall, E.

    1983-10-01

    Monte Carlo analyses are used to show what inferences can and cannot be drawn either when a very small number of accidents results from a considerable exposure or when a very small number of people, down to a single individual, is exposed to small added risks. The distinction between relative and absolute uncertainty is illustrated. No new statistical principles are involved.
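The flavor of such small-count inference can be shown even without simulation: for zero observed events, the largest Poisson rate still consistent with the data at 95% confidence is −ln(0.05) ≈ 3, the "rule of three". A sketch using a generic bisection (this is not the report's Monte Carlo analysis):

```python
import math

def poisson_upper_bound(observed, confidence=0.95):
    """Largest expected count (rate x exposure) still consistent, at the
    given confidence, with having seen `observed` events: the rate at
    which P(N <= observed) drops to 1 - confidence. Found by bisection."""
    def cdf(lam):   # P(N <= observed) under a Poisson rate lam
        return sum(math.exp(-lam) * lam**k / math.factorial(k)
                   for k in range(observed + 1))
    lo, hi = 0.0, 100.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if cdf(mid) > 1 - confidence:   # still too plausible: push rate up
            lo = mid
        else:
            hi = mid
    return lo

print(round(poisson_upper_bound(0), 3))   # ~= ln(20) ~= 2.996
```

Even zero observed accidents therefore does not establish a zero risk, only an upper bound that shrinks in proportion to the exposure.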

  14. Statistical Analysis of Big Data on Pharmacogenomics

    Science.gov (United States)

    Fan, Jianqing; Liu, Han

    2013-01-01

    This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating large covariance matrix for understanding correlation structure, inverse covariance matrix for network modeling, large-scale simultaneous tests for selecting significantly differently expressed genes and proteins and genetic markers for complex diseases, and high dimensional variable selection for identifying important molecules for understanding molecule mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905

  15. 48 CFR 9904.401-60 - Illustrations.

    Science.gov (United States)

    2010-10-01

    ... Section 9904.401-60 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.401-60 Illustrations. (a) The following examples are illustrative...

  16. 48 CFR 9904.409-60 - Illustrations.

    Science.gov (United States)

    2010-10-01

    ... Section 9904.409-60 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.409-60 Illustrations. The following examples are illustrative of...

  17. Concepts for measuring maintenance performance and methods for analysing competing failure modes

    DEFF Research Database (Denmark)

    Cooke, R.; Paulsen, J.L.

    1997-01-01

    competing failure modes. This article examines ways to assess maintenance performance without introducing statistical assumptions, then introduces a plausible statistical model for describing the interaction of preventive and corrective maintenance, and finally illustrates these with examples from...

  18. Fish: A New Computer Program for Friendly Introductory Statistics Help

    Science.gov (United States)

    Brooks, Gordon P.; Raffle, Holly

    2005-01-01

    All introductory statistics students must master certain basic descriptive statistics, including means, standard deviations and correlations. Students must also gain insight into such complex concepts as the central limit theorem and standard error. This article introduces and describes the Friendly Introductory Statistics Help (FISH) computer…
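The sampling-distribution ideas such tools illustrate (central limit theorem, standard error) can be demonstrated by direct simulation; the population and sizes below are illustrative:

```python
import math
import random
from statistics import mean, stdev

random.seed(42)

# Simulate the sampling distribution of the mean: draw many samples
# from a non-normal (uniform) population and record each sample mean.
# The spread of those means should approach sigma / sqrt(n).
n, reps = 25, 4000
sample_means = [mean(random.uniform(0, 1) for _ in range(n))
                for _ in range(reps)]

empirical_se = stdev(sample_means)
theoretical_se = math.sqrt(1 / 12) / math.sqrt(n)  # sigma of U(0,1) is 1/sqrt(12)

print(f"empirical SE = {empirical_se:.4f}, theoretical = {theoretical_se:.4f}")
```

Plotting `sample_means` as a histogram would also show the bell shape the central limit theorem predicts, even though the population itself is flat.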

  19. Radiology illustrated. Uroradiology. 2. ed.

    International Nuclear Information System (INIS)

    Kim, Seung Hyup

    2012-01-01

    Uroradiology is an up-to-date, image-oriented reference in the style of a teaching file that has been designed specifically to be of value in clinical practice. All aspects of the imaging of urologic diseases are covered, and case studies illustrate the findings obtained with the relevant imaging modalities in both common and uncommon conditions. Most chapters focus on a particular clinical problem, but normal findings, congenital anomalies, and interventions are also discussed and illustrated. In this second edition, the range and quality of the illustrations have been enhanced, and many schematic drawings have been added to help readers memorize characteristic imaging findings through pattern recognition. The accompanying text is concise and informative. Besides serving as an outstanding aid to differential diagnosis, this book will provide a user-friendly review tool for certification or recertification in radiology. (orig.)

  20. Exploring physics concepts among novice teachers through CMAP tools

    Science.gov (United States)

    Suprapto, N.; Suliyanah; Prahani, B. K.; Jauhariyah, M. N. R.; Admoko, S.

    2018-03-01

    Concept maps are graphical tools for organising, elaborating, and representing knowledge. Using the Cmap Tools software, the understanding and hierarchical structuring of physics concepts among novice teachers can be explored. The software helps physics teachers indicate a physics context, focus questions, parking lots, cross-links, branching, hierarchy, and propositions. Using an exploratory quantitative study, a total of 13 concept maps on different physics topics created by novice physics teachers were analysed. The main differences between the lecturer's scoring and the peer teachers' scoring are also illustrated. The study offers some implications, especially for physics educators: determining the hierarchical structure of physics concepts, constructing a physics focus question, and seeing how a concept in one domain of knowledge represented on the map is related to a concept in another domain shown on the map.

  1. Effect of model choice and sample size on statistical tolerance limits

    International Nuclear Information System (INIS)

    Duran, B.S.; Campbell, K.

    1980-03-01

    Statistical tolerance limits are estimates of large (or small) quantiles of a distribution, quantities which are very sensitive to the shape of the tail of the distribution. The exact nature of this tail behavior cannot be ascertained from small samples, so statistical tolerance limits are frequently computed using a statistical model chosen on the basis of theoretical considerations or prior experience with similar populations. This report illustrates the effects of such choices on the computations.
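The sensitivity to model choice is easy to demonstrate: fitting two plausible models to the same small right-skewed sample can give very different upper quantiles. A sketch (the data and quantile level are illustrative, and proper tolerance limits would add a confidence adjustment on top of these point estimates):

```python
import math
from statistics import NormalDist, mean, stdev

# The same small, right-skewed sample, read through two different models.
sample = [1, 1, 2, 2, 3, 3, 4, 5, 8, 15]
z99 = NormalDist().inv_cdf(0.99)          # ~2.326

# Normal model: q99 = mean + z * sd
q_normal = mean(sample) + z99 * stdev(sample)

# Lognormal model: fit a normal to log(x), exponentiate the quantile
logs = [math.log(x) for x in sample]
q_lognormal = math.exp(mean(logs) + z99 * stdev(logs))

print(f"normal-model 99th percentile:    {q_normal:.1f}")
print(f"lognormal-model 99th percentile: {q_lognormal:.1f}")
```

Both models fit the bulk of these ten observations about equally well, yet their 99th-percentile estimates differ by roughly 60%, which is exactly the tail-shape sensitivity the report describes.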

  2. Matrix Tricks for Linear Statistical Models

    CERN Document Server

    Puntanen, Simo; Styan, George PH

    2011-01-01

    In teaching linear statistical models to first-year graduate students or to final-year undergraduate students there is no way to proceed smoothly without matrices and related concepts of linear algebra; their use is really essential. Our experience is that making some particular matrix tricks very familiar to students can substantially increase their insight into linear statistical models (and also multivariate statistical analysis). In matrix algebra, there are handy, sometimes even very simple "tricks" which simplify and clarify the treatment of a problem - both for the student and

  3. 48 CFR 9904.404-60 - Illustrations.

    Science.gov (United States)

    2010-10-01

    ... Section 9904.404-60 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.404-60 Illustrations. (a) Illustrations of costs which must be...

  4. 48 CFR 9904.402-60 - Illustrations.

    Science.gov (United States)

    2010-10-01

    ... Section 9904.402-60 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.402-60 Illustrations. (a) Illustrations of costs which are incurred...

  5. Linear mixed models a practical guide using statistical software

    CERN Document Server

    West, Brady T; Galecki, Andrzej T

    2006-01-01

    Simplifying the often confusing array of software programs for fitting linear mixed models (LMMs), Linear Mixed Models: A Practical Guide Using Statistical Software provides a basic introduction to primary concepts, notation, software implementation, model interpretation, and visualization of clustered and longitudinal data. This easy-to-navigate reference details the use of procedures for fitting LMMs in five popular statistical software packages: SAS, SPSS, Stata, R/S-plus, and HLM. The authors introduce basic theoretical concepts, present a heuristic approach to fitting LMMs based on bo

  6. Illustrations of the twin paradox

    International Nuclear Information System (INIS)

    Rebhan, E.

    1985-01-01

    In order to provide a more intuitive understanding of the twin paradox, several illustrations of this are presented. In one of these, each of the twins is equipped with a lamp whose monochromatic light can be observed by the other. In other illustrations the travelling twin uses an Einstein train instead of a space ship, all the cars of the train and all stations along the route of the train being equipped with clocks. (author)

  7. Considerations on a new fast extraction kicker concept for SPS

    CERN Document Server

    Barnes, M

    2010-01-01

    An alternative extraction kicker concept is investigated for the SPS, based on open C-type kickers and a fast-bumper system. The beam is moved into the kicker gap some tens of ms before extraction. The concept is illustrated in detail with the LSS4 extraction from the SPS – very similar parameters and considerations apply to LSS6. A similar concept could also be conceived for injection but is more difficult due to the larger beam size. The technical issues are presented and the potential impact on the machine impedance outlined.

  8. Bio-Inspired Multi-Functional Drug Transport Design Concept and Simulations.

    Science.gov (United States)

    Pidaparti, Ramana M; Cartin, Charles; Su, Guoguang

    2017-04-25

    In this study, we developed a microdevice concept for drug/fluidic transport, taking inspiration from the supramolecular motor found in biological cells. Specifically, an idealized multi-functional design geometry (nozzle/diffuser/nozzle) was developed for (i) fluidic/particle transport; (ii) particle separation; and (iii) droplet generation. Several design simulations were conducted to demonstrate the working principles of the multi-functional device. The design simulations illustrate that the proposed design concept is feasible for multi-functionality. However, further experimentation and optimization studies are needed to fully evaluate the multifunctional device concept for multiple applications.

  9. Beautiful Surfaces. Style and Substance in Florentius Schuyl's Illustrations for Descartes' Treatise on Man.

    Science.gov (United States)

    Chan, Eleanor

    2016-01-01

    The assumption that the Cartesian bête-machine is the invention of René Descartes (1596-1650) is rarely contested. Close examination of Descartes' texts proves that this is a concept founded not on his own writings, but on a subsequent critical interpretation, which developed and began to dominate his work after his death. Descartes' Treatise on Man, published posthumously in two rival editions, Florentius Schuyl's Latin translation De Homine (1662), and Claude Clerselier's Traité de l'homme, has proved particularly problematic. The surviving manuscript copies of the Treatise on Man left no illustrations, leaving both editors the daunting task of producing a set of images to accompany and clarify the fragmented text. In this intriguing case, the images can be seen to have spoken louder than the text which they illustrated. This paper assesses Schuyl's choice to represent Descartes' Man in a highly stylized manner, without superimposing Clerselier's intentions onto De Homine.

  10. Statistical convergence of double sequences in intuitionistic fuzzy normed spaces

    International Nuclear Information System (INIS)

    Mursaleen, M.; Mohiuddine, S.A.

    2009-01-01

    Recently, the concept of intuitionistic fuzzy normed spaces was introduced by Saadati and Park [Saadati R, Park JH. Chaos, Solitons and Fractals 2006;27:331-44]. Karakus et al. [Karakus S, Demirci K, Duman O. Chaos, Solitons and Fractals 2008;35:763-69] have quite recently studied the notion of statistical convergence for single sequences in intuitionistic fuzzy normed spaces. In this paper, we study the concept of statistically convergent and statistically Cauchy double sequences in intuitionistic fuzzy normed spaces. Furthermore, we construct an example of a double sequence to show that in IFNS statistical convergence does not imply convergence and our method of convergence even for double sequences is stronger than the usual convergence in intuitionistic fuzzy normed space.
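
    The notion at work here can be sketched in the ordinary normed-space setting (the intuitionistic fuzzy case replaces the norm with the membership and non-membership functions): a double sequence is statistically convergent when the "bad" index set has double natural density zero.

```latex
% Double natural density of an index set K \subseteq \mathbb{N}\times\mathbb{N}
\delta_2(K) = \lim_{m,n\to\infty}
  \frac{\left|\{(j,k)\in K : j\le m,\ k\le n\}\right|}{mn}

% A double sequence x = (x_{jk}) is statistically convergent to L
% (in the Pringsheim sense) if, for every \varepsilon > 0,
\delta_2\!\left(\{(j,k) : \|x_{jk} - L\| \ge \varepsilon\}\right) = 0
```

    Since a sparse set of indices can be "bad" without affecting the density, statistical convergence is strictly weaker than ordinary convergence, which is the point of the counterexample the paper constructs.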

  11. Vibronic coupling density and related concepts

    International Nuclear Information System (INIS)

    Sato, Tohru; Uejima, Motoyuki; Iwahara, Naoya; Haruta, Naoki; Shizu, Katsuyuki; Tanaka, Kazuyoshi

    2013-01-01

    Vibronic coupling density is derived from a general point of view as a one-electron property density. Related concepts as well as their applications are presented. Linear and nonlinear vibronic coupling density and related concepts, orbital vibronic coupling density, reduced vibronic coupling density, atomic vibronic coupling constant, and effective vibronic coupling density, illustrate the origin of vibronic couplings and enable us to design novel functional molecules or to elucidate chemical reactions. Transition dipole moment density is defined as an example of the one-electron property density. Vibronic coupling density and transition dipole moment density open a way to design light-emitting molecules with high efficiency.

  12. Toward Global Comparability of Sexual Orientation Data in Official Statistics: A Conceptual Framework of Sexual Orientation for Health Data Collection in New Zealand’s Official Statistics System

    Directory of Open Access Journals (Sweden)

    Frank Pega

    2013-01-01

    Full Text Available Objective. Effectively addressing health disparities experienced by sexual minority populations requires high-quality official data on sexual orientation. We developed a conceptual framework of sexual orientation to improve the quality of sexual orientation data in New Zealand’s Official Statistics System. Methods. We reviewed conceptual and methodological literature, culminating in a draft framework. To improve the framework, we held focus groups and key-informant interviews with sexual minority stakeholders and producers and consumers of official statistics. An advisory board of experts provided additional guidance. Results. The framework proposes working definitions of the sexual orientation topic and measurement concepts, describes dimensions of the measurement concepts, discusses variables framing the measurement concepts, and outlines conceptual grey areas. Conclusion. The framework proposes standard definitions and concepts for the collection of official sexual orientation data in New Zealand. It presents a model for producers of official statistics in other countries, who wish to improve the quality of health data on their citizens.

  13. Application of Ontology Technology in Health Statistic Data Analysis.

    Science.gov (United States)

    Guo, Minjiang; Hu, Hongpu; Lei, Xingyun

    2017-01-01

    Research Purpose: to establish a health management ontology for the analysis of health statistic data. Proposed Methods: this paper established a health management ontology based on an analysis of the concepts in the China Health Statistics Yearbook, and used Protégé to define the syntactic and semantic structure of health statistical data. Six classes of top-level ontology concepts and their subclasses were extracted, and object properties and data properties were defined to establish the construction of these classes. By ontology instantiation, we can integrate multi-source heterogeneous data and enable administrators to have an overall understanding and analysis of the health statistic data. Ontology technology provides a comprehensive and unified information integration structure for the health management domain and lays a foundation for the efficient analysis of multi-source, heterogeneous health system management data and enhancement of management efficiency.

  14. Statistical and signal-processing concepts in surface metrology

    International Nuclear Information System (INIS)

    Church, E.L.; Takacs, P.Z.

    1986-03-01

    This paper proposes the use of a simple two-scale model of surface roughness for testing and specifying the topographic figure and finish of synchrotron-radiation mirrors. In this approach the effects of figure and finish are described in terms of their slope distribution and power spectrum, respectively, which are then combined with the system point spread function to produce a composite image. The result can be used to predict mirror performance or to translate design requirements into manufacturing specifications. Pacing problems in this approach are the development of a practical long-trace slope-profiling instrument and realistic statistical models for figure and finish errors
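
    The finish half of the two-scale model is characterized by a power spectrum, which for a measured profile can be estimated with a periodogram. A minimal sketch with a synthetic profile (all numbers hypothetical):

```python
import numpy as np

# Hypothetical 1 mm profile sampled every 1 um: a 5 cycles/mm sinusoidal
# finish error plus weak random roughness.
dx = 1e-3                       # sample spacing in mm
n = 1000
x = np.arange(n) * dx
rng = np.random.default_rng(0)
profile = 0.01 * np.sin(2 * np.pi * 5.0 * x) + 0.001 * rng.standard_normal(n)

# Windowed one-sided periodogram estimate of the power spectrum
window = np.hanning(n)
spec = np.fft.rfft(profile * window)
psd = 2.0 * dx * np.abs(spec) ** 2 / np.sum(window ** 2)
freqs = np.fft.rfftfreq(n, d=dx)        # spatial frequency in cycles/mm

peak = freqs[1:][np.argmax(psd[1:])]    # ignore the DC bin
print(f"dominant spatial frequency: {peak:.1f} cycles/mm")
```

    The figure half of the model would instead be summarized by the slope distribution, e.g. the rms of `np.diff(profile) / dx`.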

  15. Statistical and signal-processing concepts in surface metrology

    Energy Technology Data Exchange (ETDEWEB)

    Church, E.L.; Takacs, P.Z.

    1986-03-01

    This paper proposes the use of a simple two-scale model of surface roughness for testing and specifying the topographic figure and finish of synchrotron-radiation mirrors. In this approach the effects of figure and finish are described in terms of their slope distribution and power spectrum, respectively, which are then combined with the system point spread function to produce a composite image. The result can be used to predict mirror performance or to translate design requirements into manufacturing specifications. Pacing problems in this approach are the development of a practical long-trace slope-profiling instrument and realistic statistical models for figure and finish errors.

  16. How Do Korsakoff Patients Learn New Concepts?

    Science.gov (United States)

    Pitel, Anne Lise; Beaunieux, Helene; Guillery-Girard, Berengere; Witkowski, Thomas; de la Sayette, Vincent; Viader, Fausto; Desgranges, Beatrice; Eustache, Francis

    2009-01-01

    The goal of the present investigation was to assess semantic learning in Korsakoff patients (KS), compared with uncomplicated alcoholics (AL) and control subjects (CS), taking the nature of the information to-be-learned and the episodic memory profiles of the three groups into account. Ten new complex concepts, each illustrated by a photo and…

  17. Using cognitive concept mapping to understand what health care means to the elderly: an illustrative approach for planning and marketing.

    Science.gov (United States)

    Shewchuk, Richard; O'Connor, Stephen J

    2002-01-01

    This article describes a process that can be used for eliciting and systematically organizing perceptions held by key stakeholders. An example using a limited sample of older Medicare recipients is developed to illustrate how this approach can be used. Initially, a nominal group technique (NGT) meeting was conducted to identify an array of health care issues that were perceived as important by this group. These perceptions were then used as stimuli to develop an unforced card-sort task. Data from the card sorts were analyzed using multidimensional scaling and hierarchical cluster analysis to demonstrate how the qualitative input of participants can be organized. The results of these analyses are described to illustrate an example of an interpretive framework that might be used when seeking input from relevant constituents. Suggestions for how this process might be extended to health care planning/marketing efforts are provided.
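
    The MDS step of such an analysis can be sketched with classical (Torgerson) scaling, here applied to a hypothetical card-sort dissimilarity matrix (1 minus the proportion of participants who sorted a pair of items into the same pile); the hierarchical-clustering step would typically run on the same matrix.

```python
import numpy as np

def classical_mds(d, k=2):
    """Classical (Torgerson) MDS: coordinates from a dissimilarity matrix."""
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    b = -0.5 * j @ (d ** 2) @ j              # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(b)
    top = np.argsort(vals)[::-1][:k]         # k largest eigenvalues
    return vecs[:, top] * np.sqrt(np.maximum(vals[top], 0.0))

# Hypothetical dissimilarities for five health-care items:
# 1 - proportion of participants sorting the pair into the same pile.
diss = np.array([
    [0.0, 0.2, 0.9, 0.8, 0.9],
    [0.2, 0.0, 0.8, 0.9, 0.8],
    [0.9, 0.8, 0.0, 0.3, 0.4],
    [0.8, 0.9, 0.3, 0.0, 0.2],
    [0.9, 0.8, 0.4, 0.2, 0.0],
])
coords = classical_mds(diss)
print(coords.shape)   # one 2-D point per item
```

    Items that participants frequently sorted together (here items 1-2 and items 3-5) end up close in the resulting map, which is what makes the plot interpretable by planners.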

  18. Triemli hospital, Zuerich: a future-oriented building concept; Fuer ein zukunftsweisendes Gebaeudekonzept

    Energy Technology Data Exchange (ETDEWEB)

    Kaelin, W.; Sigg, R.

    2004-07-01

    This article discusses the results of the work of five teams on the development of a sustainable concept concerning energy and technology. The chosen concept is to be implemented during the refurbishment and extension work at the 'Triemli' hospital complex in Zuerich, Switzerland. Work on the energy and sustainability concept was given priority over the architectural concept. The various stages of the competition, including pre-qualification and the presentation of the studies are discussed. The results of intensive analysis and evaluation of the concepts are presented. The winning project stressed long-term planning aspects and well-thought-out concepts. The main features of the concept chosen are summarised with the help of diagrams and illustrations.

  19. The Potential of Threshold Concepts: An Emerging Framework for Educational Research and Practice

    Science.gov (United States)

    Lucas, Ursula; Mladenovic, Rosina

    2007-01-01

    This paper explores the notion of a "threshold concept" and discusses its possible implications for higher education research and practice. Using the case of introductory accounting as an illustration, it is argued that the idea of a threshold concept provides an emerging theoretical framework for a "re-view" of educational…

  20. Chemistry, Poetry, and Artistic Illustration: An Interdisciplinary Approach to Teaching and Promoting Chemistry

    Science.gov (United States)

    Furlan, Ping Y.; Kitson, Herbert; Andes, Cynthia

    2007-10-01

    This article describes a successful interdisciplinary collaboration among chemistry, humanities, and English faculty members, who utilized poetry and artistic illustration to help students learn, appreciate, and enjoy chemistry. Students taking general chemistry classes were introduced to poetry writing and museum-type poster preparation during one class period. They were then encouraged to use their imagination and creativity to brainstorm and write chemistry poems or humorous pieces on the concepts and principles covered in the chemistry classes, and to illustrate their original work artistically on posters. The project, 2-3 months in length, was perceived by students as effective at helping them learn chemistry and express their understanding in a fun, personal, and creative way. The instructors found that students took the directives to heart: many posters were witty, clever, and eye-catching. They showed fresh use of language and revealed a good understanding of chemistry. The top posters were created by a mix of A-, B-, and C-level students. The fine artwork, coupled with poetry, helped chemistry come alive on campus, providing an aesthetic presentation of materials that engaged the general viewer.

  1. Statistics and analysis of scientific data

    CERN Document Server

    Bonamente, Massimiliano

    2013-01-01

    Statistics and Analysis of Scientific Data covers the foundations of probability theory and statistics, and a number of numerical and analytical methods that are essential for the present-day analyst of scientific data. Topics covered include probability theory, distribution functions of statistics, fits to two-dimensional datasets and parameter estimation, Monte Carlo methods and Markov chains. Equal attention is paid to the theory and its practical application, and results from classic experiments in various fields are used to illustrate the importance of statistics in the analysis of scientific data. The main pedagogical method is a theory-then-application approach, where emphasis is placed first on a sound understanding of the underlying theory of a topic, which becomes the basis for an efficient and proactive use of the material for practical applications. The level is appropriate for undergraduates and beginning graduate students, and as a reference for the experienced researcher. Basic calculus is us...

  2. Fundamental Concepts in Biophysics Volume 1

    CERN Document Server

    Jue, Thomas

    2009-01-01

    HANDBOOK OF MODERN BIOPHYSICS Series Editor Thomas Jue, PhD Handbook of Modern Biophysics brings current biophysics topics into focus, so that biology, medical, engineering, mathematics, and physical-science students or researchers can learn fundamental concepts and the application of new techniques in addressing biomedical challenges. Chapters explicate the conceptual framework of the physics formalism and illustrate the biomedical applications. With the addition of problem sets, guides to further study, and references, the interested reader can continue to explore independently the ideas presented. Volume I: Fundamental Concepts in Biophysics Editor Thomas Jue, PhD In Fundamental Concepts in Biophysics, prominent professors have established a foundation for the study of biophysics related to the following topics: Mathematical Methods in Biophysics Quantum Mechanics Basic to Biophysical Methods Computational Modeling of Receptor–Ligand Binding and Cellular Signaling Processes Fluorescence Spectroscopy Elec...

  3. Concept of dynamic memory in economics

    Science.gov (United States)

    Tarasova, Valentina V.; Tarasov, Vasily E.

    2018-02-01

    In this paper we discuss a concept of dynamic memory and an application of fractional calculus to describe the dynamic memory. The concept of memory is considered from the standpoint of economic models in the framework of continuous time approach based on fractional calculus. We also describe some general restrictions that can be imposed on the structure and properties of dynamic memory. These restrictions include the following three principles: (a) the principle of fading memory; (b) the principle of memory homogeneity on time (the principle of non-aging memory); (c) the principle of memory reversibility (the principle of memory recovery). Examples of different memory functions are suggested by using the fractional calculus. To illustrate an application of the concept of dynamic memory in economics we consider a generalization of the Harrod-Domar model, where the power-law memory is taken into account.
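
    The power-law memory mentioned here is commonly modelled by the Riemann-Liouville fractional integral, whose kernel weights past states by a fading power law; the following is a standard construction from the fractional-calculus literature, sketched for orientation:

```latex
% Fractional integral of order \alpha > 0: accumulation over the past
% with power-law memory function M(t-\tau) = (t-\tau)^{\alpha-1}/\Gamma(\alpha)
(I^{\alpha} f)(t) = \frac{1}{\Gamma(\alpha)}
  \int_0^{t} (t-\tau)^{\alpha-1}\, f(\tau)\, d\tau
```

    For \alpha = 1 the kernel is constant and ordinary (memoryless-rate) accumulation is recovered; for 0 < \alpha < 1 the kernel decays with the time lag, giving the fading memory of principle (a).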

  4. Appropriate statistical methods are required to assess diagnostic tests for replacement, add-on, and triage

    NARCIS (Netherlands)

    Hayen, Andrew; Macaskill, Petra; Irwig, Les; Bossuyt, Patrick

    2010-01-01

    To explain which measures of accuracy and which statistical methods should be used in studies to assess the value of a new binary test as a replacement test, an add-on test, or a triage test. Selection and explanation of statistical methods, illustrated with examples. Statistical methods for

  5. Instrumental Neutron Activation Analysis and Multivariate Statistics for Pottery Provenance

    Science.gov (United States)

    Glascock, M. D.; Neff, H.; Vaughn, K. J.

    2004-06-01

    The application of instrumental neutron activation analysis and multivariate statistics to archaeological studies of ceramics and clays is described. A small pottery data set from the Nasca culture in southern Peru is presented for illustration.
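
    The multivariate step of such a provenance study often amounts to assigning an unknown sherd to the chemically nearest reference group, e.g. by Mahalanobis distance to each group centroid. A sketch with hypothetical element concentrations (not the Nasca data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical concentrations of three elements (e.g. Fe, Rb, Sr; arbitrary
# units) measured by NAA for sherds of known origin from two clay sources.
source_a = rng.normal([5.0, 120.0, 300.0], [0.3, 8.0, 20.0], size=(40, 3))
source_b = rng.normal([6.5, 90.0, 250.0], [0.3, 8.0, 20.0], size=(40, 3))

def mahalanobis_sq(x, group):
    """Squared Mahalanobis distance from x to the centroid of a group."""
    mu = group.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(group, rowvar=False))
    diff = x - mu
    return float(diff @ cov_inv @ diff)

sherd = np.array([5.1, 118.0, 310.0])   # unknown pot, chemically near source A
d_a = mahalanobis_sq(sherd, source_a)
d_b = mahalanobis_sq(sherd, source_b)
print("assigned to source", "A" if d_a < d_b else "B")
```

    Unlike Euclidean distance, the Mahalanobis form accounts for the correlations and differing scales of the element concentrations, which is why it is the usual choice for compositional group assignment.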

  6. Instrumental Neutron Activation Analysis and Multivariate Statistics for Pottery Provenance

    International Nuclear Information System (INIS)

    Glascock, M. D.; Neff, H.; Vaughn, K. J.

    2004-01-01

    The application of instrumental neutron activation analysis and multivariate statistics to archaeological studies of ceramics and clays is described. A small pottery data set from the Nasca culture in southern Peru is presented for illustration.

  7. Instrumental Neutron Activation Analysis and Multivariate Statistics for Pottery Provenance

    Energy Technology Data Exchange (ETDEWEB)

    Glascock, M. D.; Neff, H. [University of Missouri, Research Reactor Center (United States); Vaughn, K. J. [Pacific Lutheran University, Department of Anthropology (United States)

    2004-06-15

    The application of instrumental neutron activation analysis and multivariate statistics to archaeological studies of ceramics and clays is described. A small pottery data set from the Nasca culture in southern Peru is presented for illustration.

  8. SDI: Statistical dynamic interactions

    International Nuclear Information System (INIS)

    Blann, M.; Mustafa, M.G.; Peilert, G.; Stoecker, H.; Greiner, W.

    1991-01-01

    We focus on the combined statistical and dynamical aspects of heavy ion induced reactions. The overall picture is illustrated by considering the reaction ³⁶Ar + ²³⁸U at a projectile energy of 35 MeV/nucleon. We illustrate the time dependent bound excitation energy due to the fusion/relaxation dynamics as calculated with the Boltzmann master equation. An estimate of the mass, charge and excitation of an equilibrated nucleus surviving the fast (dynamic) fusion-relaxation process is used as input into an evaporation calculation which includes 20 heavy fragment exit channels. The distribution of excitations between residue and clusters is explicitly calculated, as is the further deexcitation of clusters to bound nuclei. These results are compared with the exclusive cluster multiplicity measurements of Kim et al., and are found to give excellent agreement. We also consider an equilibrated residue system at 25% lower initial excitation, which gives an unsatisfactory exclusive multiplicity distribution. This illustrates that exclusive fragment multiplicity may provide a thermometer for system excitation. This analysis of data involves successive binary decay with no compressional effects nor phase transitions. Several examples of primary versus final (stable) cluster decay probabilities for an A = 100 nucleus at excitations of 100 to 800 MeV are presented. From these results a large change in multifragmentation patterns may be understood as a simple phase space consequence, invoking neither phase transitions, nor equation of state information. These results are used to illustrate physical quantities which are ambiguous to deduce from experimental fragment measurements. 14 refs., 4 figs

  9. Concepts for fusion fuel production blankets

    International Nuclear Information System (INIS)

    Gierszewski, P.

    1986-06-01

    The fusion blanket surrounds the burning hydrogen core of the fusion reactor. It is in this blanket that most of the energy released by the DT fusion reaction is converted into a usable product, and where tritium fuel is produced to enable further operation of the reactor. Blankets will involve new materials, conditions and processes. Several recent fusion blanket concepts are presented to illustrate the range of ideas.

  10. "Illustrating the Machinery of Life": Viruses

    Science.gov (United States)

    Goodsell, David S.

    2012-01-01

    Data from electron microscopy, X-ray crystallography, and biophysical analysis are used to create illustrations of viruses in their cellular context. This report describes the scientific data and artistic methods used to create three illustrations: a depiction of the poliovirus lifecycle, budding of influenza virus from a cell surface, and a…

  11. Statistical theory and inference

    CERN Document Server

    Olive, David J

    2014-01-01

    This text is for a one-semester graduate course in statistical theory and covers minimal and complete sufficient statistics, maximum likelihood estimators, method of moments, bias and mean square error, uniform minimum variance estimators and the Cramer-Rao lower bound, an introduction to large sample theory, likelihood ratio tests and uniformly most powerful tests and the Neyman-Pearson Lemma. A major goal of this text is to make these topics much more accessible to students by using the theory of exponential families. Exponential families, indicator functions and the support of the distribution are used throughout the text to simplify the theory. More than 50 "brand name" distributions are used to illustrate the theory with many examples of exponential families, maximum likelihood estimators and uniformly minimum variance unbiased estimators. There are many homework problems with over 30 pages of solutions.
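
    The exponential-family device the text relies on can be stated compactly: a density belongs to a k-parameter exponential family when it factors as

```latex
f(x \mid \theta) = h(x)\, c(\theta)\,
  \exp\!\Big( \sum_{i=1}^{k} w_i(\theta)\, t_i(x) \Big)

% Example: the Poisson(\theta) pmf is a one-parameter exponential family with
% h(x) = 1/x!,\quad c(\theta) = e^{-\theta},\quad
% w_1(\theta) = \log\theta,\quad t_1(x) = x,
% so T = \sum_i X_i is a complete sufficient statistic for a random sample.
```

    Once a distribution is written in this form, sufficiency, completeness and UMVUE results follow almost mechanically, which is what makes the approach so accessible.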

  12. Statistical methods in spatial genetics

    DEFF Research Database (Denmark)

    Guillot, Gilles; Leblois, Raphael; Coulon, Aurelie

    2009-01-01

    The joint analysis of spatial and genetic data is rapidly becoming the norm in population genetics. More and more studies explicitly describe and quantify the spatial organization of genetic variation and try to relate it to underlying ecological processes. As it has become increasingly difficult to keep abreast of the latest methodological developments, we review the statistical toolbox available to analyse population genetic data in a spatially explicit framework. We mostly focus on statistical concepts but also discuss practical aspects of the analytical methods, highlighting not only...

  13. Using Carbon Emissions Data to "Heat Up" Descriptive Statistics

    Science.gov (United States)

    Brooks, Robert

    2012-01-01

    This article illustrates using carbon emissions data in an introductory statistics assignment. The carbon emissions data has desirable characteristics including: choice of measure; skewness; and outliers. These complexities allow research and public policy debate to be introduced. (Contains 4 figures and 2 tables.)
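
    The skewness and outliers such an assignment exploits can be computed with nothing beyond the Python standard library; the figures below are hypothetical per-capita emissions, not the article's data.

```python
import statistics

# Hypothetical per-capita CO2 emissions (tonnes/year) for ten countries
emissions = [0.3, 0.9, 1.7, 2.1, 4.4, 5.2, 6.8, 8.1, 15.6, 32.4]

mean = statistics.mean(emissions)
median = statistics.median(emissions)
sd = statistics.stdev(emissions)

# Adjusted Fisher-Pearson sample skewness
n = len(emissions)
m3 = sum((x - mean) ** 3 for x in emissions) / n
skew = n ** 2 / ((n - 1) * (n - 2)) * m3 / sd ** 3

# 1.5 * IQR rule for flagging outliers
q1, _, q3 = statistics.quantiles(emissions, n=4)
iqr = q3 - q1
outliers = [x for x in emissions if not q1 - 1.5 * iqr <= x <= q3 + 1.5 * iqr]

print(f"mean={mean:.2f}  median={median:.2f}  skewness={skew:.2f}")
print("outliers:", outliers)
```

    The mean sitting well above the median and the positive skewness flag the heavy right tail, and whether the largest emitter is a nuisance outlier or the whole story is precisely the public-policy hook the article describes.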

  14. Recurrent Idiopathic Catatonia: Implications beyond the Diagnostic and Statistical Manual of Mental Disorders 5th Edition.

    Science.gov (United States)

    Caroff, Stanley N; Hurford, Irene; Bleier, Henry R; Gorton, Gregg E; Campbell, E Cabrina

    2015-08-31

    We describe a case of recurrent, life-threatening, catatonic stupor, without evidence of any associated medical, toxic or mental disorder. This case provides support for the inclusion of a separate category of "unspecified catatonia" in the Diagnostic and Statistical Manual of Mental Disorders 5th edition (DSM-5) to be used to classify idiopathic cases, which appears to be consistent with Kahlbaum's concept of catatonia as a distinct disease state. But beyond the limited, cross-sectional, syndromal approach adopted in DSM-5, this case more importantly illustrates the prognostic and therapeutic significance of the longitudinal course of illness in differentiating cases of catatonia, which is better defined in the Wernicke-Kleist-Leonhard classification system. The importance of differentiating cases of catatonia is further supported by the efficacy of antipsychotics in treatment of this case, contrary to conventional guidelines.

  15. Advanced Level Physics Students' Conceptions of Quantum Physics.

    Science.gov (United States)

    Mashhadi, Azam

    This study addresses questions about particle physics that focus on the nature of electrons. Speculations as to whether they are more like particles or waves or like neither illustrate the difficulties with which students are confronted when trying to incorporate the concepts of quantum physics into their overall conceptual framework. Such…

  16. Content-based VLE designs improve learning efficiency in constructivist statistics education.

    Science.gov (United States)

    Wessa, Patrick; De Rycker, Antoon; Holliday, Ian Edward

    2011-01-01

    We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific-purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student population under investigation. The findings demonstrate that a

  17. Content-based VLE designs improve learning efficiency in constructivist statistics education.

    Directory of Open Access Journals (Sweden)

    Patrick Wessa

    Full Text Available BACKGROUND: We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses, which required us to develop a specific-purpose Statistical Learning Environment (SLE based on Reproducible Computing and newly developed Peer Review (PR technology. OBJECTIVES: The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. METHODS: Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. RESULTS: The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student

  18. Content-Based VLE Designs Improve Learning Efficiency in Constructivist Statistics Education

    Science.gov (United States)

    Wessa, Patrick; De Rycker, Antoon; Holliday, Ian Edward

    2011-01-01

    Background We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific–purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. Objectives The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. Methods Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. Results The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student population under

  19. Enhancing an Undergraduate Business Statistics Course: Linking Teaching and Learning with Assessment Issues

    Science.gov (United States)

    Fairfield-Sonn, James W.; Kolluri, Bharat; Rogers, Annette; Singamsetti, Rao

    2009-01-01

This paper examines several ways in which teaching effectiveness and student learning in an undergraduate Business Statistics course can be enhanced. First, we review some key concepts in Business Statistics that are often challenging to teach and show how using real data sets assists students in developing a deeper understanding of the concepts.…

  20. Thermal and statistical properties of nuclei and nuclear systems

    International Nuclear Information System (INIS)

    Moretto, L.G.; Wozniak, G.J.

    1989-07-01

Terms such as statistical decay, statistical or thermodynamic equilibrium, thermalization, and temperature have been used in nuclear physics since the introduction of the compound nucleus (CN) concept, and they are still used, perhaps even more frequently, in the context of intermediate- and high-energy heavy-ion reactions. Unfortunately, the increased popularity of these terms has not made them any clearer, and more often than not one encounters sweeping statements about the alleged statisticity of a nuclear process where the "statistical" connotation is a more apt description of the state of the speaker's mind than of the nuclear reaction. It is our goal, in this short set of lectures, to set at least some ideas straight on this broad and beautiful subject, on the one hand by clarifying some fundamental concepts, and on the other by presenting some interesting applications to actual physical cases. 74 refs., 38 figs

  1. The Scientist as Illustrator.

    Science.gov (United States)

    Iwasa, Janet H

    2016-04-01

    Proficiency in art and illustration was once considered an essential skill for biologists, because text alone often could not suffice to describe observations of biological systems. With modern imaging technology, it is no longer necessary to illustrate what we can see by eye. However, in molecular and cellular biology, our understanding of biological processes is dependent on our ability to synthesize diverse data to generate a hypothesis. Creating visual models of these hypotheses is important for generating new ideas and for communicating to our peers and to the public. Here, I discuss the benefits of creating visual models in molecular and cellular biology and consider steps to enable researchers to become more effective visual communicators. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Pedagogical Uses of the Public Goods Concept in Economics.

    Science.gov (United States)

    Kiesling, Herbert J.

    1990-01-01

    Describes some of the relatively unknown aspects of the concept of public goods and shows how they might be brought into undergraduate textbooks in microeconomic principles, public finance, and welfare economics. Illustrates how these aspects of public goods can be brought into undergraduate instruction. (DB)

  3. Break preclusion concept and its application to the EPR™ reactor

    Energy Technology Data Exchange (ETDEWEB)

    Chapuliot, S., E-mail: stephane.chapuliot@areva.com; Migné, C.

    2014-04-01

This paper provides a synthesis of the technical basis supporting the break preclusion concept and its implementation on the Main Coolant Lines and Main Steam Lines of the EPR™ reactor. In a first step, it describes the background of the break preclusion concept, and then it details the requirements associated with its implementation in a Defense In Depth approach. In a second step, the main benefits and a few illustrative examples are given for the MCL.

  4. Statistical Process Control: Going to the Limit for Quality.

    Science.gov (United States)

    Training, 1987

    1987-01-01

    Defines the concept of statistical process control, a quality control method used especially in manufacturing. Generally, concept users set specific standard levels that must be met. Makes the point that although employees work directly with the method, management is responsible for its success within the plant. (CH)

  5. Line identification studies using traditional techniques and wavelength coincidence statistics

    International Nuclear Information System (INIS)

    Cowley, C.R.; Adelman, S.J.

    1990-01-01

Traditional line identification techniques result in the assignment of individual lines to an atomic or ionic species. These methods may be supplemented by wavelength coincidence statistics (WCS). The strengths and weaknesses of these methods are discussed using spectra of a number of normal and peculiar B and A stars that have been studied independently by both methods. The present results support the overall findings of some earlier studies. WCS would be most useful in a first survey, before traditional methods have been applied. WCS can quickly make a global search for all species and in this way may enable identification of an unexpected spectrum that could easily be omitted entirely from a traditional study. This is illustrated by O I. WCS is subject to the well-known weaknesses of any statistical technique; for example, a predictable number of spurious results are to be expected. The dangers of small-number statistics are illustrated. WCS is at its best relative to traditional methods in finding a line-rich atomic species that is only weakly present in a complicated stellar spectrum

  6. Statistics for mathematicians a rigorous first course

    CERN Document Server

    Panaretos, Victor M

    2016-01-01

    This textbook provides a coherent introduction to the main concepts and methods of one-parameter statistical inference. Intended for students of Mathematics taking their first course in Statistics, the focus is on Statistics for Mathematicians rather than on Mathematical Statistics. The goal is not to focus on the mathematical/theoretical aspects of the subject, but rather to provide an introduction to the subject tailored to the mindset and tastes of Mathematics students, who are sometimes turned off by the informal nature of Statistics courses. This book can be used as the basis for an elementary semester-long first course on Statistics with a firm sense of direction that does not sacrifice rigor. The deeper goal of the text is to attract the attention of promising Mathematics students.

  7. Quantum information theory and quantum statistics

    International Nuclear Information System (INIS)

    Petz, D.

    2008-01-01

    Based on lectures given by the author, this book focuses on providing reliable introductory explanations of key concepts of quantum information theory and quantum statistics - rather than on results. The mathematically rigorous presentation is supported by numerous examples and exercises and by an appendix summarizing the relevant aspects of linear analysis. Assuming that the reader is familiar with the content of standard undergraduate courses in quantum mechanics, probability theory, linear algebra and functional analysis, the book addresses graduate students of mathematics and physics as well as theoretical and mathematical physicists. Conceived as a primer to bridge the gap between statistical physics and quantum information, a field to which the author has contributed significantly himself, it emphasizes concepts and thorough discussions of the fundamental notions to prepare the reader for deeper studies, not least through the selection of well chosen exercises. (orig.)

  8. Probabilistic and Statistical Aspects of Quantum Theory

    CERN Document Server

    Holevo, Alexander S

    2011-01-01

    This book is devoted to aspects of the foundations of quantum mechanics in which probabilistic and statistical concepts play an essential role. The main part of the book concerns the quantitative statistical theory of quantum measurement, based on the notion of positive operator-valued measures. During the past years there has been substantial progress in this direction, stimulated to a great extent by new applications such as Quantum Optics, Quantum Communication and high-precision experiments. The questions of statistical interpretation, quantum symmetries, theory of canonical commutation re

  9. Statistical weighted A-summability with application to Korovkin’s type approximation theorem

    Directory of Open Access Journals (Sweden)

    Syed Abdul Mohiuddine

    2016-03-01

We introduce the notion of statistical weighted A-summability of a sequence and establish its relation with weighted A-statistical convergence. We also define a weighted regular matrix and obtain necessary and sufficient conditions for the matrix A to be weighted regular. As an application, we prove the Korovkin type approximation theorem through statistical weighted A-summability, using the BBH operator to construct an illustrative example in support of our result.

  10. Experimental statistics for biological sciences.

    Science.gov (United States)

    Bang, Heejung; Davidian, Marie

    2010-01-01

In this chapter, we cover basic and fundamental principles and methods in statistics - from "What are Data and Statistics?" to "ANOVA and linear regression," which are the basis of any statistical thinking and undertaking. Readers can easily find the selected topics in most introductory statistics textbooks, but we have tried to assemble and structure them in a succinct and reader-friendly manner in a stand-alone chapter. This text has long been used in real classroom settings for both undergraduate and graduate students who do or do not major in statistical sciences. We hope that from this chapter, readers will understand the key statistical concepts and terminologies, how to design a study (experimental or observational), how to analyze the data (e.g., describe the data and/or estimate the parameter(s) and make inference), and how to interpret the results. This text would be most useful as supplemental material while readers take their own statistics courses, or as a reference text accompanying a manual for any statistical software as a self-teaching guide.

  11. Statistical methods for data analysis in particle physics

    CERN Document Server

    Lista, Luca

    2015-01-01

This concise set of course-based notes provides the reader with the main concepts and tools to perform statistical analysis of experimental data, in particular in the field of high-energy physics (HEP). First, an introduction to probability theory and basic statistics is given, mainly as a reminder from advanced undergraduate studies, but also with a view to clearly distinguishing the frequentist and Bayesian approaches and interpretations in subsequent applications. More advanced concepts and applications are gradually introduced, culminating in the chapter on upper limits, as many applications in HEP concern hypothesis testing, where often the main goal is to provide better and better limits so as eventually to be able to distinguish between competing hypotheses or to rule out some of them altogether. Many worked examples will help newcomers to the field and graduate students to understand the pitfalls in applying theoretical concepts to actual data

  12. Statistical evaluation of vibration analysis techniques

    Science.gov (United States)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
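The detection-versus-false-alarm tradeoff mentioned in this abstract can be traced with a short simulation. This is a hedged sketch: the Gaussian feature model, the means and standard deviations, and the threshold sweep below are invented for illustration and are not taken from the paper.

```python
import random

random.seed(42)
N = 5000

# Hypothetical vibration features (e.g. an RMS amplitude) for machinery with
# and without a fault; the fault is assumed to shift the feature's mean upward.
healthy = [random.gauss(1.0, 0.3) for _ in range(N)]
faulty = [random.gauss(1.6, 0.3) for _ in range(N)]

def rates(threshold):
    """Probability of detection and of false alarm at a given threshold."""
    pd = sum(f > threshold for f in faulty) / N
    pfa = sum(h > threshold for h in healthy) / N
    return pd, pfa

# Sweep the detection threshold to trace the ROC curve.
points = [rates(t / 20) for t in range(0, 61)]  # thresholds 0.00 .. 3.00

# Trapezoidal area under the ROC curve summarizes detection performance.
pts = sorted((pfa, pd) for pd, pfa in points)
auc = sum((x2 - x1) * (y1 + y2) / 2 for (x1, y1), (x2, y2) in zip(pts, pts[1:]))
print(f"AUC = {auc:.3f}")
```

Each threshold yields one (Pfa, Pd) point; comparing the resulting curves (or their areas) is one standard way to quantify and compare the detection performance of diverse techniques.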

  13. Simulation-Based Performance Assessment: An Innovative Approach to Exploring Understanding of Physical Science Concepts

    Science.gov (United States)

    Gale, Jessica; Wind, Stefanie; Koval, Jayma; Dagosta, Joseph; Ryan, Mike; Usselman, Marion

    2016-01-01

    This paper illustrates the use of simulation-based performance assessment (PA) methodology in a recent study of eighth-grade students' understanding of physical science concepts. A set of four simulation-based PA tasks were iteratively developed to assess student understanding of an array of physical science concepts, including net force,…

  14. Symmetry and statistics

    International Nuclear Information System (INIS)

    French, J.B.

    1974-01-01

    The concepts of statistical behavior and symmetry are presented from the point of view of many body spectroscopy. Remarks are made on methods for the evaluation of moments, particularly widths, for the purpose of giving a feeling for the types of mathematical structures encountered. Applications involving ground state energies, spectra, and level densities are discussed. The extent to which Hamiltonian eigenstates belong to irreducible representations is mentioned. (4 figures, 1 table) (U.S.)

  15. Energy, Entropy and Exergy Concepts and Their Roles in Thermal Engineering

    OpenAIRE

    Dincer, Ibrahim; Cengel, Yunus A.

    2001-01-01

Abstract: Energy, entropy and exergy concepts come from thermodynamics and are applicable to all fields of science and engineering. This article therefore intends to provide background for a better understanding of these concepts and their differences among various classes of life support systems, with diverse coverage. It also covers the basic principles, general definitions, and practical applications and implications. Some illustrative examples are presented to highlight the importance of t...

  16. Exploring the Concept of (Un)familiarity

    DEFF Research Database (Denmark)

    Andersen, D. J.

    2013-01-01

In border region studies, the concept of (un)familiarity is applied in empirical studies of consumer culture across borders, illustrating how feelings of unfamiliarity can have an off-putting influence on cross-border interaction (e.g. because of dislike of or lack of attraction to the other side) at the same time as it can be an incentive for people living at borders to cross them (e.g. to explore the exotic other side). The concept's explanatory scope has thus far responded to the normative claim that a borderless Europe encourages and increases mobility. However, in previous studies applying the concept of (un)familiarity, an explanatory problem remains concerning people's unarticulated and perhaps deeper reasons for mobility and lack thereof. This leaves a question mark as to why feelings of (un)familiarity occur in the first place, as well as the actual degree to which they constitute barriers

  17. Illustrating, Quantifying, and Correcting for Bias in Post-hoc Analysis of Gene-Based Rare Variant Tests of Association

    Science.gov (United States)

    Grinde, Kelsey E.; Arbet, Jaron; Green, Alden; O'Connell, Michael; Valcarcel, Alessandra; Westra, Jason; Tintle, Nathan

    2017-01-01

To date, gene-based rare variant testing approaches have focused on aggregating information across sets of variants to maximize statistical power in identifying genes showing significant association with diseases. Beyond identifying genes that are associated with diseases, the identification of causal variant(s) in those genes and estimation of their effect is crucial for planning replication studies and characterizing the genetic architecture of the locus. However, we illustrate that straightforward single-marker association statistics can suffer from substantial bias introduced by conditioning on gene-based test significance, due to the phenomenon often referred to as "winner's curse." We illustrate the ramifications of this bias on variant effect size estimation and variant prioritization/ranking approaches, outline parameters of genetic architecture that affect this bias, and propose a bootstrap resampling method to correct for it. We find that our correction method significantly reduces the bias due to winner's curse (average two-fold decrease in bias) and improves inference in post-hoc analysis of gene-based tests under a wide variety of genetic architectures. PMID:28959274
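The winner's-curse phenomenon and the flavor of a bootstrap correction can be sketched in a toy simulation. All parameters, the one-sided selection rule, and the correction below are invented for illustration; this is not the authors' actual method or data.

```python
import random
import statistics

random.seed(1)
TRUE_EFFECT, SD, N = 0.2, 1.0, 50
CRIT = 1.96 * SD / N ** 0.5  # one-sided significance threshold on the sample mean

# Simulate many small studies of the same true effect and keep only the
# "significant" ones, as a gene-based screen would.
selected = []
for _ in range(1000):
    sample = [random.gauss(TRUE_EFFECT, SD) for _ in range(N)]
    est = statistics.fmean(sample)
    if est > CRIT:
        selected.append((sample, est))

naive = statistics.fmean([e for _, e in selected])
print(f"true effect {TRUE_EFFECT}, naive estimate among significant studies {naive:.3f}")

# Bootstrap-style correction: resample each selected study, reapply the same
# selection rule, and subtract the resulting selection bias from the estimate.
corrected = []
for sample, est in selected:
    boots = []
    for _ in range(200):
        m = statistics.fmean(random.choices(sample, k=N))
        if m > CRIT:
            boots.append(m)
    if boots:
        bias = statistics.fmean(boots) - est
        corrected.append(est - bias)

print(f"bootstrap-corrected estimate {statistics.fmean(corrected):.3f}")
```

Conditioning on significance inflates the naive estimate well above the true effect; the bootstrap step estimates that inflation study by study and pulls the corrected estimates back toward it.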

  18. Foundations and applications of statistics an introduction using R

    CERN Document Server

    Pruim, Randall

    2011-01-01

    Foundations and Applications of Statistics simultaneously emphasizes both the foundational and the computational aspects of modern statistics. Engaging and accessible, this book is useful to undergraduate students with a wide range of backgrounds and career goals. The exposition immediately begins with statistics, presenting concepts and results from probability along the way. Hypothesis testing is introduced very early, and the motivation for several probability distributions comes from p-value computations. Pruim develops the students' practical statistical reasoning through explicit example

  19. Statistics of extremes theory and applications

    CERN Document Server

    Beirlant, Jan; Segers, Johan; Teugels, Jozef; De Waal, Daniel; Ferro, Chris

    2006-01-01

    Research in the statistical analysis of extreme values has flourished over the past decade: new probability models, inference and data analysis techniques have been introduced; and new application areas have been explored. Statistics of Extremes comprehensively covers a wide range of models and application areas, including risk and insurance: a major area of interest and relevance to extreme value theory. Case studies are introduced providing a good balance of theory and application of each model discussed, incorporating many illustrated examples and plots of data. The last part of the book covers some interesting advanced topics, including  time series, regression, multivariate and Bayesian modelling of extremes, the use of which has huge potential.  

  20. Can Strategies Facilitate Learning from Illustrated Science Texts?

    Science.gov (United States)

    Iding, Marie K.

    2000-01-01

    Examines the effectiveness of schema training in illustration types and text-illustration relations for learning from college level physiology texts and discusses findings that are consistent with prior research on learning from illustrated materials and with dual coding theory. Considers future directions for strategy training research and…

  1. Exploring Foundation Concepts in Introductory Statistics Using Dynamic Data Points

    Science.gov (United States)

    Ekol, George

    2015-01-01

    This paper analyses introductory statistics students' verbal and gestural expressions as they interacted with a dynamic sketch (DS) designed using "Sketchpad" software. The DS involved numeric data points built on the number line whose values changed as the points were dragged along the number line. The study is framed on aggregate…

  2. Some fundamental technical concepts about cost based transmission pricing

    International Nuclear Information System (INIS)

    Shirmohammadi, D.; Filho, X.V.; Gorenstin, B.; Pereira, M.V.P.

    1996-01-01

In this paper the authors describe the basic technical concepts involved in developing cost-based transmission prices. They introduce the concepts of transmission pricing paradigms and methodologies to better illustrate how transmission costs are transformed into transmission prices. The authors also briefly discuss the role of these paradigms and methodologies in promoting "economic efficiency," which is narrowly defined in this paper. They conclude the paper with an example of the application of some of these paradigms and methodologies for pricing transmission services in Brazil

  3. Practical Statistics for LHC Physicists: Bayesian Inference (3/3)

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    These lectures cover those principles and practices of statistics that are most relevant for work at the LHC. The first lecture discusses the basic ideas of descriptive statistics, probability and likelihood. The second lecture covers the key ideas in the frequentist approach, including confidence limits, profile likelihoods, p-values, and hypothesis testing. The third lecture covers inference in the Bayesian approach. Throughout, real-world examples will be used to illustrate the practical application of the ideas. No previous knowledge is assumed.

  4. Practical Statistics for LHC Physicists: Frequentist Inference (2/3)

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    These lectures cover those principles and practices of statistics that are most relevant for work at the LHC. The first lecture discusses the basic ideas of descriptive statistics, probability and likelihood. The second lecture covers the key ideas in the frequentist approach, including confidence limits, profile likelihoods, p-values, and hypothesis testing. The third lecture covers inference in the Bayesian approach. Throughout, real-world examples will be used to illustrate the practical application of the ideas. No previous knowledge is assumed.

  5. A Simple Statistical Thermodynamics Experiment

    Science.gov (United States)

    LoPresto, Michael C.

    2010-01-01

    Comparing the predicted and actual rolls of combinations of both two and three dice can help to introduce many of the basic concepts of statistical thermodynamics, including multiplicity, probability, microstates, and macrostates, and demonstrate that entropy is indeed a measure of randomness, that disordered states (those of higher entropy) are…
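The dice experiment described in this abstract is easy to reproduce computationally. The enumeration below is standard combinatorics offered as an illustrative sketch, not the author's code: each ordered roll is a microstate, each sum a macrostate, and the most probable macrostate is the one with the highest multiplicity.

```python
from itertools import product
from collections import Counter

def multiplicity(n_dice):
    """Count microstates (ordered rolls) for each macrostate (the sum)."""
    return Counter(sum(roll) for roll in product(range(1, 7), repeat=n_dice))

for n in (2, 3):
    counts = multiplicity(n)
    total = 6 ** n  # total number of equally likely microstates
    best = max(counts, key=counts.get)
    print(f"{n} dice: most probable sum {best}, "
          f"probability {counts[best]}/{total} = {counts[best] / total:.3f}")
```

With two dice the sum 7 has multiplicity 6 of 36 microstates; spreading probability across macrostates in proportion to multiplicity is exactly the intuition behind entropy as a measure of randomness.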

  6. Using Guided Reinvention to Develop Teachers' Understanding of Hypothesis Testing Concepts

    Science.gov (United States)

    Dolor, Jason; Noll, Jennifer

    2015-01-01

Statistics education reform efforts emphasize the importance of informal inference in the learning of statistics. Research suggests that statistics teachers experience difficulties understanding statistical inference concepts similar to those of students, and that teacher knowledge can impact student learning. This study investigates how teachers reinvented an…

  7. Explorations in Statistics: Standard Deviations and Standard Errors

    Science.gov (United States)

    Curran-Everett, Douglas

    2008-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This series in "Advances in Physiology Education" provides an opportunity to do just that: we will investigate basic concepts in statistics using the free software package R. Because this series uses R solely as a vehicle…

  8. Quantum physics and statistical physics. 5. ed.

    International Nuclear Information System (INIS)

    Alonso, Marcelo; Finn, Edward J.

    2012-01-01

Through its logical and uniform presentation, this recognized introduction to modern physics treats both experimental and theoretical aspects. The first part of the book deals with quantum mechanics and its applications to atoms, molecules, nuclei, solids, and elementary particles. Statistical physics, comprising classical statistics, thermodynamics, and quantum statistics, is the theme of the second part. Alonso and Finn avoid complicated mathematical developments; with numerous sketches and diagrams as well as many problems and examples, they quickly and easily familiarize the reader with the concepts of modern physics.

  9. Statistics and Data Interpretation for Social Work

    CERN Document Server

    Rosenthal, James

    2011-01-01

"Without question, this text will be the most authoritative source of information on statistics in the human services. From my point of view, it is a definitive work that combines a rigorous pedagogy with a down to earth (commonsense) exploration of the complex and difficult issues in data analysis (statistics) and interpretation. I welcome its publication." -Praise for the First Edition. Written by a social worker for social work students, this is a nuts-and-bolts guide to statistics that presents complex calculations and concepts in clear, easy-to-understand language. It includes

  10. A Career in Statistics Beyond the Numbers

    CERN Document Server

    Hahn, Gerald J

    2012-01-01

    A valuable guide to a successful career as a statistician A Career in Statistics: Beyond the Numbers prepares readers for careers in statistics by emphasizing essential concepts and practices beyond the technical tools provided in standard courses and texts. This insider's guide from internationally recognized applied statisticians helps readers decide whether a career in statistics is right for them, provides hands-on guidance on how to prepare for such a career, and shows how to succeed on the job. The book provides non-technical guidance for a successful career. The authors' extensive indu

  11. Advancing Uncertainty: Untangling and Discerning Related Concepts

    Directory of Open Access Journals (Sweden)

    Janice Penrod

    2002-12-01

Methods of advancing concepts within the qualitative paradigm have been developed and articulated. In this section, I describe methodological perspectives of a project designed to advance the concept of uncertainty using multiple qualitative methods. Through a series of earlier studies, the concept of uncertainty arose repeatedly in varied contexts, working its way into prominence and warranting further investigation. Processes of advanced concept analysis were used to initiate the formal investigation into the meaning of the concept. Through concept analysis, the concept was deconstructed to identify conceptual components and gaps in understanding. Using this skeletal framework of the concept identified through concept analysis, subsequent studies were carried out to add 'flesh' to the concept. First, a concept refinement using the literature as data was completed. Findings revealed that the current state of the concept of uncertainty failed to incorporate what was known of the lived experience. Therefore, using interview techniques as the primary data source, a phenomenological study of uncertainty among caregivers was conducted. Incorporating the findings of the phenomenology, the skeletal framework of the concept was further fleshed out using techniques of concept correction to produce a more mature conceptualization of uncertainty. In this section, I describe the flow of this qualitative project investigating the concept of uncertainty, with special emphasis on a particular threat to validity (called conceptual tunnel vision) that was identified and addressed during the phases of concept correction. Though in this article I employ a study of uncertainty for illustration, limited substantive findings regarding uncertainty are presented to retain a clear focus on the methodological issues.

  12. Developments in statistical analysis in quantitative genetics

    DEFF Research Database (Denmark)

    Sorensen, Daniel

    2009-01-01

A remarkable research impetus has taken place in statistical genetics since the last World Conference. This has been stimulated by breakthroughs in molecular genetics, automated data-recording devices and computer-intensive statistical methods. The latter were revolutionized by the bootstrap and by Markov chain Monte Carlo (McMC). In this overview a number of specific areas are chosen to illustrate the enormous flexibility that McMC has provided for fitting models and exploring features of data that were previously inaccessible. The selected areas are inferences of the trajectories over time of genetic means and variances, models for the analysis of categorical and count data, the statistical genetics of a model postulating that environmental variance is partly under genetic control, and a short discussion of models that incorporate massive genetic marker information.

  13. Statistical network analysis for analyzing policy networks

    DEFF Research Database (Denmark)

    Robins, Garry; Lewis, Jenny; Wang, Peng

    2012-01-01

To analyze social network data using standard statistical approaches is to risk incorrect inference. The dependencies among observations implied in a network conceptualization undermine standard assumptions of the usual general linear models. One of the most quickly expanding areas of social and policy network methodology is the development of statistical modeling approaches that can accommodate such dependent data. In this article, we review three network statistical methods commonly used in the current literature: quadratic assignment procedures, exponential random graph models (ERGMs), and stochastic actor-oriented models. We focus most attention on ERGMs by providing an illustrative example of a model for a strategic information network within a local government. We draw inferences about the structural role played by individuals recognized as key innovators and conclude that such an approach

  14. Radiology illustrated. Hepatobiliary and pancreatic radiology

    International Nuclear Information System (INIS)

    Choi, Byung Ihn

    2014-01-01

    Clear, practical guide to the diagnostic imaging of diseases of the liver, biliary tree, gallbladder, pancreas, and spleen. A wealth of carefully selected and categorized illustrations. Highlighted key points to facilitate rapid review. Aid to differential diagnosis. Radiology Illustrated: Hepatobiliary and Pancreatic Radiology is the first of two volumes that will serve as a clear, practical guide to the diagnostic imaging of abdominal diseases. This volume, devoted to diseases of the liver, biliary tree, gallbladder, pancreas, and spleen, covers congenital disorders, vascular diseases, benign and malignant tumors, and infectious conditions. Liver transplantation, evaluation of the therapeutic response of hepatocellular carcinoma, trauma, and post-treatment complications are also addressed. The book presents approximately 560 cases with more than 2100 carefully selected and categorized illustrations, along with key text messages and tables, that will allow the reader easily to recall the relevant images as an aid to differential diagnosis. At the end of each text message, key points are summarized to facilitate rapid review and learning. In addition, brief descriptions of each clinical problem are provided, followed by both common and uncommon case studies that illustrate the role of different imaging modalities, such as ultrasound, radiography, CT, and MRI.

  15. Radiology illustrated. Hepatobiliary and pancreatic radiology

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Byung Ihn (ed.) [Seoul National Univ. Hospital (Korea, Republic of). Dept. of Radiology]

    2014-04-01

    Clear, practical guide to the diagnostic imaging of diseases of the liver, biliary tree, gallbladder, pancreas, and spleen. A wealth of carefully selected and categorized illustrations. Highlighted key points to facilitate rapid review. Aid to differential diagnosis. Radiology Illustrated: Hepatobiliary and Pancreatic Radiology is the first of two volumes that will serve as a clear, practical guide to the diagnostic imaging of abdominal diseases. This volume, devoted to diseases of the liver, biliary tree, gallbladder, pancreas, and spleen, covers congenital disorders, vascular diseases, benign and malignant tumors, and infectious conditions. Liver transplantation, evaluation of the therapeutic response of hepatocellular carcinoma, trauma, and post-treatment complications are also addressed. The book presents approximately 560 cases with more than 2100 carefully selected and categorized illustrations, along with key text messages and tables, that will allow the reader easily to recall the relevant images as an aid to differential diagnosis. At the end of each text message, key points are summarized to facilitate rapid review and learning. In addition, brief descriptions of each clinical problem are provided, followed by both common and uncommon case studies that illustrate the role of different imaging modalities, such as ultrasound, radiography, CT, and MRI.

  16. Alienation: A Concept for Understanding Low-Income, Urban Clients

    Science.gov (United States)

    Holcomb-McCoy, Cheryl

    2004-01-01

    The author examines the concept of alienation and how it can be used to understand low-income, urban clients. A description is presented of 4 dimensions of alienation: powerlessness, meaninglessness, normlessness, and social isolation. Case illustrations are provided, and recommendations are made for counseling alienated clients. This article…

  17. Textile Art as Illustration.

    Science.gov (United States)

    Dickman, Floyd C.

    1998-01-01

    Used in picture-book illustration, such techniques as embroidery, applique, wood-block printing, batik, and quilting reflect cultural heritages and add richly textured images to this annotated list of titles for children from preschool through junior high. (Author)

  18. A Concept Analysis of Attitude toward Getting Vaccinated against Human Papillomavirus

    Science.gov (United States)

    Ratanasiripong, Nop T.; Chai, Kathleen T.

    2013-01-01

    In the research literature, the concept of attitude has been used and presented widely. However, attitude has been inconsistently defined and measured in various terms. This paper presents a concept analysis, using the Wilsonian methods modified by Walker and Avant (2004), to define and clarify the concept of attitude in order to provide an operationalized definition for a research study on attitudes toward a behavior: getting vaccinated against HPV. While the finding is not conclusive, three attributes of attitude are described: belief, affection, and evaluation. A theoretical definition and sample cases are constructed to illustrate the concept further. Antecedents, consequences, and empirical referents are discussed. Recommendations regarding the use of the concept of attitude in research, nursing practice, and nursing education are also made. PMID:23781335

  19. The use of transparent media in medical illustration.

    Science.gov (United States)

    Winn, W M

    1978-03-01

    Transparent media, such as watercolor, acrylic, and dyes, have been used by scientific illustrators for centuries. This article gives one artist's view of this highly complex medium. Materials, techniques, short cuts, and potential problem areas are discussed. Illustrations of specific techniques and a step-by-step development of a medical illustration are provided.

  20. On a curvature-statistics theorem

    International Nuclear Information System (INIS)

    Calixto, M; Aldaya, V

    2008-01-01

    The spin-statistics theorem in quantum field theory relates the spin of a particle to the statistics obeyed by that particle. Here we investigate an interesting correspondence or connection between curvature (κ = ±1) and quantum statistics (Fermi-Dirac and Bose-Einstein, respectively). The interrelation between both concepts is established through vacuum coherent configurations of zero modes in quantum field theory on the compact O(3) and noncompact O(2; 1) (spatial) isometry subgroups of de Sitter and Anti de Sitter spaces, respectively. The high frequency limit is retrieved as a (zero curvature) group contraction to the Newton-Hooke (harmonic oscillator) group. We also make some comments on the physical significance of the vacuum energy density and the cosmological constant problem.

  1. On a curvature-statistics theorem

    Energy Technology Data Exchange (ETDEWEB)

    Calixto, M [Departamento de Matematica Aplicada y Estadistica, Universidad Politecnica de Cartagena, Paseo Alfonso XIII 56, 30203 Cartagena (Spain); Aldaya, V [Instituto de Astrofisica de Andalucia, Apartado Postal 3004, 18080 Granada (Spain)], E-mail: Manuel.Calixto@upct.es

    2008-08-15

    The spin-statistics theorem in quantum field theory relates the spin of a particle to the statistics obeyed by that particle. Here we investigate an interesting correspondence or connection between curvature (κ = ±1) and quantum statistics (Fermi-Dirac and Bose-Einstein, respectively). The interrelation between both concepts is established through vacuum coherent configurations of zero modes in quantum field theory on the compact O(3) and noncompact O(2; 1) (spatial) isometry subgroups of de Sitter and Anti de Sitter spaces, respectively. The high frequency limit is retrieved as a (zero curvature) group contraction to the Newton-Hooke (harmonic oscillator) group. We also make some comments on the physical significance of the vacuum energy density and the cosmological constant problem.

  2. Sensitivity analysis and optimization of system dynamics models : Regression analysis and statistical design of experiments

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This tutorial discusses what-if analysis and optimization of System Dynamics models. These problems are solved using the statistical techniques of regression analysis and design of experiments (DOE). These issues are illustrated by applying the statistical techniques to a System Dynamics model for

  3. A statistical campaign: Florence Nightingale and Harriet Martineau’s 'England and her Soldiers'

    Directory of Open Access Journals (Sweden)

    Iris Veysey

    2016-03-01

    Full Text Available This essay is an account of the making of England and her Soldiers (1859) by Harriet Martineau and Florence Nightingale. The book is a literary account of the Crimean War, written by Martineau and based on Nightingale’s statistical studies of mortality during the conflict. Nightingale was passionate about statistics and healthcare. Whilst working as a nurse in the Crimea, she witnessed thousands of soldiers die of infectious diseases that might have been prevented with proper sanitation. After the war, she launched a campaign to convince the British government to make permanent reforms to military healthcare, compiling a dataset on mortality in the Crimea. She worked with the government’s Royal Commission investigating healthcare during the war, but also worked privately with Martineau to publicise her findings. Martineau and Nightingale grasped that the lay reader was more receptive to statistical information in a literary format than in dense statistical reports. As such, Nightingale’s data was interwoven with Martineau’s text. The pair illustrated their book with Nightingale’s ‘Rose Diagram’, a statistical graphic which simply illustrated the rate of mortality.

  4. Preparing Colorful Astronomical Images and Illustrations

    Science.gov (United States)

    Levay, Z. G.; Frattare, L. M.

    2001-12-01

    We present techniques for using mainstream graphics software, specifically Adobe Photoshop and Illustrator, for producing composite color images and illustrations from astronomical data. These techniques have been used with numerous images from the Hubble Space Telescope to produce printed and web-based news, education and public presentation products as well as illustrations for technical publication. While Photoshop is not intended for quantitative analysis of full dynamic range data (as are IRAF or IDL, for example), we have had much success applying Photoshop's numerous, versatile tools to work with scaled images, masks, text and graphics in multiple semi-transparent layers and channels. These features, along with its user-oriented, visual interface, provide convenient tools to produce high-quality, full-color images and graphics for printed and on-line publication and presentation.
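The layering-and-scaling workflow described above can be approximated outside Photoshop. The following Python sketch (not the authors' actual pipeline; the synthetic arrays merely stand in for three telescope exposures) percentile-stretches three grayscale bands and stacks them into an RGB composite:

```python
import numpy as np

def stretch(band, lo=0.5, hi=99.5):
    """Percentile-clip a band and rescale it to the [0, 1] display range."""
    vmin, vmax = np.percentile(band, [lo, hi])
    return np.clip((band - vmin) / (vmax - vmin), 0.0, 1.0)

def compose_rgb(red, green, blue):
    """Stack three scaled exposures into an RGB composite image."""
    return np.dstack([stretch(red), stretch(green), stretch(blue)])

# Synthetic stand-ins for three narrow-band exposures.
rng = np.random.default_rng(0)
bands = [rng.gamma(2.0, 1.0, size=(64, 64)) for _ in range(3)]
rgb = compose_rgb(*bands)
print(rgb.shape)  # → (64, 64, 3)
```

The percentile clip plays the role of Photoshop's levels adjustment; a real workflow would also apply per-band color assignment and nonlinear stretches.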

  5. Statistical Process Control in the Practice of Program Evaluation.

    Science.gov (United States)

    Posavac, Emil J.

    1995-01-01

    A technique developed to monitor the quality of manufactured products, statistical process control (SPC), incorporates several features that may prove attractive to evaluators. This paper reviews the history of SPC, suggests how the approach can enrich program evaluation, and illustrates its use in a hospital-based example. (SLD)
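For readers unfamiliar with SPC mechanics, a minimal individuals-chart sketch looks like this (the wait-time data are hypothetical, not from the paper; sigma is estimated from the mean moving range, as is conventional for individuals charts):

```python
import statistics

def individuals_limits(x, k=3.0):
    """Shewhart individuals chart: center line and k-sigma control limits,
    with sigma estimated from the mean moving range."""
    center = statistics.fmean(x)
    moving_ranges = [abs(b - a) for a, b in zip(x, x[1:])]
    sigma = statistics.fmean(moving_ranges) / 1.128  # d2 constant for n = 2
    return center - k * sigma, center, center + k * sigma

def out_of_control(x, k=3.0):
    """Indices of observations falling outside the control limits."""
    lcl, _, ucl = individuals_limits(x, k)
    return [i for i, v in enumerate(x) if v < lcl or v > ucl]

# Hypothetical weekly processing times for a hospital program; week 9 spikes.
waits = [31, 29, 30, 32, 28, 30, 31, 29, 30, 55]
print(out_of_control(waits))  # → [9]
```

An evaluator would investigate flagged points as potential "special causes" rather than ordinary process variation.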

  6. The Metropolis-Hastings algorithm, a handy tool for the practice of environmental model estimation : illustration with biochemical oxygen demand data

    Directory of Open Access Journals (Sweden)

    Franck Torre

    2001-02-01

    Full Text Available Environmental scientists often face situations where: (i) stimulus-response relationships are non-linear; (ii) data are rare or imprecise; (iii) facts are uncertain and stimulus-response relationships are questionable. In this paper, we focus on the first two points. A powerful and easy-to-use statistical method, the Metropolis-Hastings algorithm, allows the quantification of the uncertainty attached to any model response. This stochastic simulation technique is able to reproduce the statistical joint distribution of the whole parameter set of any model. The Metropolis-Hastings algorithm is described and illustrated on a typical environmental model: the biochemical oxygen demand (BOD). The aim is to provide a helpful guideline for further, and ultimately more complex, models. As a first illustration, the MH method is also applied to a simple regression example to demonstrate to the practitioner the ability of the algorithm to produce valid results.
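As a minimal illustration of the algorithm itself (a toy one-parameter target rather than the BOD model treated in the paper), a random-walk Metropolis-Hastings sampler fits in a few lines:

```python
import math
import random

def metropolis_hastings(log_target, x0, n_steps, step=0.5, seed=42):
    """Random-walk Metropolis-Hastings for a 1-D unnormalized log density."""
    rng = random.Random(seed)
    x, chain = x0, []
    for _ in range(n_steps):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, target(proposal) / target(x)).
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        chain.append(x)
    return chain

# Toy target: a standard normal "posterior", known only up to a constant.
chain = metropolis_hastings(lambda t: -0.5 * t * t, x0=0.0, n_steps=20000)
burned = chain[1000:]  # discard burn-in
mean = sum(burned) / len(burned)
var = sum((c - mean) ** 2 for c in burned) / len(burned)
print(round(mean, 2), round(var, 2))
```

The retained chain reproduces the target's mean (near 0) and variance (near 1); for a real model, `log_target` would be the log posterior of the model parameters given the data.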

  7. A proposal for the measurement of graphical statistics effectiveness: Does it enhance or interfere with statistical reasoning?

    International Nuclear Information System (INIS)

    Agus, M; Penna, M P; Peró-Cebollero, M; Guàrdia-Olmos, J

    2015-01-01

    Numerous studies have examined students' difficulties in understanding some notions related to statistical problems. Some authors observed that the presentation of distinct visual representations could increase statistical reasoning, supporting the principle of graphical facilitation. Other researchers disagree with this viewpoint, emphasising the impediments related to the use of illustrations that could overload the cognitive system with extraneous information. In this work we aim at comparing probabilistic statistical reasoning across two formats of problem presentation: graphical and verbal-numerical. We conceived and presented five pairs of homologous simple problems in verbal-numerical and graphical formats to 311 undergraduate Psychology students (n=156 in Italy and n=155 in Spain) without statistical expertise. The purpose of our work was to evaluate the effect of graphical facilitation in probabilistic statistical reasoning. Each undergraduate solved every pair of problems in both formats, in different presentation orders and sequences. The analyses highlight that the effect of graphical facilitation is infrequent in psychology undergraduates. This effect is related to many factors (such as knowledge, abilities, attitudes, and anxiety); moreover, it might be considered the result of an interaction between individual and task characteristics

  8. Collective Translation and Illustration: aesthetic film creation from Jane Austen

    Directory of Open Access Journals (Sweden)

    Lemuel da Cruz Gandara

    2015-08-01

    Full Text Available http://dx.doi.org/10.5007/2175-7917.2015v20n2p67 From metonymic clippings of the novels Sense and Sensibility (1811), Pride and Prejudice (1813), Emma (1815), and Persuasion (1817), all written by the English author Jane Austen (1775-1817), we propose a comparative analysis among the literary texts, the illustrations by Hugh Thomson at the end of the nineteenth century, and the film adaptations created between 1995 and 2005. Our goal is to compare these passages in order to find dialogic similarities and differences between them. This perspective reveals the interpretations made by many readers over the years and the receptions of the English writer in different contexts and media. Our critical and theoretical reflection is based on the studies of the Russian philosopher Mikhail Bakhtin on the aesthetics of literary creation and amplifies the concepts of literary cinema, collective translation, and aesthetics of film creation.

  9. Statistical Estimation of Heterogeneities: A New Frontier in Well Testing

    Science.gov (United States)

    Neuman, S. P.; Guadagnini, A.; Illman, W. A.; Riva, M.; Vesselinov, V. V.

    2001-12-01

    Well-testing methods have traditionally relied on analytical solutions of groundwater flow equations in relatively simple domains, consisting of one or at most a few units having uniform hydraulic properties. Recently, attention has been shifting toward methods and solutions that would allow one to characterize subsurface heterogeneities in greater detail. On one hand, geostatistical inverse methods are being used to assess the spatial variability of parameters, such as permeability and porosity, on the basis of multiple cross-hole pressure interference tests. On the other hand, analytical solutions are being developed to describe the mean and variance (first and second statistical moments) of flow to a well in a randomly heterogeneous medium. Geostatistical inverse interpretation of cross-hole tests yields a smoothed but detailed "tomographic" image of how parameters actually vary in three-dimensional space, together with corresponding measures of estimation uncertainty. Moment solutions may soon allow one to interpret well tests in terms of statistical parameters such as the mean and variance of log permeability, its spatial autocorrelation and statistical anisotropy. The idea of geostatistical cross-hole tomography is illustrated through pneumatic injection tests conducted in unsaturated fractured tuff at the Apache Leap Research Site near Superior, Arizona. The idea of using moment equations to interpret well-tests statistically is illustrated through a recently developed three-dimensional solution for steady state flow to a well in a bounded, randomly heterogeneous, statistically anisotropic aquifer.

  10. A Microsoft® Excel Simulation Illustrating the Central Limit Theorem's Appropriateness for Comparing the Difference between the Means of Any Two Populations

    Science.gov (United States)

    Moen, David H.; Powell, John E.

    2008-01-01

    Using Microsoft® Excel, several interactive, computerized learning modules are developed to illustrate the Central Limit Theorem's appropriateness for comparing the difference between the means of any two populations. These modules are used in the classroom to enhance the comprehension of this theorem as well as the concepts that provide the…
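The same demonstration is easy to reproduce outside Excel. A Python sketch (the skewed synthetic populations are chosen here for illustration) that builds the sampling distribution of the difference between two sample means:

```python
import random
import statistics

def diff_of_means_distribution(pop_a, pop_b, n, reps, seed=1):
    """Sampling distribution of (mean_a - mean_b) via repeated resampling."""
    rng = random.Random(seed)
    diffs = []
    for _ in range(reps):
        a = [rng.choice(pop_a) for _ in range(n)]
        b = [rng.choice(pop_b) for _ in range(n)]
        diffs.append(statistics.fmean(a) - statistics.fmean(b))
    return diffs

# Two deliberately skewed (exponential) populations, so any normality of the
# differences comes from the Central Limit Theorem, not the populations.
_rng = random.Random(0)
pop_a = [_rng.expovariate(1.0) for _ in range(10000)]  # population mean ~ 1
pop_b = [_rng.expovariate(0.5) for _ in range(10000)]  # population mean ~ 2
diffs = diff_of_means_distribution(pop_a, pop_b, n=50, reps=2000)
print(round(statistics.fmean(diffs), 1))  # close to 1 - 2 = -1
```

Plotting a histogram of `diffs` shows the familiar bell shape centered on the true difference, even though both parent populations are strongly skewed.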

  11. The use of statistical models in heavy-ion reactions studies

    International Nuclear Information System (INIS)

    Stokstad, R.G.

    1984-01-01

    This chapter reviews the use of statistical models to describe nuclear level densities and the decay of equilibrated nuclei. The statistical models of nuclear structure and nuclear reactions presented here have wide application in the analysis of heavy-ion reaction data. Applications are illustrated with examples of gamma-ray decay, the emission of light particles and heavier clusters of nucleons, and fission. In addition to the compound nucleus, the treatment of equilibrated fragments formed in binary reactions is discussed. The statistical model is shown to be an important tool for the identification of products from nonequilibrium decay

  12. Improvement of Statistical Decisions under Parametric Uncertainty

    Science.gov (United States)

    Nechval, Nicholas A.; Nechval, Konstantin N.; Purgailis, Maris; Berzins, Gundars; Rozevskis, Uldis

    2011-10-01

    A large number of problems in production planning and scheduling, location, transportation, finance, and engineering design require that decisions be made in the presence of uncertainty. Decision-making under uncertainty is a central problem in statistical inference, and has been formally studied in virtually all approaches to inference. The aim of the present paper is to show how the invariant embedding technique, the idea of which belongs to the authors, may be employed in the particular case of finding the improved statistical decisions under parametric uncertainty. This technique represents a simple and computationally attractive statistical method based on the constructive use of the invariance principle in mathematical statistics. Unlike the Bayesian approach, an invariant embedding technique is independent of the choice of priors. It allows one to eliminate unknown parameters from the problem and to find the best invariant decision rule, which has smaller risk than any of the well-known decision rules. To illustrate the proposed technique, application examples are given.

  13. The Euclid Statistical Matrix Tool

    Directory of Open Access Journals (Sweden)

    Curtis Tilves

    2017-06-01

    Full Text Available Stataphobia, a term used to describe the fear of statistics and research methods, can result from a lack of proper training in statistical methods. Poor statistical methods training can have an effect on health policy decision making and may play a role in the low research productivity seen in developing countries. One way to reduce Stataphobia is to intervene in the teaching of statistics in the classroom; however, such an intervention must tackle several obstacles, including student interest in the material, multiple ways of learning materials, and language barriers. We present here the Euclid Statistical Matrix, a tool for combatting Stataphobia on a global scale. This free tool comprises popular statistical YouTube channels and web sources that teach and demonstrate statistical concepts in a variety of presentation methods. Working with international teams in Iran, Japan, Egypt, Russia, and the United States, we have also developed the Statistical Matrix in multiple languages to address language barriers to learning statistics. By utilizing already-established large networks, we are able to disseminate our tool to thousands of Farsi-speaking university faculty and students in Iran and the United States. Future dissemination of the Euclid Statistical Matrix throughout Central Asia, with support from local universities, may help to combat low research productivity in this region.

  14. An Applied Statistics Course for Systematics and Ecology PhD Students

    Science.gov (United States)

    Ojeda, Mario Miguel; Sosa, Victoria

    2002-01-01

    Statistics education is under review at all educational levels. Statistical concepts, as well as the use of statistical methods and techniques, can be taught in at least two contrasting ways. Specifically, (1) teaching can be theoretically and mathematically oriented, or (2) it can be less mathematically oriented being focused, instead, on…

  15. A concept for global optimization of topology design problems

    DEFF Research Database (Denmark)

    Stolpe, Mathias; Achtziger, Wolfgang; Kawamoto, Atsushi

    2006-01-01

    We present a concept for solving topology design problems to proven global optimality. We propose that the problems are modeled using the approach of simultaneous analysis and design with discrete design variables and solved with convergent branch and bound type methods. This concept is illustrated...... on two applications. The first application is the design of stiff truss structures where the bar areas are chosen from a finite set of available areas. The second considered application is simultaneous topology and geometry design of planar articulated mechanisms. For each application we outline...

  16. Flywheel-battery hybrid: a new concept for vehicle propulsion

    International Nuclear Information System (INIS)

    Anon.

    1976-01-01

    A new concept was examined for powering the automobile: a flywheel-battery hybrid that can be developed for near-term use from currently available lead-acid batteries and state-of-the-art flywheel designs. To illustrate the concept, a calculation is given of the range and performance of the hybrid power system in a typical commute vehicle, and the results are compared to the measured range and performance of an all-battery vehicle. This comparison shows improved performance and a twofold urban-range increase for the hybrid over the all-battery power system

  17. Phase Transitions in Combinatorial Optimization Problems Basics, Algorithms and Statistical Mechanics

    CERN Document Server

    Hartmann, Alexander K

    2005-01-01

    A concise, comprehensive introduction to the topic of statistical physics of combinatorial optimization, bringing together theoretical concepts and algorithms from computer science with analytical methods from physics. The result bridges the gap between statistical physics and combinatorial optimization, investigating problems taken from theoretical computing, such as the vertex-cover problem, with the concepts and methods of theoretical physics. The authors cover rapid developments and analytical methods that are both extremely complex and spread by word-of-mouth, providing all the necessary

  18. Statistical techniques for sampling and monitoring natural resources

    Science.gov (United States)

    Hans T. Schreuder; Richard Ernst; Hugo Ramirez-Maldonado

    2004-01-01

    We present the statistical theory of inventory and monitoring from a probabilistic point of view. We start with the basics and show the interrelationships between designs and estimators illustrating the methods with a small artificial population as well as with a mapped realistic population. For such applications, useful open source software is given in Appendix 4....

  19. An introduction to descriptive statistics: A review and practical guide

    International Nuclear Information System (INIS)

    Marshall, Gill; Jonker, Leon

    2010-01-01

    This paper, the first of two, demonstrates why it is necessary for radiographers to understand basic statistical concepts both to assimilate the work of others and also in their own research work. As the emphasis on evidence-based practice increases, it will become more pressing for radiographers to be able to dissect other people's research and to contribute to research themselves. The different types of data that one can come across are covered here, as well as different ways to describe data. Furthermore, the statistical terminology and methods that comprise descriptive statistics are explained, including levels of measurement, measures of central tendency (average) and dispersion (spread), and the concept of the normal distribution. This paper reviews relevant literature, provides a checklist of points to consider before progressing with the application of appropriate statistical methods to a data set, and provides a glossary of relevant terms for reference.
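As a quick worked illustration of the descriptive measures just listed (the data are hypothetical, not drawn from the paper):

```python
import statistics

# Hypothetical measurements, e.g. exposure readings from ten examinations.
readings = [2, 3, 3, 4, 5, 5, 5, 6, 7, 10]

summary = {
    "mean": statistics.fmean(readings),             # central tendency
    "median": statistics.median(readings),          # robust center
    "mode": statistics.mode(readings),              # most frequent value
    "stdev": round(statistics.stdev(readings), 2),  # dispersion
    "range": max(readings) - min(readings),         # crude spread
}
print(summary)
# → {'mean': 5.0, 'median': 5.0, 'mode': 5, 'stdev': 2.31, 'range': 8}
```

Note how the single large value (10) pulls the range and standard deviation up while leaving the median untouched, which is exactly the trade-off between these measures that the paper discusses.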

  20. Method for statistical data analysis of multivariate observations

    CERN Document Server

    Gnanadesikan, R

    1997-01-01

    A practical guide for multivariate statistical techniques-- now updated and revised In recent years, innovations in computer technology and statistical methodologies have dramatically altered the landscape of multivariate data analysis. This new edition of Methods for Statistical Data Analysis of Multivariate Observations explores current multivariate concepts and techniques while retaining the same practical focus of its predecessor. It integrates methods and data-based interpretations relevant to multivariate analysis in a way that addresses real-world problems arising in many areas of inte

  1. Δim-lacunary statistical convergence of order α

    Science.gov (United States)

    Altınok, Hıfsı; Et, Mikail; Işık, Mahmut

    2018-01-01

    The purpose of this work is to introduce the concepts of Δim-lacunary statistical convergence of order α and lacunary strongly (Δim,p)-convergence of order α. We establish some connections between lacunary strongly (Δim,p)-convergence of order α and Δim-lacunary statistical convergence of order α. It is shown that if a sequence is lacunary strongly (Δim,p)-summable of order α, then it is Δim-lacunary statistically convergent of order α.
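For orientation, the underlying definition in its standard form (before the Δim difference operator is applied; the paper's variant replaces x_k by the generalized difference of x_k) can be sketched as:

```latex
% Lacunary statistical convergence of order \alpha (standard form).
% \theta = (k_r) is a lacunary sequence, I_r = (k_{r-1}, k_r],
% and h_r = k_r - k_{r-1} \to \infty.
x_k \longrightarrow L \ \bigl(S_\theta^\alpha\bigr) \iff
\lim_{r \to \infty} \frac{1}{h_r^{\alpha}}
\bigl| \{ k \in I_r : |x_k - L| \ge \varepsilon \} \bigr| = 0
\quad \text{for every } \varepsilon > 0 .
```

Taking α = 1 recovers ordinary lacunary statistical convergence; smaller α makes the condition stronger.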

  2. Testing the statistical compatibility of independent data sets

    International Nuclear Information System (INIS)

    Maltoni, M.; Schwetz, T.

    2003-01-01

    We discuss a goodness-of-fit method which tests the compatibility between statistically independent data sets. The method gives sensible results even in cases where the χ² minima of the individual data sets are very low or when several parameters are fitted to a large number of data points. In particular, it avoids the problem that a possible disagreement between data sets becomes diluted by data points which are insensitive to the crucial parameters. A formal derivation of the probability distribution function for the proposed test statistics is given, based on standard theorems of statistics. The application of the method is illustrated on data from neutrino oscillation experiments, and its complementarity to the standard goodness-of-fit is discussed
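The construction described (often called the parameter goodness-of-fit) compares the global χ² minimum with the sum of the individual minima. A sketch with made-up numbers, assuming two data sets sharing two parameters; the closed-form χ² survival function used below holds for even degrees of freedom:

```python
import math

def chi2_sf_even(x, dof):
    """Survival function P(X >= x) of a chi-square variable, even dof only."""
    assert dof % 2 == 0 and dof > 0
    term, total = 1.0, 1.0
    for k in range(1, dof // 2):
        term *= (x / 2) / k
        total += term
    return math.exp(-x / 2) * total

def parameter_goodness_of_fit(chi2_global_min, chi2_individual_mins, dof):
    """PG statistic: how much the joint fit worsens relative to separate fits."""
    pg = chi2_global_min - sum(chi2_individual_mins)
    return pg, chi2_sf_even(pg, dof)

# Hypothetical numbers: global minimum 12.3; separate-fit minima 4.1 and 3.0;
# two parameters shared between the data sets => 2 degrees of freedom.
pg, p = parameter_goodness_of_fit(12.3, [4.1, 3.0], dof=2)
print(round(pg, 1), round(p, 3))  # → 5.2 0.074
```

Because only the parameter-sensitive part of χ² enters, a disagreement between data sets is not diluted by the many data points that are insensitive to the shared parameters.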

  3. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
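To make the maximum-likelihood principle concrete, here is a toy example (a Bernoulli coin model, not an example from the chapter) that maximizes the log-likelihood over a grid:

```python
import math

def bernoulli_log_likelihood(p, heads, tails):
    """Log-likelihood of success probability p given the observed counts."""
    return heads * math.log(p) + tails * math.log(1 - p)

# Data: 7 heads in 10 flips; scan a fine grid for the maximizer.
heads, tails = 7, 3
grid = [i / 1000 for i in range(1, 1000)]
p_hat = max(grid, key=lambda p: bernoulli_log_likelihood(p, heads, tails))
print(p_hat)  # the analytic MLE is heads / (heads + tails) = 0.7
```

A Bayesian analysis of the same data would instead combine this likelihood with a prior over p and report the posterior distribution rather than a single point estimate.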

  4. Detection and validation of unscalable item score patterns using Item Response Theory: An illustration with Harter's Self-Perception Profile for Children

    NARCIS (Netherlands)

    Meijer, R.R.; Egberink, I.J.L.; Emons, Wilco H.M.; Sijtsma, Klaas

    2008-01-01

    We illustrate the usefulness of person-fit methodology for personality assessment. For this purpose, we use person-fit methods from item response theory. First, we give a nontechnical introduction to existing person-fit statistics. Second, we analyze data from Harter's (1985) Self-Perception Profile

  5. Illustration credits

    OpenAIRE

    2017-01-01

    Cover illustration: Lyon, Saint-Jean cathedral, abutment of flying buttress 2/3: David (13th century) emerging from the scaffolding (21st century). Photo Jean-Pierre Gobillot. Back-cover illustration: Lyon, Saint-Jean cathedral, exterior arcature of the apse, central capital. Drawing Ghislaine Macabéo. Photos Jean-Pierre Gobillot: fig. 1, 2, 10, 13, 14, 16, 23, 28, 29, 34, 35, 36, 37, 40, 41, 46, 54, 55, 56, 59, 60, 67, 76a, 78, 80, 82, 83, 88, 90, 97, 105, 106, 11...

  6. Robust inference from multiple test statistics via permutations: a better alternative to the single test statistic approach for randomized trials.

    Science.gov (United States)

    Ganju, Jitendra; Yu, Xinxin; Ma, Guoguang Julie

    2013-01-01

    Formal inference in randomized clinical trials is based on controlling the type I error rate associated with a single pre-specified statistic. The deficiency of using just one method of analysis is that it depends on assumptions that may not be met. For robust inference, we propose pre-specifying multiple test statistics and relying on the minimum p-value for testing the null hypothesis of no treatment effect. The null hypothesis associated with the various test statistics is that the treatment groups are indistinguishable. The critical value for hypothesis testing comes from permutation distributions. Rejection of the null hypothesis when the smallest p-value is less than the critical value controls the type I error rate at its designated value. Even if one of the candidate test statistics has low power, the adverse effect on the power of the minimum p-value statistic is small. Its use is illustrated with examples. We conclude that it is better to rely on the minimum p-value rather than a single statistic, particularly when that single statistic is the logrank test, because of the cost and complexity of many survival trials. Copyright © 2013 John Wiley & Sons, Ltd.
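A small sketch of the approach (illustrative data; mean and median differences stand in for the trial's pre-specified statistics): each statistic gets a permutation p-value, and the minimum p-value is then calibrated against its own permutation distribution.

```python
import random
import statistics

def perm_min_p(x, y, stats, n_perm=2000, seed=0):
    """Minimum p-value over several test statistics, calibrated by permutation."""
    rng = random.Random(seed)
    pooled = list(x) + list(y)
    n = len(x)

    def evaluate(a, b):
        return [s(a, b) for s in stats]

    observed = evaluate(x, y)
    perms = []
    for _ in range(n_perm):
        rng.shuffle(pooled)
        perms.append(evaluate(pooled[:n], pooled[n:]))

    def pval(j, value):
        # Two-sided: fraction of permutations at least as extreme.
        return sum(1 for row in perms if abs(row[j]) >= abs(value)) / n_perm

    p_obs = min(pval(j, v) for j, v in enumerate(observed))
    # Null distribution of the minimum p-value itself.
    p_null = [min(pval(j, row[j]) for j in range(len(stats))) for row in perms]
    p_adj = sum(1 for p in p_null if p <= p_obs) / n_perm
    return p_obs, p_adj

mean_diff = lambda a, b: statistics.fmean(a) - statistics.fmean(b)
median_diff = lambda a, b: statistics.median(a) - statistics.median(b)
x = [12.1, 9.8, 11.4, 10.9, 12.7, 11.8]  # hypothetical treatment arm
y = [9.0, 8.6, 10.2, 9.4, 8.9, 10.0]     # hypothetical control arm
p_min, p_adj = perm_min_p(x, y, [mean_diff, median_diff])
print(p_min <= p_adj <= 1.0)  # → True
```

The adjusted value `p_adj` is the one compared against the significance level; calibrating the minimum against its own permutation distribution is what keeps the type I error rate at its designated value despite using several statistics.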

  7. Line radiative transfer and statistical equilibrium*

    Directory of Open Access Journals (Sweden)

    Kamp Inga

    2015-01-01

    Full Text Available Atomic and molecular line emission from protoplanetary disks contains key information about their detailed physical and chemical structures. To unravel those structures, we need to understand line radiative transfer in dusty media and the statistical equilibrium, especially of molecules. I describe here the basic principles of statistical equilibrium and illustrate them through the two-level atom. In the second part, the fundamentals of line radiative transfer are introduced along with the various broadening mechanisms. I explain general solution methods with their drawbacks and also specific difficulties encountered in solving the line radiative transfer equation in disks (e.g. velocity gradients). I close with a few special cases of line emission from disks: radiative pumping, masers, and resonance scattering.
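The two-level atom mentioned above reduces to a single balance equation. A sketch, assuming radiative processes only (collisions neglected) and the usual Einstein-coefficient notation, with illustrative non-physical coefficients:

```python
def two_level_population_ratio(A_ul, B_lu, B_ul, J_bar):
    """Statistical equilibrium of a two-level atom, radiative terms only:
    n_l * B_lu * J = n_u * (A_ul + B_ul * J)  =>  n_u / n_l.
    """
    return B_lu * J_bar / (A_ul + B_ul * J_bar)

# Illustrative coefficients with B_lu / B_ul = g_u / g_l = 2.
weak = two_level_population_ratio(A_ul=1.0, B_lu=2.0, B_ul=1.0, J_bar=0.01)
strong = two_level_population_ratio(A_ul=1.0, B_lu=2.0, B_ul=1.0, J_bar=1e9)
print(round(weak, 3), round(strong, 3))  # → 0.02 2.0
```

In a weak radiation field the upper level is barely populated; in a very strong field the ratio saturates at B_lu/B_ul = g_u/g_l, the statistical-weight ratio. Real disk models add collisional terms and couple this balance to the radiative transfer equation.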

  8. Statistical convergence on intuitionistic fuzzy normed spaces

    International Nuclear Information System (INIS)

    Karakus, S.; Demirci, K.; Duman, O.

    2008-01-01

    Saadati and Park [Saadati R, Park JH, Chaos, Solitons and Fractals 2006;27:331-44] has recently introduced the notion of intuitionistic fuzzy normed space. In this paper, we study the concept of statistical convergence on intuitionistic fuzzy normed spaces. Then we give a useful characterization for statistically convergent sequences. Furthermore, we display an example such that our method of convergence is stronger than the usual convergence on intuitionistic fuzzy normed spaces
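For reference, the classical notion being generalized here is ordinary statistical convergence; the intuitionistic fuzzy version replaces the metric condition |x_k − L| ≥ ε with conditions on the norm's membership and non-membership functions:

```latex
% Ordinary statistical convergence: x = (x_k) converges statistically to L
% if the set of "bad" indices has natural density zero.
x_k \longrightarrow L \ (\mathrm{stat}) \iff
\lim_{n \to \infty} \frac{1}{n}
\bigl| \{ k \le n : |x_k - L| \ge \varepsilon \} \bigr| = 0
\quad \text{for every } \varepsilon > 0 .
```

Every convergent sequence is statistically convergent, but not conversely: a sequence may converge statistically while misbehaving on a density-zero set of indices.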

  9. A statistical study on consumer's perception of sustainable products

    Science.gov (United States)

    Pater, Liana; Izvercian, Monica; Ivaşcu, Larisa

    2017-07-01

    The concepts of sustainability and sustainable products are used often, but not always correctly. This statistical study of consumers' perception of sustainable products sought to identify the level of knowledge of the concepts of sustainability and sustainable products, the criteria guiding the buying decision, the intention to purchase a sustainable product, and the main sustainable products preferred by consumers.

  10. On Asymptotically Lacunary Statistical Equivalent Sequences of Order α in Probability

    Directory of Open Access Journals (Sweden)

    Işık Mahmut

    2017-01-01

    Full Text Available In this study, we introduce and examine the concepts of asymptotically lacunary statistical equivalence of order α in probability and strong asymptotically lacunary equivalence of order α in probability. We give some relations connected to these concepts.

  11. Feature Statistics Modulate the Activation of Meaning during Spoken Word Processing

    Science.gov (United States)

    Devereux, Barry J.; Taylor, Kirsten I.; Randall, Billi; Geertzen, Jeroen; Tyler, Lorraine K.

    2016-01-01

    Understanding spoken words involves a rapid mapping from speech to conceptual representations. One distributed feature-based conceptual account assumes that the statistical characteristics of concepts' features--the number of concepts they occur in ("distinctiveness/sharedness") and likelihood of co-occurrence ("correlational…

  12. Introductory statistics for the behavioral sciences

    CERN Document Server

    Welkowitz, Joan; Cohen, Jacob

    1971-01-01

    Introductory Statistics for the Behavioral Sciences provides an introduction to statistical concepts and principles. This book emphasizes the robustness of parametric procedures, wherein such significance tests as t and F yield accurate results even if assumptions such as equal population variances and normal population distributions are not well met. Organized into three parts encompassing 16 chapters, this book begins with an overview of the rationale upon which much of behavioral science research is based, namely, drawing inferences about a population based on data obtained from a sample.

  13. A course in mathematical statistics and large sample theory

    CERN Document Server

    Bhattacharya, Rabi; Patrangenaru, Victor

    2016-01-01

    This graduate-level textbook is primarily aimed at graduate students of statistics, mathematics, science, and engineering who have had an undergraduate course in statistics, an upper-division course in analysis, and some acquaintance with measure-theoretic probability. It provides a rigorous presentation of the core of mathematical statistics. Part I of this book constitutes a one-semester course on basic parametric mathematical statistics. Part II deals with the large-sample theory of statistics, parametric and nonparametric, and its contents may be covered in one semester as well. Part III provides brief accounts of a number of topics of current interest for practitioners and other disciplines whose work involves statistical methods. Features: large-sample theory with many worked examples, numerical calculations, and simulations to illustrate the theory; appendices providing ready access to a number of standard results, with many proofs; solutions given to a number of selected exercises from Part I; Part II exercises with ...

  14. Computing with concepts, computing with numbers: Llull, Leibniz, and Boole

    NARCIS (Netherlands)

    Uckelman, S.L.

    2010-01-01

    We consider two ways to understand "reasoning as computation", one which focuses on the computation of concept symbols and the other on the computation of number symbols. We illustrate these two ways with Llull’s Ars Combinatoria and Leibniz’s attempts to arithmetize language, respectively. We then

  15. Developing Preschool Teachers' Knowledge of Students' Number Conceptions

    Science.gov (United States)

    Tsamir, Pessia; Tirosh, Dina; Levenson, Esther; Tabach, Michal; Barkai, Ruthi

    2014-01-01

    This article describes a study that investigates preschool teachers' knowledge of their young students' number conceptions and the teachers' related self-efficacy beliefs. It also presents and illustrates elements of a professional development program designed explicitly to promote this knowledge among preschool teachers. Results…

  16. Customer Preference-Based Information Retrieval to Build Module Concepts

    Directory of Open Access Journals (Sweden)

    Dongxing Cao

    2013-01-01

    Full Text Available Preference can be viewed as an outward feeling toward a product and as a reflection of a person's inner thought. It dominates designers' decisions and affects our purchase intentions. In this paper, a model for eliciting preferences from customers is proposed to build module concepts. Firstly, the attributes of customer preference are classified in a hierarchy, and surveys are conducted to build customer preference concepts. Secondly, the documents or catalogs of design requirements, perhaps containing textual descriptions and geometric data, are normalized using semantic expressions. Semantic rules are developed to describe low-level features of customer preference and to construct a knowledge base of customer preference. Thirdly, designers' needs are used to map customer preference for generating module concepts. Finally, an empirical study of a stapler illustrates the validity of module concept generation.

  17. Measured, modeled, and causal conceptions of fitness

    Science.gov (United States)

    Abrams, Marshall

    2012-01-01

    This paper proposes partial answers to the following questions: in what senses can fitness differences plausibly be considered causes of evolution? What relationships are there between fitness concepts used in empirical research, modeling, and abstract theoretical proposals? How does the relevance of different fitness concepts depend on research questions and methodological constraints? The paper develops a novel taxonomy of fitness concepts, beginning with type fitness (a property of a genotype or phenotype), token fitness (a property of a particular individual), and purely mathematical fitness. Type fitness includes statistical type fitness, which can be measured from population data, and parametric type fitness, which is an underlying property estimated by statistical type fitnesses. Token fitness includes measurable token fitness, which can be measured on an individual, and tendential token fitness, which is assumed to be an underlying property of the individual in its environmental circumstances. Some of the paper's conclusions can be outlined as follows: claims that fitness differences do not cause evolution are reasonable when fitness is treated as statistical type fitness, measurable token fitness, or purely mathematical fitness. Some of the ways in which statistical methods are used in population genetics suggest that what natural selection involves are differences in parametric type fitnesses. Further, it's reasonable to think that differences in parametric type fitness can cause evolution. Tendential token fitnesses, however, are not themselves sufficient for natural selection. Though parametric type fitnesses are typically not directly measurable, they can be modeled with purely mathematical fitnesses and estimated by statistical type fitnesses, which in turn are defined in terms of measurable token fitnesses. The paper clarifies the ways in which fitnesses depend on pragmatic choices made by researchers. PMID:23112804

  18. Application of the transport system concept to the transport of LSA waste

    International Nuclear Information System (INIS)

    Lombard, J.; Appleton, P.; Libon, H.; Sannen, H.

    1994-01-01

    The aim of this presentation is to illustrate, using two examples, how a particular special arrangement can be envisaged for the transport of a well-defined category of waste according to the ''Transport System Concept''. (authors)

  19. A mathematical model for HIV and hepatitis C co-infection and its assessment from a statistical perspective.

    Science.gov (United States)

    Castro Sanchez, Amparo Yovanna; Aerts, Marc; Shkedy, Ziv; Vickerman, Peter; Faggiano, Fabrizio; Salamina, Guiseppe; Hens, Niel

    2013-03-01

    The hepatitis C virus (HCV) and the human immunodeficiency virus (HIV) are a clear threat for public health, with high prevalences especially in high risk groups such as injecting drug users. People with HIV infection who are also infected by HCV suffer from a more rapid progression to HCV-related liver disease and have an increased risk for cirrhosis and liver cancer. Quantifying the impact of HIV and HCV co-infection is therefore of great importance. We propose a new joint mathematical model accounting for co-infection with the two viruses in the context of injecting drug users (IDUs). Statistical concepts and methods are used to assess the model from a statistical perspective, in order to get further insights in: (i) the comparison and selection of optional model components, (ii) the unknown values of the numerous model parameters, (iii) the parameters to which the model is most 'sensitive' and (iv) the combinations or patterns of values in the high-dimensional parameter space which are most supported by the data. Data from a longitudinal study of heroin users in Italy are used to illustrate the application of the proposed joint model and its statistical assessment. The parameters associated with contact rates (sharing syringes) and the transmission rates per syringe-sharing event are shown to play a major role. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. Enhanced LOD Concepts for Virtual 3d City Models

    Science.gov (United States)

    Benner, J.; Geiger, A.; Gröger, G.; Häfele, K.-H.; Löwner, M.-O.

    2013-09-01

    Virtual 3D city models contain digital three-dimensional representations of city objects like buildings, streets or technical infrastructure. Because the size and complexity of these models continuously grow, a Level of Detail (LoD) concept is indispensable: one that effectively supports partitioning a complete model into alternative models of different complexity and provides metadata addressing the informational content, complexity and quality of each alternative model. After a short overview of various LoD concepts, this paper discusses the existing LoD concept of the CityGML standard for 3D city models and identifies a number of deficits. Based on this analysis, an alternative concept is developed and illustrated with several examples. It differentiates, first, between a Geometric Level of Detail (GLoD) and a Semantic Level of Detail (SLoD), and second, between the interior of a building and its exterior shell. Finally, a possible implementation of the new concept is demonstrated by means of a UML model.

  1. Statistical ensembles in quantum mechanics

    International Nuclear Information System (INIS)

    Blokhintsev, D.

    1976-01-01

    The interpretation of quantum mechanics presented in this paper is based on the concept of quantum ensembles. This concept differs essentially from the canonical one in that the interference of the observer with the state of a microscopic system is of no greater importance than in any other field of physics. Owing to this fact, the laws established by quantum mechanics are no less objective in character than the laws governing classical statistical mechanics. The paradoxical nature of some statements of quantum mechanics, which results from interpreting the wave function as the observer's notebook, greatly stimulated the development of the idea presented. (Auth.)

  2. Illustrating the use of concepts from the discipline of policy studies in energy research : An explorative literature review

    NARCIS (Netherlands)

    Hoppe, T.; Coenen, Frans; van den Berg, Maya

    2016-01-01

    With the increasing challenges the energy sector faces, energy policy strategies and instruments are becoming ever more relevant. The discipline of policy studies might offer relevant concepts to enrich multidisciplinary energy research. The main research question of this article is: How can

  4. R statistical application development by example : beginner's guide

    CERN Document Server

    Tattar, Narayanachart Prabhanjan

    2013-01-01

    Full of screenshots and examples, this Beginner's Guide by Example will teach you practically everything you need to know about R statistical application development from scratch. You will begin by learning the first concepts of statistics in R, which is vital in this fast-paced era, and it is also a bargain, as you do not need a preliminary course on the subject.

  5. Order statistics & inference estimation methods

    CERN Document Server

    Balakrishnan, N

    1991-01-01

    The literature on order statistics and inference is quite extensive and covers a large number of fields, but most of it is dispersed throughout numerous publications. This volume is a consolidation of the most important results and places an emphasis on estimation. Both theoretical and computational procedures are presented to meet the needs of researchers, professionals, and students. The methods of estimation discussed are well illustrated with numerous practical examples from both the physical and life sciences, including sociology, psychology, and electrical and chemical engineering.

  6. The Impact of an Interactive Statistics Module on Novices' Development of Scientific Process Skills and Attitudes in a First-Semester Research Foundations Course.

    Science.gov (United States)

    Marsan, Lynnsay A; D'Arcy, Christina E; Olimpo, Jeffrey T

    2016-12-01

    Evidence suggests that incorporating quantitative reasoning exercises into existent curricular frameworks within the science, technology, engineering, and mathematics (STEM) disciplines is essential for novices' development of conceptual understanding and process skills in these domains. Despite this being the case, such studies acknowledge that students often experience difficulty in applying mathematics in the context of scientific problems. To address this concern, the present study sought to explore the impact of active demonstrations and critical reading exercises on novices' comprehension of basic statistical concepts, including hypothesis testing, experimental design, and interpretation of research findings. Students first engaged in a highly interactive height activity that served to intuitively illustrate normal distribution, mean, standard deviation, and sample selection criteria. To enforce practical applications of standard deviation and p-value, student teams were subsequently assigned a figure from a peer-reviewed primary research article and instructed to evaluate the trustworthiness of the data. At the conclusion of this exercise, students presented their evaluations to the class for open discussion and commentary. Quantitative assessment of pre- and post-module survey data indicated a statistically significant increase both in students' scientific reasoning and process skills and in their self-reported confidence in understanding the statistical concepts presented in the module. Furthermore, data indicated that the majority of students (>85%) found the module both interesting and helpful in nature. Future studies will seek to develop additional, novel exercises within this area and to evaluate the impact of such modules across a variety of STEM and non-STEM contexts.
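    The height activity described above can be sketched in code: simulate a class of heights, compute the targeted summary statistics, and check the normal-distribution behavior students are meant to discover. The sample size and population parameters below are illustrative assumptions, not the study's data.

```python
# Simulated "height activity": mean, standard deviation, and the
# one-standard-deviation empirical rule for a normal sample.
import random
import statistics

random.seed(42)
heights = [random.gauss(170, 10) for _ in range(1000)]  # heights in cm

mean = statistics.mean(heights)
sd = statistics.stdev(heights)

# Empirical rule: roughly 68% of a normal sample lies within one sd of the mean.
within_one_sd = sum(mean - sd <= h <= mean + sd for h in heights) / len(heights)
print(round(mean, 1), round(sd, 1), within_one_sd)
```

    Repeating the simulation with different seeds or sample sizes mirrors the sample-selection discussion in the module: small classes produce noisier means and standard deviations.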

  8. Statistical Content in Middle Grades Mathematics Textbooks

    Science.gov (United States)

    Pickle, Maria Consuelo Capiral

    2012-01-01

    This study analyzed the treatment and scope of statistical concepts in four, widely-used, contemporary, middle grades mathematics textbook series: "Glencoe Math Connects," "Prentice Hall Mathematics," "Connected Mathematics Project," and "University of Chicago School Mathematics Project." There were three…

  9. Concept, measurement and use of acculturation in health and disease risk studies.

    Science.gov (United States)

    Chakraborty, Bandana M; Chakraborty, Ranajit

    2010-12-01

    Acculturation, a concept with its roots in social science and cultural anthropology, is a process intimately related to the health behavior and health status of minority populations in a multicultural society. This paper provides a brief review of acculturation as it relates to health research, showing that the concept has the potential to identify risk factors that underlie the increased prevalence of chronic diseases, particularly in immigrant populations. A proper understanding of this is helpful in designing intervention programs to reduce the burden of such diseases and to increase the quality of life in such populations. The concept is defined with an outline of its history, showing its evolution over time. Criteria for measuring acculturation are described to illustrate the need to accommodate its multidimensional features. Drawing examples from health research in US Hispanics, the role of acculturation in health behavior is discussed to document that the discordant findings are at least partially due either to the use of incomplete dimensions of the concept or to not accounting for the dynamic aspect of its process. Finally, illustrated with a finding from a study among overweight Mexican American women of South Texas, a model for acculturation studies is proposed that may be used in other immigrant populations undergoing the acculturation process.

  10. The Co-evolution of Concepts and Motivation.

    Science.gov (United States)

    Delton, Andrew W; Sell, Aaron

    2014-04-01

    Does the human mind contain evolved concepts? Many psychologists have doubted this or have investigated only a narrow set (e.g., object, number, cause). Does the human mind contain evolved motivational systems? Many more assent to this claim, holding that there are evolved motivational systems for, among other tasks, social affiliation, aggressive competition, and finding food. An emerging research program, however, reveals that these are not separate questions. Any evolved motivational system needs a wealth of conceptual structure that tethers the motivations to real world entities. For instance, what use is a fear of predators without knowing what predators are and how to respond to them effectively? As we illustrate with case studies of cooperation and conflict, there is no motivation without representation: To generate adaptive behavior, motivational systems must be interwoven with the concepts required to support them, and cannot be understood without explicit reference to those concepts.

  11. Elements of probability and statistics an introduction to probability with De Finetti’s approach and to Bayesian statistics

    CERN Document Server

    Biagini, Francesca

    2016-01-01

    This book provides an introduction to elementary probability and to Bayesian statistics using de Finetti's subjectivist approach. One of the features of this approach is that it does not require the introduction of sample space – a non-intrinsic concept that makes the treatment of elementary probability unnecessarily complicated – but instead introduces as fundamental the concept of random numbers, directly related to their interpretation in applications. Events become a particular case of random numbers, and probability a particular case of expectation when applied to events. The subjective evaluation of expectation and of conditional expectation is based on an economic choice of an acceptable bet or penalty. The properties of expectation and conditional expectation are derived by applying a coherence criterion that the evaluation has to follow. The book is suitable for all introductory courses in probability and statistics for students in Mathematics, Informatics, Engineering, and Physics.

  12. Understanding common statistical methods, Part I: descriptive methods, probability, and continuous data.

    Science.gov (United States)

    Skinner, Carl G; Patel, Manish M; Thomas, Jerry D; Miller, Michael A

    2011-01-01

    Statistical methods are pervasive in medical research and general medical literature. Understanding general statistical concepts will enhance our ability to critically appraise the current literature and ultimately improve the delivery of patient care. This article intends to provide an overview of the common statistical methods relevant to medicine.

  13. Foundation of statistical energy analysis in vibroacoustics

    CERN Document Server

    Le Bot, A

    2015-01-01

    This title deals with the statistical theory of sound and vibration. The foundation of statistical energy analysis is presented in great detail. In the modal approach, an introduction to random vibration with application to complex systems having a large number of modes is provided. For the wave approach, the phenomena of propagation, group speed, and energy transport are extensively discussed. Particular emphasis is given to the emergence of diffuse field, the central concept of the theory.

  14. Basic statistics with Microsoft Excel: a review.

    Science.gov (United States)

    Divisi, Duilio; Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto

    2017-06-01

    The scientific world is enriched daily with new knowledge, due to new technologies and continuous discoveries. Mathematical functions underlie the statistical concepts, particularly those of mean, median and mode, along with those of frequency and frequency distribution associated with histograms and graphical representations, determining the elaborative processes on which spreadsheet operations are based. The aim of the study is to highlight the mathematical basis of the statistical models that govern the operation of spreadsheets in Microsoft Excel.
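    The spreadsheet functions named above (mean, median, mode, and a frequency distribution) can be sketched outside Excel with Python's standard library; the data values below are illustrative assumptions.

```python
# Mean, median, mode, and a frequency distribution for a small sample.
from collections import Counter
import statistics

data = [2, 3, 3, 5, 7, 7, 7, 9]

print(statistics.mean(data))    # arithmetic mean: sum / count
print(statistics.median(data))  # midpoint of the sorted values
print(statistics.mode(data))    # most frequent value

# Frequency distribution, as a histogram or pivot table would tabulate it.
print(Counter(data))
```

    The same formulas (SUM/COUNT for the mean, the middle-rank rule for the median, the maximum-frequency value for the mode) are what the spreadsheet evaluates behind AVERAGE, MEDIAN and MODE.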

  15. Statistical Decision Theory Estimation, Testing, and Selection

    CERN Document Server

    Liese, Friedrich

    2008-01-01

    Suitable for advanced graduate students and researchers in mathematical statistics and decision theory, this title presents an account of the concepts and a treatment of the major results of classical finite sample size decision theory and modern asymptotic decision theory

  16. Nonextensive statistical mechanics and high energy physics

    Directory of Open Access Journals (Sweden)

    Tsallis Constantino

    2014-04-01

    Full Text Available The use of the celebrated Boltzmann-Gibbs entropy and statistical mechanics is justified for ergodic-like systems. In contrast, complex systems typically require more powerful theories. We provide a brief introduction to nonadditive entropies (characterized by indices like q which, in the q → 1 limit, recover the standard Boltzmann-Gibbs entropy) and the associated nonextensive statistical mechanics. We then present some recent applications to systems such as high-energy collisions, black holes and others. In addition, we clarify and illustrate the neat distinction that exists between Lévy distributions and q-exponential ones, a point which occasionally causes some confusion in the literature, particularly in the LHC literature.
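    The q → 1 limit mentioned above can be sketched with the q-exponential function central to nonextensive statistical mechanics; the sample points below are illustrative assumptions.

```python
# Tsallis q-exponential and its q -> 1 limit (ordinary exponential,
# i.e. the Boltzmann-Gibbs case).
import math

def q_exp(x, q):
    """q-exponential: [1 + (1-q)x]^(1/(1-q)) where positive, else 0; exp(x) at q = 1."""
    if q == 1.0:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

# As q -> 1, q_exp approaches exp; for q > 1 the tail decays as a power
# law, which is one concrete way it differs from a simple exponential.
for q in (1.5, 1.1, 1.001, 1.0):
    print(q, q_exp(-2.0, q))
```

    Note the distinction flagged in the abstract: a q-exponential is defined directly in real space as above, whereas a Lévy stable distribution is defined through its Fourier transform, so the two families should not be conflated even though both can have heavy tails.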

  17. A Pilot Study Teaching Metrology in an Introductory Statistics Course

    Science.gov (United States)

    Casleton, Emily; Beyler, Amy; Genschel, Ulrike; Wilson, Alyson

    2014-01-01

    Undergraduate students who have just completed an introductory statistics course often lack deep understanding of variability and enthusiasm for the field of statistics. This paper argues that by introducing the commonly underemphasized concept of measurement error, students will have a better chance of attaining both. We further present lecture…

  18. Use of demonstrations and experiments in teaching business statistics

    OpenAIRE

    Johnson, D. G.; John, J. A.

    2003-01-01

    The aim of a business statistics course should be to help students think statistically and to interpret and understand data, rather than to focus on mathematical detail and computation. To achieve this students must be thoroughly involved in the learning process, and encouraged to discover for themselves the meaning, importance and relevance of statistical concepts. In this paper we advocate the use of experiments and demonstrations as aids to achieving these goals. A number of demonstrations...

  19. Big Data as a Source for Official Statistics

    Directory of Open Access Journals (Sweden)

    Daas Piet J.H.

    2015-06-01

    Full Text Available More and more data are being produced by an increasing number of electronic devices physically surrounding us and on the internet. The large amount of data and the high frequency at which they are produced have resulted in the introduction of the term ‘Big Data’. Because these data reflect many different aspects of our daily lives and because of their abundance and availability, Big Data sources are very interesting from an official statistics point of view. This article discusses the exploration of both opportunities and challenges for official statistics associated with the application of Big Data. Experiences gained with analyses of large amounts of Dutch traffic loop detection records and Dutch social media messages are described to illustrate the topics characteristic of the statistical analysis and use of Big Data.

  20. USING RASCH ANALYSIS TO EXPLORE WHAT STUDENTS LEARN ABOUT PROBABILITY CONCEPTS

    Directory of Open Access Journals (Sweden)

    Zamalia Mahmud

    2015-01-01

    Full Text Available Students’ understanding of probability concepts has been investigated from various perspectives. This study set out to investigate the perceived understanding of probability concepts of forty-four students from the STAT131 Understanding Uncertainty and Variation course at the University of Wollongong, NSW. Rasch measurement, which is based on a probabilistic model, was used to identify concepts that students find easy, moderately difficult and difficult to understand. Data were captured from the e-learning Moodle platform, where students provided their responses through an on-line quiz. As illustrated in the Rasch map, 96% of the students could understand sample space, simple events, mutually exclusive events and tree diagrams, while 67% of the students found the concepts of conditional and independent events rather easy to understand.
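    The concepts the Rasch map separates (sample space and mutually exclusive events on the easy end, conditional events on the harder end) can be sketched for a single fair die; the die example itself is an assumption chosen here for illustration.

```python
# Sample space, mutually exclusive events, and conditional probability
# for one roll of a fair six-sided die, using exact fractions.
from fractions import Fraction

sample_space = frozenset(range(1, 7))

def P(event):
    """Probability of an event under the uniform (fair-die) model."""
    return Fraction(len(set(event) & sample_space), len(sample_space))

even = {2, 4, 6}
odd = {1, 3, 5}
at_most_2 = {1, 2}

# Mutually exclusive events: no overlap, so P(A or B) = P(A) + P(B).
assert even & odd == set()
print(P(even | odd) == P(even) + P(odd))  # True

# Conditional probability: P(A | B) = P(A and B) / P(B).
p_even_given_small = P(even & at_most_2) / P(at_most_2)
print(p_even_given_small)  # 1/2
```

    The jump in difficulty the study reports matches the structure here: the exclusive-events rule is a one-step addition, while conditioning requires renormalizing by P(B).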

  2. A possible generalization of the concept of symmetry in analytical mechanics

    International Nuclear Information System (INIS)

    Grigore, D.R.

    1987-09-01

    A theorem of Lee Hwa Chung suggests a possible generalization of the symmetry concept in classical mechanics. It is shown that the Kostant-Souriau-Kirillov theory can be adapted to this more general case. The theory is illustrated with a number of examples. (author)

  3. THE ANALYSIS OF ILLUSTRATIONS IN THE FOURTH CLASS GEOGRAPHY TEXTBOOKS

    Directory of Open Access Journals (Sweden)

    IOANA CHIRCEV

    2014-01-01

    Full Text Available This study focuses on the analysis of the illustrations found in five different Geography textbooks in Romania. The analysis is based on several criteria: number, size, clarity, pedagogical usefulness. The following conclusions have been drawn: the illustrations are numerous; most of the illustrations are too small and unclear to be efficiently used in the teaching activity; the purpose of some materials is purely illustrative; some illustrations are overloaded with details, which prevents children from understanding them. Authors and publishing houses are advised to choose the illustrations in the fourth class Geography textbooks more carefully.

  4. Mechatronic Systems Design Methods, Models, Concepts

    CERN Document Server

    Janschek, Klaus

    2012-01-01

    In this textbook, fundamental methods for model-based design of mechatronic systems are presented in a systematic, comprehensive form. The method framework presented here comprises domain-neutral methods for modeling and performance analysis: multi-domain modeling (energy/port/signal-based), simulation (ODE/DAE/hybrid systems), robust control methods, stochastic dynamic analysis, and quantitative evaluation of designs using system budgets. The model framework is composed of analytical dynamic models for important physical and technical domains of realization of mechatronic functions, such as multibody dynamics, digital information processing and electromechanical transducers. Building on the modeling concept of a technology-independent generic mechatronic transducer, concrete formulations for electrostatic, piezoelectric, electromagnetic, and electrodynamic transducers are presented. More than 50 fully worked out design examples clearly illustrate these methods and concepts and enable independent study of th...

  5. Propensity Score Analysis: An Alternative Statistical Approach for HRD Researchers

    Science.gov (United States)

    Keiffer, Greggory L.; Lane, Forrest C.

    2016-01-01

    Purpose: This paper aims to introduce matching in propensity score analysis (PSA) as an alternative statistical approach for researchers looking to make causal inferences using intact groups. Design/methodology/approach: An illustrative example demonstrated the varying results of analysis of variance, analysis of covariance and PSA on a heuristic…
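    The matching step of PSA can be sketched briefly; this is a generic illustration, not the authors' procedure, and it assumes the propensity scores have already been estimated (e.g. by logistic regression). All data and names below are made up.

```python
# Illustrative nearest-neighbor matching on a propensity score.
# Assumption: the scores were already estimated elsewhere (e.g. by
# logistic regression); data are synthetic.

def match_and_estimate_att(treated, controls):
    """treated / controls: lists of (propensity_score, outcome) pairs.
    One-to-one nearest-neighbor matching with replacement; returns the
    average treatment effect on the treated (ATT)."""
    effects = []
    for score_t, outcome_t in treated:
        # control closest in propensity-score distance
        _, outcome_c = min(controls, key=lambda c: abs(c[0] - score_t))
        effects.append(outcome_t - outcome_c)
    return sum(effects) / len(effects)

controls = [(0.2, 1.0), (0.4, 1.5), (0.6, 2.0), (0.8, 2.5)]
treated = [(0.21, 3.0), (0.61, 4.0)]
print(match_and_estimate_att(treated, controls))  # → 2.0
```

Matching on the score rather than on raw covariates is what lets intact (non-randomized) groups support a causal contrast.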

  6. Statistical-mechanical formulation of Lyapunov exponents

    International Nuclear Information System (INIS)

    Tanase-Nicola, Sorin; Kurchan, Jorge

    2003-01-01

    We show how the Lyapunov exponents of a dynamic system can, in general, be expressed in terms of the free energy of a (non-Hermitian) quantum many-body problem. This puts their study as a problem of statistical mechanics, whose intuitive concepts and techniques of approximation can hence be borrowed
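    As a reminder of the quantity being reformulated, a Lyapunov exponent can be estimated directly as the time average of ln|f'(x)| along an orbit. The logistic map below is a standard textbook example, not taken from this record:

```python
import math

# Numerical Lyapunov exponent of the logistic map x -> r*x*(1-x),
# computed as the long-run average of ln|f'(x)| = ln|r*(1-2x)|.
# At r = 4 the exact value is ln 2.

def lyapunov_logistic(r, x0=0.123456, transient=1000, n=100_000):
    x = x0
    for _ in range(transient):        # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return total / n

print(lyapunov_logistic(4.0))  # ≈ 0.693 (= ln 2)
```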

  7. In the Artist's Studio with L'Illustration

    Directory of Open Access Journals (Sweden)

    Esner, Rachel

    2013-03-01

    Full Text Available This article explores the two series of visits to the artist's studio that appeared in the famed French illustrated magazine L'Illustration in the 1850s and in 1886. An in-depth examination of both the texts and images reveals the verbal and visual tropes used to characterize the artists and their spaces, linking these to broader notions of "the artist" – his moral characteristics, behaviors, and artistic practice – as well as to the politics of the art world and the (bourgeois) ideology of L'Illustration. The aim is to uncover not only the language but also the mechanics of the "mediatization" of the image of the artist in this crucial period.

  8. Probability and statistics: A reminder

    International Nuclear Information System (INIS)

    Clement, B.

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)

  9. Statistical thermodynamics and mean-field theory for the alloy under irradiation model

    International Nuclear Information System (INIS)

    Kamyshendo, V.

    1993-01-01

    A generalization of statistical thermodynamics to the case of open systems is discussed, using the alloy-under-irradiation model as an example. The statistical properties of stationary states are described with the use of generalized thermodynamic potentials and 'quasi-interactions' determined from the master equation for micro-configuration probabilities. Methods for solving this equation are illustrated by mean-field-type calculations of correlators, thermodynamic potentials and phase diagrams for disordered alloys

  10. Different conceptions of mental illness: consequences for the association with patients.

    Science.gov (United States)

    Helmchen, Hanfried

    2013-01-01

    Whenever partial knowledge is considered absolute and turned into ideological and dogmatic conceptions, the risk increases that the conditions for the people involved might become dangerous. This will be illustrated by casuistic examples of consequences of one-sided psychiatric conceptions such as social, biological, and psychological ideas about the treatment and care of the mentally ill. Present perspectives of an integrative model, i.e., an advanced bio-psycho-social conception about evidence-based characteristics on the social, psychological, and molecular-genetic level, require that all of these dimensions should be considered in order to personalize and thereby improve the care and treatment of the mentally ill.

  11. The Generalized Quantum Statistics

    OpenAIRE

    Hwang, WonYoung; Ji, Jeong-Young; Hong, Jongbae

    1999-01-01

    The concept of wavefunction reduction should be introduced to standard quantum mechanics in any physical processes where effective reduction of wavefunction occurs, as well as in the measurement processes. When the overlap is negligible, each particle obeys Maxwell-Boltzmann statistics even if the particles are in principle described by a totally symmetrized wavefunction [P.R. Holland, The Quantum Theory of Motion, Cambridge University Press, 1993, p. 293]. We generalize the conjecture. That is, par...

  12. Evaluating the Use of Random Distribution Theory to Introduce Statistical Inference Concepts to Business Students

    Science.gov (United States)

    Larwin, Karen H.; Larwin, David A.

    2011-01-01

    Bootstrapping methods and random distribution methods are increasingly recommended as better approaches for teaching students about statistical inference in introductory-level statistics courses. The authors examined the effect of teaching undergraduate business statistics students using random distribution and bootstrapping simulations. It is the…
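    A minimal example of the kind of bootstrapping simulation such courses use (the data are illustrative):

```python
import random
import statistics

# Bootstrap the sampling distribution of the mean: resample the data
# with replacement many times, then read off a 95% percentile interval.
random.seed(42)

data = [23, 19, 31, 25, 28, 22, 30, 26, 24, 27]
boot_means = sorted(
    statistics.mean(random.choices(data, k=len(data)))
    for _ in range(2000)
)
lo, hi = boot_means[int(0.025 * 2000)], boot_means[int(0.975 * 2000)]
print(f"mean={statistics.mean(data):.1f}, 95% CI ≈ ({lo:.1f}, {hi:.1f})")
```

The resampled means make the abstract idea of a sampling distribution concrete without any distributional theory, which is the pedagogical point at issue.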

  13. Self-concept in overweight adolescents

    Directory of Open Access Journals (Sweden)

    Sandra Vuk Pisk

    2012-02-01

    Full Text Available Background and objective. Adolescence is considered a critical stage of life, and one during which body image and self-concept are of particular importance for peer acceptance and approval. Body weight may impact on satisfaction or dissatisfaction in adolescent girls’ self-concept. The aim of this research was to determine the association between obesity and self-concept among adolescent girls. Methods. The study sample consisted of 40 overweight (BMI 25 - 30) 18-year-old girls in their last year of high school. A further 40 girls of the same age with a BMI of 18 - 25 formed a control group. The Offer Self-Image Questionnaire for Adolescents (OSIQ) was used to evaluate their self-concept. Descriptive statistical methods used in analysing the data included calculation of the median and standard deviation of variables, and t-tests were used to compare group differences, with p
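    The group comparison described (t-tests between the overweight and control groups) follows the usual two-sample pattern; below is a generic Welch sketch with made-up numbers, not the study's actual data. The normal approximation to the p-value is this sketch's simplification, adequate only for larger samples:

```python
import math
import statistics

# Welch's two-sample t statistic for comparing two group means.
# Simplification (assumption of this sketch): the two-sided p-value is
# approximated with the normal distribution via erfc, not Student's t.

def welch_t(a, b):
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    se = math.sqrt(va / len(a) + vb / len(b))
    t = (ma - mb) / se
    p = math.erfc(abs(t) / math.sqrt(2))  # two-sided, normal approx.
    return t, p

group_a = [21.1, 22.3, 20.8, 23.0, 21.9, 22.5]  # illustrative scores
group_b = [25.4, 26.1, 24.8, 25.9, 26.5, 25.2]
t, p = welch_t(group_a, group_b)
print(f"t={t:.2f}, p≈{p:.2g}")
```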

  14. Naïve Conceptions About Multimedia Learning: A Study on Primary School Textbooks

    OpenAIRE

    Barbara Colombo; Alessandro Antonietti

    2013-01-01

    An interview study, based on specific pictures taken from textbooks used in primary schools, was carried out to investigate illustrators’, teachers’, students’, and common people’s beliefs about the role that illustrations play in facilitating learning. Participants’ responses were internally coherent, indicating a systematic nature of the underlying naïve conceptions. Findings disprove Mayer’s pessimistic claim that laypersons’ conceptions of multimedia learning fail to match experiment...

  15. Beyond Statistics: The Economic Content of Risk Scores

    Science.gov (United States)

    Einav, Liran; Finkelstein, Amy; Kluender, Raymond

    2016-01-01

    “Big data” and statistical techniques to score potential transactions have transformed insurance and credit markets. In this paper, we observe that these widely-used statistical scores summarize a much richer heterogeneity, and may be endogenous to the context in which they get applied. We demonstrate this point empirically using data from Medicare Part D, showing that risk scores confound underlying health and endogenous spending response to insurance. We then illustrate theoretically that when individuals have heterogeneous behavioral responses to contracts, strategic incentives for cream skimming can still exist, even in the presence of “perfect” risk scoring under a given contract. PMID:27429712

  16. Statistical uncertainties and unrecognized relationships

    International Nuclear Information System (INIS)

    Rankin, J.P.

    1985-01-01

    Hidden relationships in specific designs directly contribute to inaccuracies in reliability assessments. Uncertainty factors at the system level may sometimes be applied in attempts to compensate for the impact of such unrecognized relationships. Often uncertainty bands are used to relegate unknowns to a miscellaneous category of low-probability occurrences. However, experience and modern analytical methods indicate that perhaps the dominant, most probable and significant events are sometimes overlooked in statistical reliability assurances. The author discusses the utility of two unique methods of identifying the otherwise often unforeseeable system interdependencies for statistical evaluations. These methods are sneak circuit analysis and a checklist form of common cause failure analysis. Unless these techniques (or a suitable equivalent) are also employed along with the more widely-known assurance tools, high reliability of complex systems may not be adequately assured. This concern is indicated by specific illustrations. 8 references, 5 figures

  17. Statistics of Extremes

    KAUST Repository

    Davison, Anthony C.

    2015-04-10

    Statistics of extremes concerns inference for rare events. Often the events have never yet been observed, and their probabilities must therefore be estimated by extrapolation of tail models fitted to available data. Because data concerning the event of interest may be very limited, efficient methods of inference play an important role. This article reviews this domain, emphasizing current research topics. We first sketch the classical theory of extremes for maxima and threshold exceedances of stationary series. We then review multivariate theory, distinguishing asymptotic independence and dependence models, followed by a description of models for spatial and spatiotemporal extreme events. Finally, we discuss inference and describe two applications. Animations illustrate some of the main ideas. © 2015 by Annual Reviews. All rights reserved.
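    A minimal peaks-over-threshold illustration, under the simplifying assumption of an exponential (Gumbel-domain) tail, shows how a tail model is fitted from only the rare exceedances:

```python
import random

# Peaks-over-threshold sketch. Assumption: exponential-tailed data, so
# exceedances over a high threshold u are again exponential (the GPD
# with shape 0) and the tail rate is estimated from those points alone.
random.seed(7)

rate = 0.5                                     # true tail rate
sample = [random.expovariate(rate) for _ in range(20_000)]
u = 6.0                                        # high threshold
exceedances = [x - u for x in sample if x > u]
rate_hat = len(exceedances) / sum(exceedances)  # exponential MLE
print(len(exceedances), round(rate_hat, 2))     # rate_hat ≈ 0.5
```

Only a small fraction of the 20,000 observations exceed the threshold, yet they suffice to recover the tail parameter, which is the essence of inference for rare events.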

  18. On the Transfer of a Number of Concepts of Statistical Radiophysics to the Theory of One-dimensional Point Mappings

    Directory of Open Access Journals (Sweden)

    Agalar M. Agalarov

    2018-01-01

    Full Text Available In the article, the possibility of using a bispectrum under the investigation of regular and chaotic behaviour of one-dimensional point mappings is discussed. The effectiveness of the transfer of this concept to nonlinear dynamics was demonstrated by an example of the Feigenbaum mapping. Also in the work, the application of the Kullback-Leibler entropy in the theory of point mappings is considered. It has been shown that this information-like value is able to describe the behaviour of statistical ensembles of one-dimensional mappings. In the framework of this theory some general properties of its behaviour were found out. Constructivity of the Kullback-Leibler entropy in the theory of point mappings was shown by means of its direct calculation for the "saw tooth" mapping with a linear initial probability density. Moreover, for this mapping the denumerable set of initial probability densities that reach its stationary probability density after a finite number of steps was pointed out.
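    The calculation described for the saw tooth (doubling) map can be mimicked numerically: starting from the linear density p(x) = 2x, the Kullback-Leibler divergence of an ensemble from the uniform stationary density shrinks after a few map steps. Only a few steps are taken, to avoid the floating-point artifacts of repeated doubling; the details are this sketch's choices, not the paper's.

```python
import math
import random

# KL divergence of a doubling-map ensemble from the uniform density.
random.seed(1)

def kl_from_uniform(points, bins=20):
    """Histogram estimate of KL(p || uniform) on [0, 1)."""
    counts = [0] * bins
    for x in points:
        counts[min(int(x * bins), bins - 1)] += 1
    kl = 0.0
    for c in counts:
        if c:
            p = c / len(points)
            kl += p * math.log(p * bins)   # uniform bin prob. = 1/bins
    return kl

# Inverse-transform sampling from p(x) = 2x: x = sqrt(u)
ensemble = [math.sqrt(random.random()) for _ in range(100_000)]
kl_before = kl_from_uniform(ensemble)
for _ in range(5):                         # a few saw-tooth steps
    ensemble = [(2 * x) % 1.0 for x in ensemble]
kl_after = kl_from_uniform(ensemble)
print(round(kl_before, 3), round(kl_after, 4))  # KL decays toward 0
```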

  19. ARSENIC CONTAMINATION IN GROUNDWATER: A STATISTICAL MODELING

    OpenAIRE

    Palas Roy; Naba Kumar Mondal; Biswajit Das; Kousik Das

    2013-01-01

    High arsenic in natural groundwater in most of the tubewells of the Purbasthali-Block II area of Burdwan district (W.B., India) has recently come into focus as a serious environmental concern. This paper intends to illustrate the statistical modeling of the arsenic-contaminated groundwater to identify the interrelation of the arsenic content with other participating groundwater parameters, so that the arsenic contamination level can easily be predicted by analyzing only such parameters. Mul...

  20. Trajectory Design for a Single-String Impactor Concept

    Science.gov (United States)

    Dono Perez, Andres; Burton, Roland; Stupl, Jan; Mauro, David

    2017-01-01

    This paper introduces a trajectory design for a secondary spacecraft concept to augment science return in interplanetary missions. The concept consists of a single-string probe with a kinetic impactor on board that generates an artificial plume to perform in-situ sampling. The trajectory design was applied to a particular case study that samples ejecta particles from the Jovian moon Europa. Results were validated using statistical analysis. Details regarding the navigation, targeting and disposal challenges related to this concept are presented herein.

  1. Microcanonical ensemble extensive thermodynamics of Tsallis statistics

    International Nuclear Information System (INIS)

    Parvan, A.S.

    2005-01-01

    The microscopic foundation of the generalized equilibrium statistical mechanics based on the Tsallis entropy is given by using the Gibbs idea of statistical ensembles of the classical and quantum mechanics. The equilibrium distribution functions are derived by the thermodynamic method based upon the use of the fundamental equation of thermodynamics and the statistical definition of the functions of the state of the system. It is shown that if the entropic index ξ = 1/(q - 1) in the microcanonical ensemble is an extensive variable of the state of the system, then in the thermodynamic limit z bar = 1/(q - 1)N = const the principle of additivity and the zero law of thermodynamics are satisfied. In particular, the Tsallis entropy of the system is extensive and the temperature is intensive. Thus, the Tsallis statistics completely satisfies all the postulates of the equilibrium thermodynamics. Moreover, evaluation of the thermodynamic identities in the microcanonical ensemble is provided by the Euler theorem. The principle of additivity and the Euler theorem are explicitly proved by using the illustration of the classical microcanonical ideal gas in the thermodynamic limit
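    For orientation, the Tsallis entropy underlying this record can be written in standard notation (with Boltzmann constant k; not quoted from the paper) as

```latex
S_q \;=\; k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q \;=\; -\,k \sum_i p_i \ln p_i ,
```

so the Boltzmann-Gibbs entropy is recovered as the entropic index q tends to 1, and the record's thermodynamic limit is taken with \(\bar z = 1/[(q-1)N]\) held constant as \(N \to \infty\).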

  2. Microcanonical ensemble extensive thermodynamics of Tsallis statistics

    International Nuclear Information System (INIS)

    Parvan, A.S.

    2006-01-01

    The microscopic foundation of the generalized equilibrium statistical mechanics based on the Tsallis entropy is given by using the Gibbs idea of statistical ensembles of the classical and quantum mechanics. The equilibrium distribution functions are derived by the thermodynamic method based upon the use of the fundamental equation of thermodynamics and the statistical definition of the functions of the state of the system. It is shown that if the entropic index ξ=1/(q-1) in the microcanonical ensemble is an extensive variable of the state of the system, then in the thermodynamic limit z-bar =1/(q-1)N=const the principle of additivity and the zero law of thermodynamics are satisfied. In particular, the Tsallis entropy of the system is extensive and the temperature is intensive. Thus, the Tsallis statistics completely satisfies all the postulates of the equilibrium thermodynamics. Moreover, evaluation of the thermodynamic identities in the microcanonical ensemble is provided by the Euler theorem. The principle of additivity and the Euler theorem are explicitly proved by using the illustration of the classical microcanonical ideal gas in the thermodynamic limit

  3. Significance levels for studies with correlated test statistics.

    Science.gov (United States)

    Shi, Jianxin; Levinson, Douglas F; Whittemore, Alice S

    2008-07-01

    When testing large numbers of null hypotheses, one needs to assess the evidence against the global null hypothesis that none of the hypotheses is false. Such evidence typically is based on the test statistic of the largest magnitude, whose statistical significance is evaluated by permuting the sample units to simulate its null distribution. Efron (2007) has noted that correlation among the test statistics can induce substantial interstudy variation in the shapes of their histograms, which may cause misleading tail counts. Here, we show that permutation-based estimates of the overall significance level also can be misleading when the test statistics are correlated. We propose that such estimates be conditioned on a simple measure of the spread of the observed histogram, and we provide a method for obtaining conditional significance levels. We justify this conditioning using the conditionality principle described by Cox and Hinkley (1974). Application of the method to gene expression data illustrates the circumstances when conditional significance levels are needed.
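    The permutation scheme described, using the test statistic of largest magnitude, can be sketched as follows (toy data; a plain mean difference stands in for a full t statistic):

```python
import random
import statistics

# Permutation estimate of overall significance based on the largest
# statistic across features. Data and sizes are illustrative.
random.seed(3)

n_cases, n_ctrls, n_feat = 5, 5, 4
data = [[random.gauss(0, 1) for _ in range(n_feat)]
        for _ in range(n_cases + n_ctrls)]
labels = [1] * n_cases + [0] * n_ctrls

def max_stat(data, labels):
    """Largest absolute case-control mean difference over all features."""
    out = []
    for j in range(len(data[0])):
        cases = [row[j] for row, y in zip(data, labels) if y == 1]
        ctrls = [row[j] for row, y in zip(data, labels) if y == 0]
        out.append(abs(statistics.mean(cases) - statistics.mean(ctrls)))
    return max(out)

observed = max_stat(data, labels)
perms, hits = 500, 0
for _ in range(perms):
    shuffled = labels[:]
    random.shuffle(shuffled)               # permute the sample units
    if max_stat(data, shuffled) >= observed:
        hits += 1
p_val = hits / perms
print(f"max statistic {observed:.2f}, permutation p ≈ {p_val:.3f}")
```

The paper's point is that when the per-feature statistics are correlated, this unconditional permutation p-value can mislead, and conditioning on the spread of the observed histogram is proposed as the remedy.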

  4. The evolution of the causation concept and its relation with statistical methods in Epidemiology

    Directory of Open Access Journals (Sweden)

    Luis Fernando Lisboa

    2008-09-01

    Full Text Available A historical review places the first registers of Epidemiology in ancient Greece, with Hippocrates, who identified environmental causes of diseases. Along the centuries, the evolution of the causation concept started to be related to changes in scientific paradigms. In London, during the 17th century, the quantitative method was introduced in Epidemiology, but it was only by the end of the 19th century that the concept of the environment and a mathematical approach to understanding Public Health issues were well established. This was a very rich period for setting new concepts and systematizations in epidemiologic methodology. The beginning of the 20th century consolidated Epidemiology as a scientific discipline and the development of computers in the post-war years brought much advance in this field. Nowadays, Epidemiology plays an important role as it integrates scientific knowledge on the health/disease process to the professional area, participating in population healthcare efforts.

  5. Uniform Statistical Convergence on Time Scales

    Directory of Open Access Journals (Sweden)

    Yavuz Altin

    2014-01-01

    Full Text Available We will introduce the concept of m- and (λ,m)-uniform density of a set and m- and (λ,m)-uniform statistical convergence on an arbitrary time scale. Moreover, we will define the m-uniform Cauchy function on a time scale. Furthermore, some relations about these new notions are also obtained.
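    For comparison, the classical (non-time-scale) notions being generalized are the natural density of a set A of positive integers and statistical convergence; in standard notation (not quoted from the record):

```latex
\delta(A) = \lim_{n \to \infty}
  \frac{\bigl|\{\, k \le n : k \in A \,\}\bigr|}{n},
\qquad
x_k \to L \ \text{(statistically)} \iff
\delta\bigl(\{\, k : |x_k - L| \ge \varepsilon \,\}\bigr) = 0
\quad \text{for every } \varepsilon > 0.
```

Statistical convergence thus tolerates a density-zero set of exceptional indices, which ordinary convergence does not.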

  6. Statistical inference for remote sensing-based estimates of net deforestation

    Science.gov (United States)

    Ronald E. McRoberts; Brian F. Walters

    2012-01-01

    Statistical inference requires expression of an estimate in probabilistic terms, usually in the form of a confidence interval. An approach to constructing confidence intervals for remote sensing-based estimates of net deforestation is illustrated. The approach is based on post-classification methods using two independent forest/non-forest classifications because...

  7. Workshop statistics discovery with data and Minitab

    CERN Document Server

    Rossman, Allan J

    1998-01-01

    Shorn of all subtlety and led naked out of the protective fold of educational research literature, there comes a sheepish little fact: lectures don't work nearly as well as many of us would like to think. -George Cobb (1992) This book contains activities that guide students to discover statistical concepts, explore statistical principles, and apply statistical techniques. Students work toward these goals through the analysis of genuine data and through interaction with one another, with their instructor, and with technology. Providing a one-semester introduction to fundamental ideas of statistics for college and advanced high school students, Workshop Statistics is designed for courses that employ an interactive learning environment by replacing lectures with hands-on activities. The text contains enough expository material to stand alone, but it can also be used to supplement a more traditional textbook. Some distinguishing features of Workshop Statistics are its emphases on active learning, conceptu...

  8. Study of developing a database of energy statistics

    Energy Technology Data Exchange (ETDEWEB)

    Park, T.S. [Korea Energy Economics Institute, Euiwang (Korea, Republic of)

    1997-08-01

    An integrated energy database should be prepared in advance for managing energy statistics comprehensively. However, since considerable manpower and budget are required to develop an integrated energy database, it cannot be established within a short period of time. Therefore, as first-stage work for the energy database, this study aims to draw up methods for analyzing existing statistical data lists and consolidating insufficient data, and at the same time to analyze general concepts and the data structure of the database. I also studied the data content and items of energy databases operated by international energy-related organizations such as the IEA and APEC, and in Japan and the USA, as overseas cases, as well as the state of domestic energy databases and the hardware operating systems of the Japanese databases. I analyzed the compilation system of Korean energy databases, discussed the KEDB system, which is representative of total energy databases, and present design concepts for new energy databases. In addition, by analyzing Korean energy statistical data and comparing it with the OECD/IEA system, I present the directions for establishing future Korean energy databases and their contents, the data that should be collected as supply and demand statistics, and the organization of data collection. 26 refs., 15 figs., 11 tabs.

  9. Bringing Life to Illustration and Illustrating the World in Movement through Visual Literacy

    DEFF Research Database (Denmark)

    Carpe Pérez, Inmaculada Concepción; Pedersen, Hanne

    2016-01-01

    “If a picture is worth a thousand words”, as Arthur Brisbane, a journalist of the New York Times, said in 1911, then how many words would equal the hundreds of frames contained in an animation? In this equation, as in any other, illustration and animation are complex visual expressions, full of shapes c...

  10. Epidemiological Concepts Regarding Disease Monitoring and Surveillance

    Directory of Open Access Journals (Sweden)

    Christensen Jette

    2001-03-01

    Full Text Available Definitions of epidemiological concepts regarding disease monitoring and surveillance can be found in textbooks on veterinary epidemiology. This paper gives a review of how the concepts monitoring, surveillance, and disease control strategies are defined. Monitoring and surveillance systems (MO&SS) involve measurements of disease occurrence, and the design of the monitoring determines which types of disease occurrence measures can be applied. However, knowledge of the performance of diagnostic tests (sensitivity and specificity) is essential to estimate the true occurrence of the disease. The terms disease control programme (DCP) and disease eradication programme (DEP) are defined, and the steps of a DCP/DEP are described to illustrate that they are a process rather than a static MO&SS.
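    Using test sensitivity and specificity to estimate true occurrence, as the record notes, is commonly done with the Rogan-Gladen correction; a sketch with illustrative numbers:

```python
# Rogan-Gladen correction: convert the apparent (test-positive)
# prevalence into an estimate of true prevalence, given the test's
# sensitivity and specificity. Numbers below are illustrative.

def true_prevalence(apparent, sensitivity, specificity):
    return (apparent + specificity - 1) / (sensitivity + specificity - 1)

# 12% test positive with a test of Se = 0.95, Sp = 0.98
print(round(true_prevalence(0.12, 0.95, 0.98), 3))  # → 0.108
```

The correction matters most when prevalence is low, where even a 2% false-positive rate can account for a large share of the apparent cases.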

  11. Gallium Electromagnetic (GEM) Thruster Concept and Design

    Science.gov (United States)

    Polzin, Kurt A.; Markusic, Thomas E.

    2006-01-01

    We describe the design of a new type of two-stage pulsed electromagnetic accelerator, the gallium electromagnetic (GEM) thruster. A schematic illustration of the GEM thruster concept is given in Fig. 1. In this concept, liquid gallium propellant is pumped into the first stage through a porous metal electrode using an electromagnetic pump [1]. At a designated time, a pulsed discharge (approx. 10-50 J) is initiated in the first stage, ablating the liquid gallium from the porous electrode surface and ejecting a dense thermal gallium plasma into the second stage. The presence of the gallium plasma in the second stage serves to trigger the high-energy (approx. 500 J) second-stage pulse, which provides the primary electromagnetic (j x B) acceleration.

  12. Space Elevator Concept Considered a Reality

    Science.gov (United States)

    2000-01-01

    The `once upon a time' science fiction concept of a space elevator has been envisioned and studied as a real mass transportation system in the latter part of the 21st century. David Smitherman of NASA's Marshall Space Flight Center's Advanced Projects Office has compiled plans for such an elevator. The space elevator concept is a structure extending from the surface of the Earth to geostationary Earth orbit (GEO) at 35,786 km in altitude. The tower would be approximately 50 km tall with a cable tethered to the top. Its center mass would be at GEO such that the entire structure orbits the Earth in sync with the Earth's rotation maintaining a stationary position over its base attachment at the equator. Electromagnetic vehicles traveling along the cable could serve as a mass transportation system for transporting people, payloads, and power between space and Earth. This illustration by artist Pat Rawling shows the concept of a space elevator as viewed from the geostationary transfer station looking down the length of the elevator towards the Earth.

  13. Adaptation illustrations: Chapter 4

    Science.gov (United States)

    Maria Janowiak; Patricia Butler; Chris Swanston; Matt St. Pierre; Linda. Parker

    2012-01-01

    In this chapter, we demonstrate how the Adaptation Workbook (Chapter 3) can be used with the Adaptation Strategies and Approaches (Chapter 2) to develop adaptation tactics for two real-world management issues. The two illustrations in this chapter are intended to provide helpful tips to managers completing the Adaptation Workbook, as well as to show how the anticipated...

  14. Relationships between Self-Concept and Resilience Profiles in Young People with Disabilities

    Science.gov (United States)

    Suriá Martínez, Raquel

    2016-01-01

    Introduction: The present study aims to identify different profiles in self-concept and resilience. In addition, statistically significant differences in self-concept domains among the profiles previously identified are analyzed. Method: The AF5 Self-Concept Questionnaire ("Cuestionario de Autoconcepto AF5") and the Resilience Scale were…

  15. Translational research: a concept analysis.

    Science.gov (United States)

    Wendler, M Cecilia; Kirkbride, Geri; Wade, Kristen; Ferrell, Lynne

    2013-01-01

    BACKGROUND/CONCEPTUAL FRAMEWORK: Little is known about which approaches facilitate adoption and sustainment of evidence-based practice change in the highly complex care environments that constitute clinical practice today. The purpose of this article was to complete a concept analysis of translational research using a modified Walker and Avant approach. DESIGN/DATA COLLECTION: Using a rigorous and thorough review of the recent health care literature generated by a deep electronic search from 2004-2011, 85 appropriate documents were retrieved. Close reading of the articles by three coresearchers yielded an analysis of the emerging concept of translational research. Using the iterative process described by Walker and Avant, a tentative definition of the concept of translational research, along with antecedents and consequences were identified. Implications for health care professionals in education, practice, and research are offered. Further research is needed to determine the adequacy of the definition, to identify empirical referents, and to guide theory development. The study resulted in a theoretical definition of the concept of translational research, along with identification of antecedents and consequences and a description of an ideal or model case to illustrate the definition. Implications for practice and education include the importance of focusing on translational research approaches that may reduce the research-practice gap in health care, thereby improving patient care delivery. Research is needed to determine the usefulness of the definition in health care clinical practice.

  16. Statistical mechanics of two-dimensional and geophysical flows

    International Nuclear Information System (INIS)

    Bouchet, Freddy; Venaille, Antoine

    2012-01-01

    The theoretical study of the self-organization of two-dimensional and geophysical turbulent flows is addressed based on statistical mechanics methods. This review is a self-contained presentation of classical and recent works on this subject; from the statistical mechanics basis of the theory up to applications to Jupiter’s troposphere and ocean vortices and jets. Emphasis has been placed on examples with available analytical treatment in order to favor better understanding of the physics and dynamics. After a brief presentation of the 2D Euler and quasi-geostrophic equations, the specificity of two-dimensional and geophysical turbulence is emphasized. The equilibrium microcanonical measure is built from the Liouville theorem. Important statistical mechanics concepts (large deviations and mean field approach) and thermodynamic concepts (ensemble inequivalence and negative heat capacity) are briefly explained and described. On this theoretical basis, we predict the output of the long time evolution of complex turbulent flows as statistical equilibria. This is applied to make quantitative models of two-dimensional turbulence, the Great Red Spot and other Jovian vortices, ocean jets like the Gulf-Stream, and ocean vortices. A detailed comparison between these statistical equilibria and real flow observations is provided. We also present recent results for non-equilibrium situations, for the studies of either the relaxation towards equilibrium or non-equilibrium steady states. In this last case, forces and dissipation are in a statistical balance; fluxes of conserved quantities characterize the system and microcanonical or other equilibrium measures no longer describe the system.

  17. Statistical inference on residual life

    CERN Document Server

    Jeong, Jong-Hyeon

    2014-01-01

    This is a monograph on the concept of residual life, which is an alternative summary measure of time-to-event data, or survival data. The mean residual life has been used for many years under the name of life expectancy, so it is a natural concept for summarizing survival or reliability data. It is also more interpretable than the popular hazard function, especially for communications between patients and physicians regarding the efficacy of a new drug in the medical field. This book reviews existing statistical methods to infer the residual life distribution. The review and comparison includes existing inference methods for mean and median, or quantile, residual life analysis through medical data examples. The concept of the residual life is also extended to competing risks analysis. The targeted audience includes biostatisticians, graduate students, and PhD (bio)statisticians. Knowledge in survival analysis at an introductory graduate level is advisable prior to reading this book.
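    For reference, the mean residual life that the record identifies with life expectancy has the standard definition (standard notation, not quoted from the book):

```latex
m(t) \;=\; \mathbb{E}\left[\, T - t \mid T > t \,\right]
      \;=\; \frac{\int_t^{\infty} S(u)\,\mathrm{d}u}{S(t)},
\qquad S(t) = \Pr(T > t),
```

where S is the survival function, which makes clear why m(t) is often easier to communicate than the hazard: it answers "how much longer, on average, given survival to t" directly in time units.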

  18. Statistics for clinical nursing practice: an introduction.

    Science.gov (United States)

    Rickard, Claire M

    2008-11-01

    Difficulty in understanding statistics is one of the most frequently reported barriers to nurses applying research results in their practice. Yet the amount of nursing research published each year continues to grow, as does the expectation that nurses will undertake practice based on this evidence. Critical care nurses do not need to be statisticians, but they do need to develop a working knowledge of statistics so they can be informed consumers of research and so practice can evolve and improve. For those undertaking a research project, statistical literacy is required to interact with other researchers and statisticians, so as to best design and undertake the project. This article is the first in a series that guides critical care nurses through statistical terms and concepts relevant to their practice.

  19. Methodology in robust and nonparametric statistics

    CERN Document Server

    Jurecková, Jana; Picek, Jan

    2012-01-01

    Introduction and Synopsis: Introduction; Synopsis. Preliminaries: Introduction; Inference in Linear Models; Robustness Concepts; Robust and Minimax Estimation of Location; Clippings from Probability and Asymptotic Theory; Problems. Robust Estimation of Location and Regression: Introduction; M-Estimators; L-Estimators; R-Estimators; Minimum Distance and Pitman Estimators; Differentiable Statistical Functions; Problems. Asymptotic Representations for L-Estimators

  20. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    Science.gov (United States)

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. This report describes the development of validation studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.
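    The basic POI statistic is just a binomial proportion, so its core can be sketched in a few lines. The function below is an illustration of the concept only (the report's exact procedures are not reproduced here); it pairs the observed POI with a Wilson score confidence interval for the underlying identification probability:

```python
import math

def poi_with_wilson_ci(identified, replicates, z=1.96):
    """Probability of identification (POI) as the proportion of replicates
    identified, with a Wilson score interval for the binomial probability."""
    p = identified / replicates
    denom = 1 + z**2 / replicates
    center = (p + z**2 / (2 * replicates)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / replicates
                                   + z**2 / (4 * replicates**2))
    return p, max(0.0, center - half), min(1.0, center + half)

# Hypothetical example: 27 of 30 replicates of the target material identified.
poi, lo, hi = poi_with_wilson_ci(27, 30)
```

A response curve as described in the abstract would simply plot such POI estimates (with their intervals) against analyte concentration or nontarget fraction.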

  1. The Viral Concept: the Winning Ticket of the Romanian Online Advertising Industry

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The connection between the steady development of the Internet in Romania over the last five years, as a channel for transmitting the marketing message, and the viral concept, as a method of transmitting the message, may become the winning ticket for the Romanian online advertising market. Thus, in the current socio-economic context, any company that wishes to be successful in the virtual space cannot ignore viral marketing techniques, for several reasons. Firstly, there is the profile of Internet users, who tend to constitute a new social group. Secondly, there is the thirst for information. And, last but not least, there is the appetite for online chatting, with statistics showing that 62% of Romanian Internet users consider it a very "savory" information channel. This article tries to explain, in brief, what viral marketing is; what its peculiarities, advantages, risks, and limitations are; and what the strategies of a viral marketing campaign are. We illustrate with successful examples from the Romanian online market.

  2. A validation framework for microbial forensic methods based on statistical pattern recognition

    Energy Technology Data Exchange (ETDEWEB)

    Velsko, S P

    2007-11-12

    This report discusses a general approach to validating microbial forensic methods that attempt to simultaneously distinguish among many hypotheses concerning the manufacture of a questioned biological agent sample. It focuses on the concrete example of determining growth medium from chemical or molecular properties of a bacterial agent to illustrate the concepts involved.

  3. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  4. Fuzzy statistical decision-making theory and applications

    CERN Document Server

    Kabak, Özgür

    2016-01-01

    This book offers a comprehensive reference guide to fuzzy statistics and fuzzy decision-making techniques. It provides readers with all the necessary tools for making statistical inference in the case of incomplete information or insufficient data, where classical statistics cannot be applied. The respective chapters, written by prominent researchers, explain a wealth of both basic and advanced concepts including: fuzzy probability distributions, fuzzy frequency distributions, fuzzy Bayesian inference, fuzzy mean, mode and median, fuzzy dispersion, fuzzy p-value, and many others. To foster a better understanding, all the chapters include relevant numerical examples or case studies. Taken together, they form an excellent reference guide for researchers, lecturers and postgraduate students pursuing research on fuzzy statistics. Moreover, by extending all the main aspects of classical statistical decision-making to its fuzzy counterpart, the book presents a dynamic snapshot of the field that is expected to stimu...

  5. Statistical sampling for holdup measurement

    International Nuclear Information System (INIS)

    Picard, R.R.; Pillay, K.K.S.

    1986-01-01

    Nuclear materials holdup is a serious problem in many operating facilities. Estimating amounts of holdup is important for materials accounting and, sometimes, for process safety. Clearly, measuring holdup in all pieces of equipment is not a viable option in terms of time, money, and radiation exposure to personnel. Furthermore, 100% measurement is not only impractical but unnecessary for developing estimated values. Principles of statistical sampling are valuable in the design of cost-effective holdup monitoring plans and in quantifying uncertainties in holdup estimates. The purpose of this paper is to describe those principles and to illustrate their use.
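    As a hedged illustration of the sampling principle involved (the strata, counts, and measurements below are invented, not from the paper), a stratified design measures only a sample of items in each equipment class and extrapolates to a facility total with an attached standard error:

```python
import math

# Hypothetical equipment strata: population size N and measured holdup (grams)
# for a random sample of items in each stratum. Values are illustrative only.
strata = {
    "gloveboxes": {"N": 40,  "measured": [12.1, 9.8, 14.3, 11.0]},
    "ductwork":   {"N": 120, "measured": [3.2, 4.1, 2.8, 3.9, 3.5]},
}

def stratified_estimate(strata):
    """Stratified estimate of total holdup and its standard error,
    treating each stratum as a simple random sample."""
    total, var = 0.0, 0.0
    for s in strata.values():
        n, N = len(s["measured"]), s["N"]
        mean = sum(s["measured"]) / n
        s2 = sum((x - mean) ** 2 for x in s["measured"]) / (n - 1)
        total += N * mean
        var += N**2 * (1 - n / N) * s2 / n  # finite-population correction
    return total, math.sqrt(var)

total, se = stratified_estimate(strata)
```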

  6. USA by Numbers: A Statistical Portrait of the United States.

    Science.gov (United States)

    Weber, Susan, Ed.

    This book presents demographic data about a variety of U.S. public policies, social problems, and environmental issues. The issues and problems that the statistics illustrate (such as overflowing garbage dumps, homelessness, child poverty, and smog and water pollution) are connected with, and the consequences of, the expanding U.S. population. The…

  7. Statistical Methods for Environmental Pollution Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, Richard O. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    1987-01-01

    The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedures, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.
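    The book's own Section 13.2 procedure is not reproduced here, but a widely used approach, Cox's method, gives the flavor of a confidence interval for a lognormal mean: work on the log scale, correct the point estimate by half the log-scale variance, and back-transform. A minimal sketch with invented data:

```python
import math

def lognormal_mean_ci(data, z=1.96):
    """Approximate 95% CI for the mean of a lognormal distribution
    using Cox's method on the log-transformed observations."""
    logs = [math.log(x) for x in data]
    n = len(logs)
    ybar = sum(logs) / n
    s2 = sum((y - ybar) ** 2 for y in logs) / (n - 1)
    point = ybar + s2 / 2  # log of the estimated (arithmetic) mean
    half = z * math.sqrt(s2 / n + s2**2 / (2 * (n - 1)))
    return math.exp(point - half), math.exp(point), math.exp(point + half)

# Illustrative pollutant concentrations (arbitrary units).
lo, est, hi = lognormal_mean_ci([1.2, 3.4, 0.8, 2.5, 5.1, 1.9])
```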

  8. Identification of optimal inspection interval via delay-time concept

    Directory of Open Access Journals (Sweden)

    Glauco Ricardo Simões Gomes

    2016-06-01

    Full Text Available This paper presents an application of mathematical modeling aimed at managing maintenance based on the delay-time concept. The study scenario was the manufacturing sector of an industrial unit, which operates 24 hours a day in a continuous flow of production. The main idea was to use the concepts of this approach to determine the optimal time of preventive action by the maintenance department in order to ensure the greatest availability of equipment and facilities at appropriate maintenance costs. After a brief introduction of the subject, the article presents topics that illustrate the importance of mathematical modeling in maintenance management and the delay-time concept. It also describes the characteristics of the company where the study was conducted, as well as the data related to the production process and maintenance actions. Finally, the results obtained after applying the delay-time concept are presented and discussed, as well as the limitations of the article and the proposals for future research.

  9. Kappa statistic for clustered matched-pair data.

    Science.gov (United States)

    Yang, Zhao; Zhou, Ming

    2014-07-10

    The kappa statistic is widely used to assess the agreement between two procedures in independent matched-pair data. For matched-pair data collected in clusters, on the basis of the delta method and sampling techniques, we propose a nonparametric variance estimator for the kappa statistic without within-cluster correlation structure or distributional assumptions. The results of an extensive Monte Carlo simulation study demonstrate that the proposed kappa statistic provides consistent estimation and the proposed variance estimator behaves reasonably well for at least a moderately large number of clusters (e.g., K ≥ 50). Compared with the variance estimator ignoring dependence within a cluster, the proposed variance estimator performs better in maintaining the nominal coverage probability when the intra-cluster correlation is fair (ρ ≥ 0.3), with more pronounced improvement when ρ is further increased. To illustrate the practical application of the proposed estimator, we analyze two real data examples of clustered matched-pair data. Copyright © 2014 John Wiley & Sons, Ltd.
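    The clustered variance estimator proposed in the paper is involved, but the kappa statistic itself is simple. A minimal sketch of ordinary Cohen's kappa for independent matched-pair data, with an invented 2x2 agreement table:

```python
def cohen_kappa(table):
    """Cohen's kappa from a 2x2 agreement table, where
    table[i][j] = count of pairs rated i by procedure A and j by procedure B."""
    n = sum(sum(row) for row in table)
    po = sum(table[i][i] for i in range(2)) / n           # observed agreement
    row = [sum(table[i]) / n for i in range(2)]
    col = [sum(table[i][j] for i in range(2)) / n for j in range(2)]
    pe = sum(row[i] * col[i] for i in range(2))           # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical data: 40 concordant positives, 45 concordant negatives,
# 15 discordant pairs.
kappa = cohen_kappa([[40, 5], [10, 45]])
```

The paper's contribution concerns the variance (and hence confidence interval) of this statistic when pairs are clustered, not the point estimate itself.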

  10. USING RASCH ANALYSIS TO EXPLORE WHAT STUDENTS LEARN ABOUT PROBABILITY CONCEPTS

    Directory of Open Access Journals (Sweden)

    Zamalia Mahmud

    2015-01-01

    Full Text Available Students’ understanding of probability concepts has been investigated from various different perspectives. This study set out to investigate the perceived understanding of probability concepts of forty-four students from the STAT131 Understanding Uncertainty and Variation course at the University of Wollongong, NSW. Rasch measurement, which is based on a probabilistic model, was used to identify concepts that students find easy, moderate, and difficult to understand. Data were captured from the e-learning Moodle platform, where students provided their responses through an on-line quiz. As illustrated in the Rasch map, 96% of the students could understand sample space, simple events, mutually exclusive events, and tree diagrams, while 67% of the students found the concepts of conditional and independent events rather easy to understand. Keywords: Perceived Understanding, Probability Concepts, Rasch Measurement Model. DOI: dx.doi.org/10.22342/jme.61.1

  11. Some Observations on the Concepts of Information-Theoretic Entropy and Randomness

    Directory of Open Access Journals (Sweden)

    Jonathan D.H. Smith

    2001-02-01

    Full Text Available Abstract: Certain aspects of the history, derivation, and physical application of the information-theoretic entropy concept are discussed. Pre-dating Shannon, the concept is traced back to Pauli. A derivation from first principles is given, without use of approximations. The concept depends on the underlying degree of randomness. In physical applications, this translates to dependence on the experimental apparatus available. An example illustrates how this dependence affects Prigogine's proposal for the use of the Second Law of Thermodynamics as a selection principle for the breaking of time symmetry. The dependence also serves to yield a resolution of the so-called "Gibbs Paradox." Extension of the concept from the discrete to the continuous case is discussed. The usual extension is shown to be dimensionally incorrect. Correction introduces a reference density, leading to the concept of Kullback entropy. Practical relativistic considerations suggest a possible proper reference density.
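    In the discrete case, the entropy concept and its Kullback (relative-entropy) counterpart against a reference distribution can be computed directly; the distributions below are illustrative only, not drawn from the paper:

```python
import math

def shannon_entropy(p):
    """Discrete Shannon entropy of distribution p, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kullback_entropy(p, q):
    """Relative (Kullback) entropy of p with respect to the
    reference distribution q, in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]
uniform = [0.25] * 4
h = shannon_entropy(p)           # 1.75 bits
d = kullback_entropy(p, uniform)  # 0.25 bits
```

Note the identity at work here: with a uniform reference over 4 outcomes, H(p) + D(p‖uniform) = log2(4) = 2 bits, which illustrates how the reference density fixes the "zero point" of the entropy scale.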

  12. TQM: the essential concepts.

    Science.gov (United States)

    Chambers, D W

    1998-01-01

    This is an introduction to the major concepts in total quality management, a loose collection of management approaches that focus on continuous improvement of processes, guided by routine data collection and adjustment of the processes. Customer focus and involvement of all members of an organization are also characteristics commonly found in TQM. The seventy-five-year history of the movement is sketched from its beginning in statistical work on quality assurance through the many improvements and redefinitions added by American and Japanese thinkers. Essential concepts covered include: control cycles, focus on the process rather than the defects, the GEAR model, importance of the customer, upstream quality, just-in-time, kaizen, and service quality.

  13. Concepts and Contexts – Argumentative Forms of Framing

    DEFF Research Database (Denmark)

    Gabrielsen, Jonas; Nørholm Just, Sine; Bengtsson, Mette

    2011-01-01

    The notion of framing has become central in the field of argumentation. The question is, however, what we gain from studying the process of argumentation through framing, since framing is itself a broad concept in need of specification. Different traditions understand the term differently, and it is necessary to determine what argumentative forms the concept of framing actually covers. In this paper we argue that framing refers to at least two different argumentative forms. One is an internal definition of the concepts in question; the other is an external shift in the context of the case. In making this argument we combine theories of framing with the classical rhetorical theory of the stases, more precisely status definitio and status translatio. Our focus is primarily theoretical, but we illustrate our points by means of examples taken from public debates on the value of real estate.

  14. Different conceptions of mental illness: consequences for the association with patients

    Directory of Open Access Journals (Sweden)

    Hanfried eHelmchen

    2013-05-01

    Full Text Available Whenever partial knowledge is considered absolute and turned into ideological and dogmatic conceptions, the risk increases that the conditions for the people involved might become dangerous. This will be illustrated by casuistic examples of consequences of one-sided psychiatric conceptions such as social, biological, and psychological ideas about the treatment and care of the mentally ill. Present perspectives of an integrative model, i.e. the bio-psycho-social conception about specific interactions between the social environment and individual characteristics on both the psychological and molecular-genetic level, require that all of these dimensions should be considered in order to personalize and thereby improve the care and treatment of the mentally ill.

  15. Decision Making in Nursing Practice: A Concept Analysis.

    Science.gov (United States)

    Johansen, Mary L; O'Brien, Janice L

    2016-01-01

    The study aims to gain an understanding of the concept of decision making as it relates to the nurse practice environment. Rodgers' evolutionary method on concept analysis was used as a framework for the study of the concept. Articles from 1952 to 2014 were reviewed from PsycINFO, Medline, Cumulative Index to Nursing and Allied Health Literature (CINAHL), JSTOR, PubMed, and Science Direct. Findings suggest that decision making in the nurse practice environment is a complex process, integral to the nursing profession. The definition of decision making, and the attributes, antecedents, and consequences, are discussed. Contextual factors that influence the process are also discussed. An exemplar is presented to illustrate the concept. Decision making in the nurse practice environment is a dynamic conceptual process that may affect patient outcomes. Nurses need to call upon ways of knowing to make sound decisions and should be self-reflective in order to develop the process further in the professional arena. The need for further research is discussed. © 2015 Wiley Periodicals, Inc.

  16. Intermetallics structures, properties, and statistics

    CERN Document Server

    Steurer, Walter

    2016-01-01

    The focus of this book is clearly on the statistics, topology, and geometry of crystal structures and crystal structure types. This allows one to uncover important structural relationships and to illustrate the relative simplicity of most of the general structural building principles. It also allows one to show that a large variety of actual structures can be related to a rather small number of aristotypes. It is important that this book is readable and beneficial in one way or another for everyone interested in intermetallic phases, from graduate students to experts in solid-state chemistry/physics/materials science. For that purpose it avoids using an enigmatic abstract terminology for the classification of structures. The focus on the statistical analysis of structures and structure types should be seen as an attempt to draw the background of the big picture of intermetallics, and to point to the white spots in it, which could be worth exploring. This book was not planned as a textbook; rather, it...

  17. Statistical power and the Rorschach: 1975-1991.

    Science.gov (United States)

    Acklin, M W; McDowell, C J; Orndoff, S

    1992-10-01

    The Rorschach Inkblot Test has been the source of long-standing controversies as to its nature and its psychometric properties. Consistent with behavioral science research in general, the concept of statistical power has been entirely ignored by Rorschach researchers. The concept of power is introduced and discussed, and a power survey of the Rorschach literature published between 1975 and 1991 in the Journal of Personality Assessment, Journal of Consulting and Clinical Psychology, Journal of Abnormal Psychology, Journal of Clinical Psychology, Journal of Personality, Psychological Bulletin, American Journal of Psychiatry, and Journal of Personality and Social Psychology was undertaken. Power was calculated for 2,300 statistical tests in 158 journal articles. Power to detect small, medium, and large effect sizes was .13, .56, and .85, respectively. Similar to the findings in other power surveys conducted on behavioral science research, we concluded that Rorschach research is underpowered to detect the differences under investigation. This undoubtedly contributes to the inconsistency of research findings which has been a source of controversy and criticism over the decades. It appears that research conducted according to the Comprehensive System for the Rorschach is more powerful. Recommendations are offered for improving power and strengthening the design sensitivity of Rorschach research, including increasing sample sizes, use of parametric statistics, reduction of error variance, more accurate reporting of findings, and editorial policies reflecting concern about the magnitude of relationships beyond an exclusive focus on levels of statistical significance.
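    A rough sense of power figures like those surveyed above can be obtained from the normal approximation to a two-sided two-sample test. The sketch below (the per-group sample size and the use of a z-test rather than an exact t-test are our own simplifying choices) computes power at Cohen's conventional small, medium, and large effect sizes:

```python
import math

def power_two_sample(d, n_per_group):
    """Approximate power of a two-sided two-sample z-test at alpha = 0.05
    for standardized effect size d (Cohen's d), n observations per group."""
    z_crit = 1.959964                      # two-sided 5% critical value
    ncp = d * math.sqrt(n_per_group / 2)   # noncentrality of the test statistic
    phi = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
    return (1 - phi(z_crit - ncp)) + phi(-z_crit - ncp)

# Power to detect small, medium, and large effects with 30 subjects per group.
powers = {d: power_two_sample(d, 30) for d in (0.2, 0.5, 0.8)}
```

Even at this common sample size, power against small effects is very low, which mirrors the pattern the survey reports.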

  18. Development and analysis of spectroscopic learning tools and the light and spectroscopy concept inventory for introductory college astronomy

    Science.gov (United States)

    Bardar, Erin M.

    -question Light and Spectroscopy Concept Inventory (LSCI). In the fall of 2005, a multi-institution field-test of the LSCI was conducted with student examinees from 14 course sections at 11 colleges and universities employing various instructional techniques. Through statistical analysis, the inventory was proven to be a reliable (Cronbach's alpha = 0.77) and valid assessment instrument that was able to illustrate statistically significant learning gains (p < 0.05) for most course sections, with students utilizing our suite of instructional materials exhibiting among the highest performance gains (Effect Size = 1.31).

  19. Recent and future evolution of the conception of French PWR facing safety and reliability

    International Nuclear Information System (INIS)

    Vignon, D.; Morin, R.; Brisbois, J.

    1987-11-01

    The French PWR construction programme (54 units) has led to an original approach to safety. This approach is now complete, and all of the adopted provisions have been taken into account in the design of the new standardized plant series N4. For the future, following this rationalization of the safety approach, research on simplifying the design is planned. This new design approach is based on operating experience feedback and on the results of the probabilistic studies of the 900 and 1300 MWe reactors. This rationalization, the new concepts, and the search for simplifications are illustrated by concrete examples in this presentation [fr

  20. Concept and Functional Structure of a Service Robot

    Directory of Open Access Journals (Sweden)

    Luis A. Pineda

    2015-02-01

    Full Text Available In this paper, we present a concept of service robot and a framework for its functional specification and implementation. The present discussion is grounded in Newell's system levels hierarchy, which suggests organizing robotics research in three different layers, corresponding to Marr's computational, algorithmic and implementation levels, as follows: (1) the service robot proper, which is the subject of the present paper, (2) perception and action algorithms, and (3) the systems programming level. The concept of a service robot is articulated in practice through the introduction of a conceptual model for particular service robots; this consists of the specification of a set of basic robotic behaviours and a number of mechanisms for assembling such behaviours during the execution of complex tasks. The model involves an explicit representation of the task structure, allowing for deliberative reasoning and task management. The model also permits distinguishing between a robot's competence and performance, along the lines of Chomsky's corresponding distinction. We illustrate how this model can be realized in practice with two composition modes that we call static and dynamic; these are illustrated with the Restaurant Test and the General Purpose Service Robot Test of the RoboCup@Home competition, respectively. The present framework and methodology has been implemented in the robot Golem-II+, which is also described. The paper is concluded with an overall reflection upon the present concept of a service robot and its associated functional specifications, and the potential impact of such a conceptual model in the study, development and application of service robots in general.