Mandl, Franz
1988-01-01
The Manchester Physics Series. General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips, Department of Physics and Astronomy, University of Manchester. Properties of Matter, B. H. Flowers and E. Mendoza; Optics, Second Edition, F. G. Smith and J. H. Thomson; Statistical Physics, Second Edition, F. Mandl; Electromagnetism, Second Edition, I. S. Grant and W. R. Phillips; Statistics, R. J. Barlow; Solid State Physics, Second Edition, J. R. Hook and H. E. Hall; Quantum Mechanics, F. Mandl; Particle Physics, Second Edition, B. R. Martin and G. Shaw; The Physics of Stars, Second Edition, A. C. Phillips; Computing for Scient…
Sadovskii, Michael V
2012-01-01
This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics, all important aspects of statistical physics are covered, such as applications to ideal gases, the theory of quantum liquids and superconductivity, and the modern theory of critical phenomena. Beyond that, attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.
Wannier, Gregory Hugh
1966-01-01
Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics. Designed for…
Guénault, Tony
2007-01-01
In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...
Sustainability index for Taipei
Lee, Y.-J.; Huang Chingming
2007-01-01
Sustainability indicators are an effective means of determining whether a city is moving towards sustainable development (SD). After considering the characteristics of Taipei, Taiwan, discussions with experts, scholars and government departments, and an exhaustive literature review, this study selected 51 sustainability indicators corresponding to the socio-economic characteristics of Taipei City. Such indicators should be regarded as a basis for assessing SD in Taipei City. The 51 indicators are classified into economic, social, environmental and institutional dimensions. Furthermore, statistical data are adopted to identify the trend of SD from 1994 to 2004. Moreover, the sustainability index is calculated for the four dimensions and for Taipei as a whole. Analysis results demonstrate that social and environmental indicators are moving towards SD, while the economic and institutional dimensions are performing relatively poorly. However, since 2002, the economic sustainability index has gradually moved towards SD. Overall, the Taipei sustainability index indicates a gradual trend towards sustainable development during the past 11 years.
Theoretical Physics 8: Statistical Physics
Nolting, Wolfgang
2018-01-01
This textbook offers a clear and comprehensive introduction to statistical physics, one of the core components of advanced undergraduate physics courses. It follows on naturally from the previous volumes in this series, using methods of probability theory and statistics to solve physical problems. The first part of the book gives a detailed overview on classical statistical physics and introduces all mathematical tools needed. The second part of the book covers topics related to quantized states, gives a thorough introduction to quantum statistics, followed by a concise treatment of quantum gases. Ideally suited to undergraduate students with some grounding in quantum mechanics, the book is enhanced throughout with learning features such as boxed inserts and chapter summaries, with key mathematical derivations highlighted to aid understanding. The text is supported by numerous worked examples and end of chapter problem sets. About the Theoretical Physics series Translated from the renowned and highly successf...
Statistical Physics: An Introduction
Yoshioka, Daijiro
2007-01-01
This book provides a comprehensive presentation of the basics of statistical physics. The first part explains the essence of statistical physics and how it provides a bridge between microscopic and macroscopic phenomena, allowing one to derive quantities such as entropy. Here the author avoids going into details such as Liouville’s theorem or the ergodic theorem, which are difficult for beginners and unnecessary for the actual application of statistical mechanics. In the second part, statistical mechanics is applied to various systems which, although they look different, share the same mathematical structure. In this way readers can deepen their understanding of statistical physics. The book also features applications to quantum dynamics, thermodynamics, the Ising model and the statistical dynamics of free spins.
Statistical symmetries in physics
Green, H.S.; Adelaide Univ., SA
1994-01-01
Every law of physics is invariant under some group of transformations and is therefore the expression of some type of symmetry. Symmetries are classified as geometrical, dynamical or statistical. At the most fundamental level, statistical symmetries are expressed in the field theories of the elementary particles. This paper traces some of the developments from the discovery of Bose statistics, one of the two fundamental symmetries of physics. A series of generalizations of Bose statistics is described. A supersymmetric generalization accommodates fermions as well as bosons, and further generalizations, including parastatistics, modular statistics and graded statistics, accommodate particles with properties such as 'colour'. A factorization of elements of ggl(n_b, n_f) can be used to define truncated boson operators. A general construction is given for q-deformed boson operators, and explicit constructions of the same type are given for various 'deformed' algebras. A summary is given of some of the applications and potential applications. 39 refs., 2 figs.
Methods of statistical physics
Akhiezer, Aleksandr I
1981-01-01
Methods of Statistical Physics is an exposition of the tools of statistical mechanics, which evaluates the kinetic equations of classical and quantized systems. The book also analyzes the equations of macroscopic physics, such as the equations of hydrodynamics for normal and superfluid liquids and macroscopic electrodynamics. The text gives particular attention to the study of quantum systems. This study begins with a discussion of problems of quantum statistics with a detailed description of the basics of quantum mechanics along with the theory of measurement. An analysis of the asymptotic behavior…
Müller-Kirsten, Harald J W
2013-01-01
Statistics links microscopic and macroscopic phenomena, and for this reason requires a large number of microscopic elements, like atoms. The results are values of maximum probability or of averaging. This introduction to statistical physics concentrates on the basic principles, and attempts to explain these in simple terms supplemented by numerous examples. These basic principles include the difference between classical and quantum statistics, a priori probabilities as related to degeneracies, the vital aspect of indistinguishability as compared with distinguishability in classical physics, the differences between conserved and non-conserved elements, the different ways of counting arrangements in the three statistics (Maxwell-Boltzmann, Fermi-Dirac, Bose-Einstein), and the difference between maximization of the number of arrangements of elements and averaging in the Darwin-Fowler method. Significant applications to solids, radiation and electrons in metals are treated in separate chapters, as well as Bose-Eins…
Elementary statistical physics
Kittel, C
1965-01-01
This book is intended to help physics students attain a modest working knowledge of several areas of statistical mechanics, including stochastic processes and transport theory. The areas discussed are among those forming a useful part of the intellectual background of a physicist.
Statistical physics of vaccination
Wang, Zhen; Bauch, Chris T.; Bhattacharyya, Samit; d'Onofrio, Alberto; Manfredi, Piero; Perc, Matjaž; Perra, Nicola; Salathé, Marcel; Zhao, Dawei
2016-12-01
Historically, infectious diseases have caused considerable damage to human societies, and they continue to do so today. To help reduce their impact, mathematical models of disease transmission have been studied to help understand disease dynamics and inform prevention strategies. Vaccination, one of the most important preventive measures of modern times, is of great interest both theoretically and empirically. In contrast to traditional approaches, recent research increasingly explores the pivotal implications of individual behavior and heterogeneous contact patterns in populations. Our report reviews the developmental arc of theoretical epidemiology with emphasis on vaccination, as it led from classical models assuming homogeneously mixing (mean-field) populations and ignoring human behavior, to recent models that account for behavioral feedback and/or population spatial/social structure. Many of the methods used originated in statistical physics, such as lattice and network models, and their associated analytical frameworks. Similarly, the feedback loop between vaccinating behavior and disease propagation forms a coupled nonlinear system with analogs in physics. We also review the new paradigm of digital epidemiology, wherein sources of digital data such as online social media are mined for high-resolution information on epidemiologically relevant individual behavior. Armed with the tools and concepts of statistical physics, and further assisted by new sources of digital data, models that capture nonlinear interactions between behavior and disease dynamics offer a novel way of modeling real-world phenomena, and can help improve health outcomes. We conclude the review by discussing open problems in the field and promising directions for future research.
Nonequilibrium statistical physics
Röpke, Gerd
2013-01-01
Authored by one of the top theoretical physicists in Germany, and a well-known authority in the field, this is the only coherent presentation of the subject suitable for masters and PhD students, as well as postdocs in physics and related disciplines. Starting from a general discussion of the nonequilibrium state, different standard approaches, such as master equations, and kinetic and linear response theory, are derived after special assumptions. This allows for an insight into the problems of nonequilibrium physics, a discussion of the limits, and suggestions for improvements. Applications…
Quantum physics and statistical physics. 5. ed.
Alonso, Marcelo; Finn, Edward J.
2012-01-01
By its logical and uniform presentation, this recognized introduction to modern physics treats both the experimental and theoretical aspects. The first part of the book deals with quantum mechanics and its application to atoms, molecules, nuclei, solids, and elementary particles. The second part is devoted to statistical physics, covering classical statistics, thermodynamics, and quantum statistics. Alonso and Finn avoid complicated mathematical developments; through numerous sketches and diagrams, as well as many problems and examples, they familiarize the reader early on, and above all in an easily understandable way, with the concepts of modern physics.
Statistical methods in radiation physics
Turner, James E; Bogard, James S
2012-01-01
This statistics textbook, with particular emphasis on radiation protection and dosimetry, deals with statistical solutions to problems inherent in health physics measurements and decision making. The authors begin with a description of our current understanding of the statistical nature of physical processes at the atomic level, including radioactive decay and interactions of radiation with matter. Examples are taken from problems encountered in health physics, and the material is presented such that health physicists and most other nuclear professionals will more readily understand the application of statistical principles in the familiar context of the examples. Problems are presented at the end of each chapter, with solutions to selected problems provided online. In addition, numerous worked examples are included throughout the text.
Statistical methods for physical science
Stanford, John L
1994-01-01
This volume of Methods of Experimental Physics provides an extensive introduction to probability and statistics in many areas of the physical sciences, with an emphasis on the emerging area of spatial statistics. The scope of topics covered is wide-ranging: the text discusses a variety of the most commonly used classical methods and addresses newer methods that are applicable or potentially important. The chapter authors motivate readers with their insightful discussions, augmenting their material with key features: examines basic probability, including coverage of standard distributions, time s…
Introduction to mathematical statistical physics
Minlos, R A
1999-01-01
This book presents a mathematically rigorous approach to the main ideas and phenomena of statistical physics. The introduction addresses the physical motivation, focussing on the basic concept of modern statistical physics, that is the notion of Gibbsian random fields. Properties of Gibbsian fields are analyzed in two ranges of physical parameters: "regular" (corresponding to high-temperature and low-density regimes) where no phase transition is exhibited, and "singular" (low temperature regimes) where such transitions occur. Next, a detailed approach to the analysis of the phenomena of phase transitions of the first kind, the Pirogov-Sinai theory, is presented. The author discusses this theory in a general way and illustrates it with the example of a lattice gas with three types of particles. The conclusion gives a brief review of recent developments arising from this theory. The volume is written for the beginner, yet advanced students will benefit from it as well. The book will serve nicely as a supplement...
Statistical methods in physical mapping
Nelson, D.O.
1995-05-01
One of the great success stories of modern molecular genetics has been the ability of biologists to isolate and characterize the genes responsible for serious inherited diseases like fragile X syndrome, cystic fibrosis and myotonic muscular dystrophy. This dissertation concentrates on constructing high-resolution physical maps. It demonstrates how probabilistic modeling and statistical analysis can aid molecular geneticists in the tasks of planning, execution, and evaluation of physical maps of chromosomes and large chromosomal regions. The dissertation is divided into six chapters. Chapter 1 provides an introduction to the field of physical mapping, describing the role of physical mapping in gene isolation and in past efforts at mapping chromosomal regions. The next two chapters review and extend known results on predicting progress in large mapping projects. Such predictions help project planners decide between various approaches and tactics for mapping large regions of the human genome. Chapter 2 shows how probability models have been used in the past to predict progress in mapping projects. Chapter 3 presents new results, based on stationary point process theory, for progress measures for mapping projects based on directed mapping strategies. Chapter 4 describes in detail the construction of an initial high-resolution physical map for human chromosome 19. This chapter introduces the probability and statistical models involved in map construction in the context of a large, ongoing physical mapping project. Chapter 5 concentrates on one such model, the trinomial model. This chapter contains new results on the large-sample behavior of this model, including distributional results, asymptotic moments, and detection error rates. In addition, it contains an optimality result concerning experimental procedures based on the trinomial model. The last chapter explores unsolved problems and describes future work.
Report on the Taipei CHEP 2010 Conference
CERN. Geneva
2010-01-01
This year, the 18th Computing in High Energy and Nuclear Physics (CHEP) conference was held in Taipei, Taiwan, from 18-22 October 2010. CHEP conferences provide an international forum to exchange information on computing experience and needs for the High Energy Physics and Nuclear Physics communities, and to review recent, ongoing and future activities. The speakers at this Computing Seminar will present their views of the CHEP 2010 conference.
Statistics for High Energy Physics
CERN. Geneva
2018-01-01
The lectures emphasize the frequentist approach used for the Dark Matter search and for the Higgs search, discovery and measurements of its properties. An emphasis is put on hypothesis tests using the asymptotic formulae formalism and its derivation, and on the derivation of the trial factor formulae in one and two dimensions. Various test statistics and their applications are discussed. Some keywords: Profile Likelihood, Neyman-Pearson, Feldman-Cousins, Coverage, CLs, Nuisance Parameters Impact, Look Elsewhere Effect... Selected bibliography: G. J. Feldman and R. D. Cousins, "A unified approach to the classical statistical analysis of small signals," Phys. Rev. D 57, 3873 (1998). A. L. Read, "Presentation of search results: the CL(s) technique," J. Phys. G 28, 2693 (2002). G. Cowan, K. Cranmer, E. Gross and O. Vitells, "Asymptotic formulae for likelihood-based tests of new physics," Eur. Phys. J. C 71, 1554 (2011); Erratum: Eur. Phys. J. C 73…
Statistical physics and condensed matter
NONE
2003-07-01
This document is divided into 4 sections: 1) General aspects of statistical physics. The themes include: possible geometrical structures of thermodynamics, the thermodynamical foundation of quantum measurement, transport phenomena (kinetic theory, hydrodynamics and turbulence) and out-of-equilibrium systems (stochastic dynamics and turbulence). The techniques involved here are typical of applied analysis: stability criteria, mode decomposition, shocks and stochastic equations. 2) Disordered, glassy and granular systems: statics and dynamics. The complexity of the systems can be studied through the structure of their phase space. The geometry of this phase space is studied in several works: the overlap distribution can now be computed with a very high precision; the boundary energy between low-lying states does not behave like in ordinary systems; and the Edwards' hypothesis of equi-probability of low-lying metastable states is invalidated. The phenomenon of aging, characteristic of glassy dynamics, is studied in several models. Dynamics of biological systems or of fracture is shown to bear some resemblance with that of disordered systems. 3) Quantum systems. The themes include: mesoscopic superconductors, supersymmetric approach to strongly correlated electrons, quantum criticality and heavy fermion compounds, optical sum rule violation in the cuprates, heat capacity of lattice spin models from high-temperature series expansion, the Lieb-Schultz-Mattis theorem in dimension larger than one, the quantum Hall effect, Bose-Einstein condensation and the multiple-spin exchange model on the triangular lattice. 4) Soft condensed matter and biological systems. Path integral representations are invaluable to describe polymers, proteins and self-avoiding membranes. Using these methods, problems as diverse as the titration of a weak poly-acid by a strong base, the denaturation transition of DNA or bridge-hopping in conducting polymers have been addressed. The problems of RNA folding…
Statistical Physics of Colloidal Dispersions.
Canessa, E.
Available from UMI in association with The British Library. Requires signed TDF. This thesis is concerned with the equilibrium statistical mechanics of colloidal dispersions which represent useful model systems for the study of condensed matter physics; namely, charge stabilized colloidal dispersions and polymer stabilized colloidal dispersions. A one-component macroparticle approach is adopted in order to treat the macroscopic and microscopic properties of these systems in a simple and comprehensive manner. The thesis opens with the description of the nature of the colloidal state before reviewing some basic definitions and theory in Chapter II. In Chapter III a variational theory of phase equilibria based on the Gibbs-Bogoliubov inequality is applied to sterically stabilized colloidal dispersions. Hard spheres are chosen as the reference system for the disordered phases while an Einstein model is used for the ordered phases. The new choice of pair potential, taken for mathematical convenience, is a superposition of two Yukawa functions. By matching a double Yukawa potential to the van der Waals attractive potential at different temperatures and introducing a purely temperature dependent coefficient to the repulsive part, a rich variety of observed phase separation phenomena is qualitatively described. The behaviour of the potential is found to be consistent with a small decrease of the polymer layer thickness with increasing temperature. Using the same concept of a collapse transition, the non-monotonic second virial coefficient is also explained and quantified. It is shown that a reduction of the effective macroparticle diameter with increasing temperature can only be partially examined from the point of view of a (binary-) polymer solution theory. This chapter concludes with the description of the observed, reversible, depletion flocculation behaviour. This is accomplished by using the variational formalism and by invoking the double Yukawa potential to allow…
Statistical and thermal physics with computer applications
Gould, Harvey
2010-01-01
This textbook carefully develops the main ideas and techniques of statistical and thermal physics and is intended for upper-level undergraduate courses. The authors each have more than thirty years' experience in teaching, curriculum development, and research in statistical and computational physics. Statistical and Thermal Physics begins with a qualitative discussion of the relation between the macroscopic and microscopic worlds and incorporates computer simulations throughout the book to provide concrete examples of important conceptual ideas. Unlike many contemporary texts on the…
Vol. 3: Statistical Physics and Phase Transitions
Sitenko, A.
1993-01-01
Problems of modern physics and the situation of physical research in Ukraine are considered. The programme of the conference includes scientific and general problems. Its proceedings are published in 6 volumes. The papers presented in this volume refer to statistical physics and phase transition theory.
A modern course in statistical physics
Reichl, Linda E
2016-01-01
"A Modern Course in Statistical Physics" is a textbook that illustrates the foundations of equilibrium and non-equilibrium statistical physics, and the universal nature of thermodynamic processes, from the point of view of contemporary research problems. The book treats such diverse topics as the microscopic theory of critical phenomena, superfluid dynamics, quantum conductance, light scattering, transport processes, and dissipative structures, all in the framework of the foundations of statistical physics and thermodynamics. It shows the quantum origins of problems in classical statistical physics. One focus of the book is fluctuations that occur due to the discrete nature of matter, a topic of growing importance for nanometer scale physics and biophysics. Another focus concerns classical and quantum phase transitions, in both monatomic and mixed particle systems. This fourth edition extends the range of topics considered to include, for example, entropic forces, electrochemical processes in biological syste...
Reconstructing Macroeconomics Based on Statistical Physics
Aoki, Masanao; Yoshikawa, Hiroshi
We believe that the time has come to integrate the new approach based on statistical physics, or econophysics, into macroeconomics. Toward this goal, there must be more dialogue between physicists and economists. In this paper, we argue that there is no reason why the methods of statistical physics, so successful in many fields of the natural sciences, cannot be usefully applied to macroeconomics, which is meant to analyze the macroeconomy comprising a large number of economic agents. It is, in fact, odd to regard the macroeconomy as a homothetic enlargement of the representative micro agent. We trust in the bright future of the new approach to macroeconomics based on statistical physics.
Understanding search trees via statistical physics
…the m-ary search tree model (where m stands for the number of branches of the search tree), an important problem for data storage in computer science, using a variety of statistical physics techniques that allow us to obtain exact asymptotic results.
Statistical physics including applications to condensed matter
Hermann, Claudine
2005-01-01
Statistical Physics bridges the properties of a macroscopic system and the microscopic behavior of its constituent particles, a connection otherwise impossible to establish owing to the enormous magnitude of Avogadro's number. Numerous systems of today's key technologies, such as semiconductors or lasers, are macroscopic quantum objects; only statistical physics allows for an understanding of their fundamentals. Therefore, this graduate text also focuses on particular applications, such as the properties of electrons in solids, and on radiation thermodynamics and the greenhouse effect.
Science Academies' Refresher Course in Statistical Physics
The Course is aimed at college teachers of statistical physics at BSc/MSc level. ... teachers, with at least a master's degree in Physics/Mathematics/Engineering, are ... Topics: There will be six courses dealing with Basic principles and general ...
Concept of probability in statistical physics
Guttmann, Y M
1999-01-01
Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.
Statistics for Physical Sciences An Introduction
Martin, Brian
2012-01-01
Statistical Methods for the Physical Sciences is an informal, relatively short, but systematic guide to the more commonly used ideas and techniques in statistical analysis as used in the physical sciences, together with explanations of their origins. It steers a path between the extremes of a mere recipe book of methods with a collection of useful formulae on the one hand, and a full mathematical account of statistics on the other, while at the same time developing the subject in a logical way. The book can be read in its entirety by anyone with a basic exposure to mathematics at the level of a first-year undergraduate student.
Statistical and physical evolution of QSO's
Caditz, D.; Petrosian, V.
1989-09-01
The relationship between the physical evolution of discrete extragalactic sources, the statistical evolution of the observed population of sources, and the cosmological model is discussed. Three simple forms of statistical evolution: pure luminosity evolution (PLE), pure density evolution (PDE), and generalized luminosity evolution (GLE), are considered in detail together with what these forms imply about the physical evolution of individual sources. Two methods are used to analyze the statistical evolution of the observed distribution of QSO's (quasars) from combined flux limited samples. It is shown that both PLE and PDE are inconsistent with the data over the redshift range 0 < z < 2.2, and that a more complicated form of evolution such as GLE is required, independent of the cosmological model. This result is important for physical models of AGN, and in particular, for the accretion disk model which recent results show may be inconsistent with PLE.
Probability and statistics in particle physics
Frodesen, A.G.; Skjeggestad, O.
1979-01-01
Probability theory is entered into at an elementary level and given a simple and detailed exposition. The material on statistics has been organised with an eye to the experimental physicist's practical need, which is likely to be statistical methods for estimation or decision-making. The book is intended for graduate students and research workers in experimental high energy and elementary particle physics, and numerous examples from these fields are presented. (JIW)
Science Academies' Refresher Course in Statistical Physics
The Course is aimed at college teachers of statistical physics at BSc/MSc level. It will cover basic principles and techniques, in a pedagogical manner, through lectures and tutorials, with illustrative problems. Some advanced topics, and common difficulties faced by students will also be discussed. College/University ...
Statistical physics of pairwise probability models
Roudi, Yasser; Aurell, Erik; Hertz, John
2009-01-01
(no Danish abstract available) Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying...
Topics in statistical and theoretical physics
Dobrushin, R L; Shubin, M A
1996-01-01
This is the second of two volumes dedicated to the scientific heritage of F. A. Berezin (1931-1980). Before his untimely death, Berezin had an important influence on physics and mathematics, discovering new ideas in mathematical physics, representation theory, analysis, geometry, and other areas of mathematics. His crowning achievements were the introduction of a new notion of deformation quantization and Grassmannian analysis ("supermathematics"). Collected here are papers by many of his colleagues and others who worked in related areas, representing a wide spectrum of topics in statistical a
Conference: Statistical Physics and Biological Information
Gross, David J.; Hwa, Terence
2001-01-01
In the spring of 2001, the Institute for Theoretical Physics ran a 6 month scientific program on Statistical Physics and Biological Information. This program was organized by Walter Fitch (UC Irvine), Terence Hwa (UC San Diego), Luca Peliti (University Federico II, Naples), Gary Stormo (Washington University School of Medicine) and Chao Tang (NEC). Overall scientific supervision was provided by David Gross, Director, ITP. The ITP has an online conference/program proceeding which consists of audio and transparencies of almost all of the talks held during this program. Over 100 talks are available on the site at http://online.kitp.ucsb.edu/online/infobio01/
Statistical and thermal physics an introduction
Hoch, Michael JR
2011-01-01
"When I started reading Michael J.R. Hoch's book Statistical and Thermal Physics: An Introduction I thought to myself that this is another book the same as a large group of others with similar content. … But during my reading this unjustified belief changed. … The main reason for this change was the way of information presentation: … the way of presentation is designed so that the reader receives only the information that is necessary to give the essence of the problem. … this book will provide an introduction to the subject, especially for those who are interested in basic or applied physics."
Statistical Issues in Searches for New Physics
CERN. Geneva
2015-01-01
Given the cost, both financial and, even more importantly, in terms of human effort, of building High Energy Physics accelerators and detectors and running them, it is important to use good statistical techniques in analysing data. This talk covers some of the statistical issues that arise in searches for New Physics, including: Should we insist on the 5 sigma criterion for discovery claims? What are the relative merits of a Raster Scan or a "2-D" approach? Why is P(A|B) not the same as P(B|A)? What do p-values mean? An example of a problematic likelihood; what is Wilks' Theorem and when does it not apply? How should we deal with the "Look Elsewhere Effect"? Dealing with systematics such as background parametrisation; coverage: what is it and does my method have the correct coverage? And the use of p0 vs. p1 plots.
Statistical physics of medical ultrasonic images
Wagner, R.F.; Insana, M.F.; Brown, D.G.; Smith, S.W.
1987-01-01
The physical and statistical properties of backscattered signals in medical ultrasonic imaging are reviewed in terms of: 1) the radiofrequency signal; 2) the envelope (video or magnitude) signal; and 3) the density of samples in simple and in compounded images. There is a wealth of physical information in backscattered signals in medical ultrasound. This information is contained in the radiofrequency spectrum - which is not typically displayed to the viewer - as well as in the higher statistical moments of the envelope or video signal - which are not readily accessed by the human viewer of typical B-scans. This information may be extracted from the detected backscattered signals by straightforward signal processing techniques at low resolution.
Fluctuations of physical values in statistical mechanics
Zaripov, R.G.
1999-01-01
The new matrix inequalities for the boundary of measurement accuracy of physical values in the ensemble of quantum systems were obtained. The multidimensional thermodynamical parameter measurement is estimated. The matrix inequalities obtained are quantum analogs of the Cramer-Rao information inequalities in mathematical statistics. The quantity of information in quantum mechanical measurement, connected with the boundaries of jointly measurable values in one macroscopic experiment was determined. The lower boundary of the variance of estimation of multidimensional quantum mechanical parameter was found. (author)
Nonequilibrium statistical physics a modern perspective
Livi, Roberto
2017-01-01
Statistical mechanics has been proven to be successful at describing physical systems at thermodynamic equilibrium. Since most natural phenomena occur in nonequilibrium conditions, the present challenge is to find suitable physical approaches for such conditions: this book provides a pedagogical pathway that explores various perspectives. The use of clear language, and explanatory figures and diagrams to describe models, simulations and experimental findings makes the book a valuable resource for undergraduate and graduate students, and also for lecturers organizing teaching at varying levels of experience in the field. Written in three parts, it covers basic and traditional concepts of nonequilibrium physics, modern aspects concerning nonequilibrium phase transitions, and application-orientated topics from a modern perspective. A broad range of topics is covered, including Langevin equations, Levy processes, directed percolation, kinetic roughening and pattern formation.
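Langevin equations, the first of the topics listed above, lend themselves to a compact numerical illustration. The sketch below is not from the book; all parameter values are illustrative. It integrates an overdamped Langevin equation (an Ornstein-Uhlenbeck process) with the Euler-Maruyama scheme and checks the stationary variance against the fluctuation-dissipation value D/gamma.

```python
import math
import random

# Euler-Maruyama integration of the overdamped Langevin equation
#   dx = -gamma * x dt + sqrt(2 D) dW
# i.e. an Ornstein-Uhlenbeck process. In the stationary state the
# fluctuation-dissipation relation gives variance D/gamma.

def simulate_ou(gamma=1.0, D=0.5, dt=1e-3, steps=200_000, x0=0.0, seed=1):
    rng = random.Random(seed)
    x = x0
    xs = []
    for _ in range(steps):
        noise = rng.gauss(0.0, 1.0) * math.sqrt(2.0 * D * dt)
        x += -gamma * x * dt + noise   # drift toward 0 plus thermal kicks
        xs.append(x)
    return xs

xs = simulate_ou()
# Discard the first half as transient, then estimate the stationary variance.
tail = xs[len(xs) // 2:]
mean = sum(tail) / len(tail)
var = sum((v - mean) ** 2 for v in tail) / len(tail)
print(f"empirical variance ~ {var:.3f} (fluctuation-dissipation: D/gamma = 0.5)")
```

With finer time steps and longer runs the estimate tightens around D/gamma, which is the simplest check that a Langevin integrator respects detailed balance.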
Statistical physics of an anyon gas
Dasnieres de Veigy, A.
1994-01-01
In two-dimensional quantum physics, anyons are particles whose statistics are intermediate between Bose-Einstein and Fermi-Dirac statistics. The wave amplitude can change by an arbitrary phase under particle exchanges. Contrary to bosons or fermions, the permutation group cannot uniquely characterize this phase and one must introduce the braid group. It is shown that the statistical ''interaction'' is equivalent to an Aharonov-Bohm interaction which derives from a Chern-Simons Lagrangian. The main subject of this thesis is the thermodynamics of an anyon gas. Since the complete spectrum of N anyons seems out of reach, we have performed a perturbative computation of the equation of state at second order near Bose or Fermi statistics. Ultraviolet divergences are avoided by noticing that the short-range singularities of the statistical interaction force the wave functions to vanish when two particles approach each other (statistical exclusion). The gas is confined in a harmonic well in order to obtain the thermodynamic limit as the harmonic attraction goes to zero. Infrared divergences thus cancel in this limit and a finite virial expansion is obtained. The complexity of the anyon model appears in this result. We have also computed the equation of state of an anyon gas in a magnetic field strong enough to project the system onto its degenerate ground state. This result concerns anyons with any statistics. One then finds an exclusion principle generalizing the Pauli principle to anyons. On the other hand, we have defined a model of two-dimensional particles topologically interacting at a distance. The anyon model is recovered as a particular case in which all particles are identical. (orig.)
PREFACE: Statistical Physics of Complex Fluids
Golestanian, R.; Khajehpour, M. R. H.; Kolahchi, M. R.; Rouhani, S.
2005-04-01
The field of complex fluids is a rapidly developing, highly interdisciplinary field that brings together people from a plethora of backgrounds such as mechanical engineering, chemical engineering, materials science, applied mathematics, physics, chemistry and biology. In this melting pot of science, the traditional boundaries of various scientific disciplines have been set aside. It is this very property of the field that has guaranteed its richness and prosperity since the final decade of the 20th century and into the 21st. The C3 Commission of the International Union of Pure and Applied Physics (IUPAP), which is the commission for statistical physics that organizes the international STATPHYS conferences, encourages various, more focused, satellite meetings to complement the main event. For the STATPHYS22 conference in Bangalore (July 2004), Iran was recognized by the STATPHYS22 organizers as suitable to host such a satellite meeting and the Institute for Advanced Studies in Basic Sciences (IASBS) was chosen to be the site of this meeting. It was decided to organize a meeting in the field of complex fluids, which is a fairly developed field in Iran. This international meeting, and an accompanying summer school, were intended to boost international connections for both the research groups working in Iran, and several other groups working in the Middle East, South Asia and North Africa. The meeting, entitled `Statistical Physics of Complex Fluids' was held at the Institute for Advanced Studies in Basic Sciences (IASBS) in Zanjan, Iran, from 27 June to 1 July 2004. The main topics discussed at the meeting included: biological statistical physics, wetting and microfluidics, transport in complex media, soft and granular matter, and rheology of complex fluids. At this meeting, 22 invited lectures by eminent scientists were attended by 107 participants from different countries. The poster session consisted of 45 presentations which, in addition to the main topics of the
Statistical physics of hard optimization problems
Zdeborova, L.
2009-01-01
Optimization is fundamental in many areas of science, from computer science and information theory to engineering and statistical physics, as well as to biology or social sciences. It typically involves a large number of variables and a cost function depending on these variables. Optimization problems in the non-deterministic polynomial (NP)-complete class are particularly difficult; it is believed that the number of operations required to minimize the cost function is, in the most difficult cases, exponential in the system size. However, even in an NP-complete problem the practically arising instances might, in fact, be easy to solve. The principal question we address in this article is: how to recognize if an NP-complete constraint satisfaction problem is typically hard, and what are the main reasons for this? We adopt approaches from the statistical physics of disordered systems, in particular the cavity method developed originally to describe glassy systems. We describe new properties of the space of solutions in two of the most studied constraint satisfaction problems - random satisfiability and random graph coloring. We suggest a relation between the existence of the so-called frozen variables and the algorithmic hardness of a problem. Based on these insights, we introduce a new class of problems which we named "locked" constraint satisfaction, where the statistical description is easily solvable, but from the algorithmic point of view they are even more challenging than the canonical satisfiability.
Statistical physics of hard optimization problems
Zdeborova, L.
2009-01-01
Optimization is fundamental in many areas of science, from computer science and information theory to engineering and statistical physics, as well as to biology or social sciences. It typically involves a large number of variables and a cost function depending on these variables. Optimization problems in the non-deterministic polynomial-complete class are particularly difficult; it is believed that the number of operations required to minimize the cost function is, in the most difficult cases, exponential in the system size. However, even in a non-deterministic polynomial-complete problem the practically arising instances might, in fact, be easy to solve. The principal question we address in this article is: how to recognize if a non-deterministic polynomial-complete constraint satisfaction problem is typically hard, and what are the main reasons for this? We adopt approaches from the statistical physics of disordered systems, in particular the cavity method developed originally to describe glassy systems. We describe new properties of the space of solutions in two of the most studied constraint satisfaction problems - random satisfiability and random graph coloring. We suggest a relation between the existence of the so-called frozen variables and the algorithmic hardness of a problem. Based on these insights, we introduce a new class of problems which we named 'locked' constraint satisfaction, where the statistical description is easily solvable, but from the algorithmic point of view they are even more challenging than the canonical satisfiability. (Authors)
Statistical physics of hard optimization problems
Zdeborová, Lenka
2009-06-01
Optimization is fundamental in many areas of science, from computer science and information theory to engineering and statistical physics, as well as to biology or social sciences. It typically involves a large number of variables and a cost function depending on these variables. Optimization problems in the non-deterministic polynomial (NP)-complete class are particularly difficult; it is believed that the number of operations required to minimize the cost function is, in the most difficult cases, exponential in the system size. However, even in an NP-complete problem the practically arising instances might, in fact, be easy to solve. The principal question we address in this article is: how to recognize if an NP-complete constraint satisfaction problem is typically hard, and what are the main reasons for this? We adopt approaches from the statistical physics of disordered systems, in particular the cavity method developed originally to describe glassy systems. We describe new properties of the space of solutions in two of the most studied constraint satisfaction problems - random satisfiability and random graph coloring. We suggest a relation between the existence of the so-called frozen variables and the algorithmic hardness of a problem. Based on these insights, we introduce a new class of problems which we named "locked" constraint satisfaction, where the statistical description is easily solvable, but from the algorithmic point of view they are even more challenging than the canonical satisfiability.
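The satisfiability transition that motivates this line of work can be probed directly for tiny instances. The sketch below is not from the thesis; the problem size and clause densities are illustrative. It generates random 3-SAT formulas and brute-forces them, showing the drop in the probability of satisfiability as the clause-to-variable ratio alpha crosses the hard region (the sharp threshold near alpha ~ 4.27 emerges only for much larger systems).

```python
import itertools
import random

# Brute-force probe of the random 3-SAT satisfiability transition.
# n_vars is kept tiny so exhaustive enumeration is feasible.

def random_3sat(n_vars, n_clauses, rng):
    """A clause is a list of (variable index, required truth value)."""
    return [[(v, rng.choice([True, False]))
             for v in rng.sample(range(n_vars), 3)]
            for _ in range(n_clauses)]

def satisfiable(n_vars, clauses):
    """Exhaustively check all 2**n_vars assignments."""
    for assign in itertools.product([False, True], repeat=n_vars):
        if all(any(assign[v] == want for v, want in clause)
               for clause in clauses):
            return True
    return False

def sat_fraction(n_vars, alpha, trials=30, seed=0):
    """Fraction of random formulas at clause density alpha that are SAT."""
    rng = random.Random(seed)
    n_clauses = int(alpha * n_vars)
    hits = sum(satisfiable(n_vars, random_3sat(n_vars, n_clauses, rng))
               for _ in range(trials))
    return hits / trials

low = sat_fraction(10, 2.0)   # well below the threshold: almost always SAT
high = sat_fraction(10, 7.0)  # well above it: almost always UNSAT
print(f"P(SAT) at alpha=2.0: {low:.2f}; at alpha=7.0: {high:.2f}")
```

The cavity method discussed in the abstract replaces this exhaustive enumeration with a statistical description of the solution space, which is what makes the large-n regime tractable.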
Networking—a statistical physics perspective
Yeung, Chi Ho; Saad, David
2013-01-01
Networking encompasses a variety of tasks related to the communication of information on networks; it has a substantial economic and societal impact on a broad range of areas including transportation systems, wired and wireless communications and a range of Internet applications. As transportation and communication networks become increasingly more complex, the ever increasing demand for congestion control, higher traffic capacity, quality of service, robustness and reduced energy consumption requires new tools and methods to meet these conflicting requirements. The new methodology should serve for gaining better understanding of the properties of networking systems at the macroscopic level, as well as for the development of new principled optimization and management algorithms at the microscopic level. Methods of statistical physics seem best placed to provide new approaches as they have been developed specifically to deal with nonlinear large-scale systems. This review aims at presenting an overview of tools and methods that have been developed within the statistical physics community and that can be readily applied to address the emerging problems in networking. These include diffusion processes, methods from disordered systems and polymer physics, probabilistic inference, which have direct relevance to network routing, file and frequency distribution, the exploration of network structures and vulnerability, and various other practical networking applications. (topical review)
Statistical mechanics and the physics of fluids
Tosi, Mario
This volume collects the lecture notes of a course on statistical mechanics, held at Scuola Normale Superiore di Pisa for third-to-fifth year students in physics and chemistry. Three main themes are covered in the book. The first part gives a compact presentation of the foundations of statistical mechanics and their connections with thermodynamics. Applications to ideal gases of material particles and of excitation quanta are followed by a brief introduction to a real classical gas and to a weakly coupled classical plasma, and by a broad overview on the three states of matter.The second part is devoted to fluctuations around equilibrium and their correlations. Coverage of liquid structure and critical phenomena is followed by a discussion of irreversible processes as exemplified by diffusive motions and by the dynamics of density and heat fluctuations. Finally, the third part is an introduction to some advanced themes: supercooling and the glassy state, non-Newtonian fluids including polymers and liquid cryst...
Quantum theoretical physics is statistical and relativistic
Harding, C.
1980-01-01
A new theoretical framework for quantum mechanics is presented. It is based on strictly deterministic behavior of single systems. The conventional QM equation, however, is found to describe the statistical results of many classical systems. It will be seen, moreover, that a rigorous synthesis of our theory requires relativistic kinematics. So QM is not only a classical statistical theory; it is, of necessity, a relativistic theory. The equation of the theory does not just duplicate QM: it indicates an inherent nonlinearity in QM which is subject to experimental verification. It is shown, therefore, that conventional QM is a corollary of classical deterministic principles. It is suggested that this concept of nature conflicts with that prevalent in modern physics. (author)
Statistical physics of interacting neural networks
Kinzel, Wolfgang; Metzler, Richard; Kanter, Ido
2001-12-01
Recent results on the statistical physics of time series generation and prediction are presented. A neural network is trained on quasi-periodic and chaotic sequences and overlaps to the sequence generator as well as the prediction errors are calculated numerically. For each network there exists a sequence for which it completely fails to make predictions. Two interacting networks show a transition to perfect synchronization. A pool of interacting networks shows good coordination in the minority game-a model of competition in a closed market. Finally, as a demonstration, a perceptron predicts bit sequences produced by human beings.
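The minority game mentioned in the abstract admits a short self-contained simulation. The sketch below is a generic minimal version, not the authors' implementation; the agent count, strategy count, memory length, and number of rounds are illustrative choices.

```python
import random

# Minimal minority game: N agents each hold S random strategies mapping
# the recent history of winning sides to an action in {0, 1}. Each round
# every agent plays its best-scoring strategy; the side chosen by the
# minority wins, and all strategies that would have picked the winning
# side gain a point. N is odd so there is never a tie.

def play_minority_game(n_agents=101, n_strategies=2, memory=3,
                       rounds=2000, seed=0):
    rng = random.Random(seed)
    n_hist = 2 ** memory
    strategies = [[[rng.randrange(2) for _ in range(n_hist)]
                   for _ in range(n_strategies)] for _ in range(n_agents)]
    scores = [[0] * n_strategies for _ in range(n_agents)]
    history = 0          # last `memory` winning sides packed into bits
    attendance = []      # number of agents choosing side 1 each round
    for _ in range(rounds):
        choices = []
        for a in range(n_agents):
            best = max(range(n_strategies), key=lambda s: scores[a][s])
            choices.append(strategies[a][best][history])
        n_ones = sum(choices)
        winner = 0 if n_ones > n_agents / 2 else 1   # minority side wins
        for a in range(n_agents):
            for s in range(n_strategies):
                if strategies[a][s][history] == winner:
                    scores[a][s] += 1
        attendance.append(n_ones)
        history = ((history << 1) | winner) % n_hist
    return attendance

att = play_minority_game()
mean = sum(att) / len(att)
print(f"mean attendance {mean:.1f} of 101 agents (coordination keeps it near half)")
```

The interesting statistical-physics observable is the variance of the attendance as a function of memory length, which shows the phase transition between the crowded and dilute regimes studied in this literature.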
Statistical physics of crime: a review.
D'Orsogna, Maria R; Perc, Matjaž
2015-03-01
Containing the spread of crime in urban societies remains a major challenge. Empirical evidence suggests that, if left unchecked, crimes may be recurrent and proliferate. On the other hand, eradicating a culture of crime may be difficult, especially under extreme social circumstances that impair the creation of a shared sense of social responsibility. Although our understanding of the mechanisms that drive the emergence and diffusion of crime is still incomplete, recent research highlights applied mathematics and methods of statistical physics as valuable theoretical resources that may help us better understand criminal activity. We review different approaches aimed at modeling and improving our understanding of crime, focusing on the nucleation of crime hotspots using partial differential equations, self-exciting point processes and agent-based modeling, adversarial evolutionary games, and the network science behind the formation of gangs and large-scale organized crime. We emphasize that the statistical physics of crime can relevantly inform the design of successful crime prevention strategies, as well as improve the accuracy of expectations about how different policing interventions should impact malicious human activity that deviates from social norms. We also outline possible directions for future research, related to the effects of social and coevolving networks and to the hierarchical growth of criminal structures due to self-organization.
Statistical Physics Approaches to RNA Editing
Bundschuh, Ralf
2012-02-01
The central dogma of molecular biology states that DNA is transcribed base by base into RNA, which is in turn translated into proteins. However, some organisms edit their RNA before translation by inserting, deleting, or substituting individual bases or short stretches of bases. In many instances the mechanisms by which an organism recognizes the positions at which to edit, or by which it performs the actual editing, are unknown. One model system that stands out for its very high editing rate, on average one out of every 25 bases, is the Myxomycetes, a class of slime molds. In this talk we will show how computational methods and concepts from statistical physics can be used to analyze DNA and protein sequence data to predict editing sites in these slime molds and to guide experiments that identified previously unknown types of editing as well as the complete set of editing events in the slime mold Physarum polycephalum.
Statistical Physics Approaches to Microbial Ecology
Mehta, Pankaj
The unprecedented ability to quantitatively measure and probe complex microbial communities has renewed interest in identifying the fundamental ecological principles governing community ecology in microbial ecosystems. Here, we present work from our group and others showing how ideas from statistical physics can help us uncover these ecological principles. Two major lessons emerge from this work. First, large ecosystems with many species often display new, emergent ecological behaviors that are absent in small ecosystems with just a few species. To paraphrase Nobel laureate Phil Anderson, ''More is Different'', especially in community ecology. Second, the lack of trophic layer separation in microbial ecology fundamentally distinguishes microbial ecology from classical paradigms of community ecology and leads to qualitatively different rules for community assembly in microbes. I illustrate these ideas using both theoretical modeling and new experiments on large microbial ecosystems performed by our collaborators (Joshua Goldford and Alvaro Sanchez). Work supported by Simons Investigator in MMLS and NIH R35 GM119461.
Statistical physics of pairwise probability models
Yasser Roudi
2009-11-01
Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models.
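The moment-matching idea described in the abstract, fitting a pairwise model from only the means and pairwise correlations, can be sketched on a toy system. The code below is not the authors' method; it fits a three-spin pairwise (Ising-type) model by gradient ascent on the exact log-likelihood, with the partition function computed by enumeration. All parameter values are illustrative.

```python
import itertools
import math

# Pairwise maximum-entropy model p(s) ~ exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j)
# for n = 3 spins s_i in {-1, +1}. The log-likelihood gradient is the gap
# between data moments and model moments, so plain gradient ascent works.

N = 3
STATES = list(itertools.product([-1, 1], repeat=N))
PAIRS = [(i, j) for i in range(N) for j in range(i + 1, N)]

def moments(h, J):
    """Exact <s_i> and <s_i s_j> under the pairwise model (enumeration)."""
    weights = []
    for s in STATES:
        E = sum(h[i] * s[i] for i in range(N))
        E += sum(J[k] * s[i] * s[j] for k, (i, j) in enumerate(PAIRS))
        weights.append(math.exp(E))
    Z = sum(weights)
    p = [w / Z for w in weights]
    m = [sum(p[a] * s[i] for a, s in enumerate(STATES)) for i in range(N)]
    c = [sum(p[a] * s[i] * s[j] for a, s in enumerate(STATES))
         for (i, j) in PAIRS]
    return m, c

# "Data" moments generated from a known model; the fit should match them.
h_true = [0.3, -0.2, 0.1]
J_true = [0.5, -0.4, 0.2]
m_data, c_data = moments(h_true, J_true)

h = [0.0] * N
J = [0.0] * len(PAIRS)
for _ in range(3000):   # gradient ascent: dL/dh_i = <s_i>_data - <s_i>_model
    m, c = moments(h, J)
    for i in range(N):
        h[i] += 0.1 * (m_data[i] - m[i])
    for k in range(len(PAIRS)):
        J[k] += 0.1 * (c_data[k] - c[k])

m_fit, c_fit = moments(h, J)
err = max(abs(a - b) for a, b in zip(m_fit + c_fit, m_data + c_data))
print(f"max moment mismatch after fitting: {err:.2e}")
```

For realistic system sizes the enumeration of states is impossible, which is exactly why the approximate inference methods studied in the paper (and their dependence on the time bin) matter.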
Statistical physics, seismogenesis, and seismic hazard
Main, Ian
1996-11-01
The scaling properties of earthquake populations show remarkable similarities to those observed at or near the critical point of other composite systems in statistical physics. This has led to the development of a variety of different physical models of seismogenesis as a critical phenomenon, involving locally nonlinear dynamics, with simplified rheologies exhibiting instability or avalanche-type behavior, in a material composed of a large number of discrete elements. In particular, it has been suggested that earthquakes are an example of a "self-organized critical phenomenon" analogous to a sandpile that spontaneously evolves to a critical angle of repose in response to the steady supply of new grains at the summit. In this stationary state of marginal stability the distribution of avalanche energies is a power law, equivalent to the Gutenberg-Richter frequency-magnitude law, and the behavior is relatively insensitive to the details of the dynamics. Here we review the results of some of the composite physical models that have been developed to simulate seismogenesis on different scales during (1) dynamic slip on a preexisting fault, (2) fault growth, and (3) fault nucleation. The individual physical models share some generic features, such as a dynamic energy flux applied by tectonic loading at a constant strain rate, strong local interactions, and fluctuations generated either dynamically or by fixed material heterogeneity, but they differ significantly in the details of the assumed dynamics and in the methods of numerical solution. However, all exhibit critical or near-critical behavior, with behavior quantitatively consistent with many of the observed fractal or multifractal scaling laws of brittle faulting and earthquakes, including the Gutenberg-Richter law. Some of the results are sensitive to the details of the dynamics and hence are not strict examples of self-organized criticality. Nevertheless, the results of these different physical models share some
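The sandpile analogy invoked in the abstract above can be made concrete with the Bak-Tang-Wiesenfeld model. The sketch below is not from the review; the grid size and grain count are illustrative. Grains are dropped at random sites, unstable sites topple, and avalanche sizes are recorded; in the stationary state their distribution is the power-law analogue of the Gutenberg-Richter frequency-magnitude law.

```python
import random

# Bak-Tang-Wiesenfeld sandpile on an L x L grid with open boundaries.
# A site with 4 or more grains topples, shedding one grain to each
# neighbour; grains leaving the grid are lost. The number of topplings
# per added grain is the avalanche size.

def sandpile(L=20, grains=20000, seed=0):
    rng = random.Random(seed)
    z = [[0] * L for _ in range(L)]
    sizes = []
    for _ in range(grains):
        i, j = rng.randrange(L), rng.randrange(L)
        z[i][j] += 1
        size = 0
        unstable = [(i, j)] if z[i][j] >= 4 else []
        while unstable:
            x, y = unstable.pop()
            if z[x][y] < 4:
                continue
            z[x][y] -= 4
            size += 1
            if z[x][y] >= 4:          # may need to topple again
                unstable.append((x, y))
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < L and 0 <= ny < L:   # else grain falls off edge
                    z[nx][ny] += 1
                    if z[nx][ny] >= 4:
                        unstable.append((nx, ny))
        sizes.append(size)
    return z, sizes

z, sizes = sandpile()
print(f"largest avalanche toppled {max(sizes)} sites out of a {20}x{20} grid")
```

A histogram of the nonzero entries of `sizes` on log-log axes shows the near-power-law scaling that motivates the earthquake analogy, with a cutoff set by the system size.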
The scientific way of thinking in statistics, statistical physics and quantum mechanics
Săvoiu, Gheorghe
2008-01-01
This paper focuses on the way of thinking in both classical and modern Physics and Statistics, Statistical Mechanics or Statistical Physics and Quantum Mechanics. These different statistical ways of thinking and their specific methods have generated new fields for new activities and new scientific disciplines, like Econophysics (between Economics and Physics), Sociophysics (between Sociology and Physics), Mediaphysics (between all media and communication sciences), etc. After describing some r...
The scientific way of thinking in statistics, statistical physics and quantum mechanics
Săvoiu, Gheorghe
2008-01-01
This paper focuses on the way of thinking in both classical and modern Physics and Statistics, Statistical Mechanics or Statistical Physics and Quantum Mechanics. These different statistical ways of thinking and their specific methods have generated new fields for new activities and new scientific disciplines, like Econophysics (between Economics and Physics), Sociophysics (between Sociology and Physics), Mediaphysics (between all media and communication sciences), etc. After describing some r...
Statistical Physics of Complex Substitutive Systems
Jin, Qing
Diffusion processes are central to human interactions. Despite extensive studies that span multiple disciplines, our knowledge is limited to spreading processes in non-substitutive systems. Yet, a considerable number of ideas, products, and behaviors spread by substitution; to adopt a new one, agents must give up an existing one. This captures the spread of scientific constructs (forcing scientists to choose, for example, a deterministic or probabilistic worldview) as well as the adoption of durable items, such as mobile phones, cars, or homes. In this dissertation, I develop a statistical physics framework to describe, quantify, and understand substitutive systems. By empirically exploring three collected high-resolution datasets pertaining to such systems, I build a mechanistic model describing substitutions, which not only analytically predicts the universal macroscopic phenomenon discovered in the collected datasets, but also accurately captures the trajectories of individual items in a complex substitutive system, demonstrating a high degree of regularity and universality in substitutive systems. I also discuss the origins of the parameters in the substitution model, the insights they provide, and possible generalized forms of the mathematical framework. The systematic study of substitutive systems presented in this dissertation could potentially guide the understanding and prediction of all spreading phenomena driven by substitutions, from electric cars to scientific paradigms, and from renewable energy to new healthy habits.
Statistical physics of media processes: Mediaphysics
Kuznetsov, Dmitri V.; Mandel, Igor
2007-04-01
The processes of mass communications in complicated social or sociobiological systems such as marketing, economics, politics, animal populations, etc. as a subject for the special scientific subbranch-“mediaphysics”-are considered in its relation with sociophysics. A new statistical physics approach to analyze these phenomena is proposed. A keystone of the approach is an analysis of population distribution between two or many alternatives: brands, political affiliations, or opinions. Relative distances between a state of a “person's mind” and the alternatives are measures of propensity to buy (to affiliate, or to have a certain opinion). The distribution of population by those relative distances is time dependent and affected by external (economic, social, marketing, natural) and internal (influential propagation of opinions, “word of mouth”, etc.) factors, considered as fields. Specifically, the interaction and opinion-influence field can be generalized to incorporate important elements of Ising-spin-based sociophysical models and kinetic-equation ones. The distributions were described by a Schrödinger-type equation in terms of Green's functions. The developed approach has been applied to a real mass-media efficiency problem for a large company and generally demonstrated very good results despite low initial correlations of factors and the target variable.
Statistical spectroscopic studies in nuclear structure physics
Halemane, T.R.
1979-01-01
The spectral distribution theory establishes the centroid and width of the energy spectrum as quantities of fundamental importance and gives credence to a geometry associated with averages of the product of pairs of operators acting within a model space. Utilizing this fact and partitioning the model space according to different group symmetries, simple and physically meaningful expansions are obtained for the model interactions. In the process, a global measure for the goodness of group symmetries is also developed. This procedure could eventually lead to a new way of constructing model interactions for nuclear structure studies. Numerical results for six (ds)-shell interactions and for scalar-isospin, configuration-isospin, space symmetry, supermultiplet and SU(3) x SU(4) group structures are presented. The notion of simultaneous propagation of operator averages in the irreps of two or more groups (not necessarily commuting) is also introduced. The non-energy-weighted sum rule (NEWSR) for electric and magnetic multipole excitations in the (ds)-shell nuclei 20 Ne, 24 Mg, 28 Si, 32 S, and 36 Ar is evaluated. A generally applicable procedure for evaluating the eigenvalue bound to the NEWSR is presented and numerical results obtained for the said excitations and nuclei. Comparisons are made with experimental data and shell-model results. Further, a general theory is given for the linear-energy-weighted sum rule (LEWSR). When the Hamiltonian is one-body, this has a very simple form (expressible in terms of occupancies) and amounts to an extension of the Kurath sum rule to other types of excitations and to arbitrary one-body Hamiltonians. Finally, we develop a statistical approach to perturbation theory and inverse-energy-weighted sum rules, and indicate some applications
Thermodynamics and statistical physics. 2nd rev. ed.
Schnakenberg, J.
2002-01-01
This textbook covers the following topics: Thermodynamic systems and equilibrium, irreversible thermodynamics, thermodynamic potentials, stability, thermodynamic processes, ideal systems, real gases and phase transformations, magnetic systems and the Landau model, low temperature thermodynamics, canonical ensembles, statistical theory, quantum statistics, fermions and bosons, kinetic theory, Bose-Einstein condensation, photon gas
Statistical physics approaches to Alzheimer's disease
Peng, Shouyong
Alzheimer's disease (AD) is the most common cause of late life dementia. In the brain of an AD patient, neurons are lost and spatial neuronal organizations (microcolumns) are disrupted. An adequate quantitative analysis of microcolumns requires that we automate the neuron recognition stage in the analysis of microscopic images of human brain tissue. We propose a recognition method based on statistical physics. Specifically, Monte Carlo simulations of an inhomogeneous Potts model are applied for image segmentation. Unlike most traditional methods, this method improves the recognition of overlapped neurons, and thus improves the overall recognition percentage. Although the exact causes of AD are unknown, as experimental advances have revealed the molecular origin of AD, they have continued to support the amyloid cascade hypothesis, which states that early stages of aggregation of amyloid beta (Abeta) peptides lead to neurodegeneration and death. X-ray diffraction studies reveal the common cross-beta structural features of the final stable aggregates-amyloid fibrils. Solid-state NMR studies also reveal structural features for some well-ordered fibrils. But currently there is no feasible experimental technique that can reveal the exact structure or the precise dynamics of assembly and thus help us understand the aggregation mechanism. Computer simulation offers a way to understand the aggregation mechanism on the molecular level. Because traditional all-atom continuous molecular dynamics simulations are not fast enough to investigate the whole aggregation process, we apply coarse-grained models and discrete molecular dynamics methods to increase the simulation speed. First we use a coarse-grained two-bead (two beads per amino acid) model. Simulations show that peptides can aggregate into multilayer beta-sheet structures, which agree with X-ray diffraction experiments. To better represent the secondary structure transition happening during aggregation, we refine the
Brownian quasi-particles in statistical physics
Tellez-Arenas, A.; Fronteau, J.; Combis, P.
1979-01-01
The idea of a Brownian quasi-particle and the associated differentiable flow (with nonselfadjoint forces) are used here in the context of a stochastic description of the approach towards statistical equilibrium. We show that this quasi-particle flow acquires, at equilibrium, the principal properties of a conservative Hamiltonian flow. Thus the model of Brownian quasi-particles permits us to establish a link between the stochastic description and the Gibbs description of statistical equilibrium
Nonextensive statistical mechanics and high energy physics
Tsallis, Constantino
2014-04-01
The use of the celebrated Boltzmann-Gibbs entropy and statistical mechanics is justified for ergodic-like systems. In contrast, complex systems typically require more powerful theories. We provide a brief introduction to nonadditive entropies (characterized by indices such as q, which, in the q → 1 limit, recover the standard Boltzmann-Gibbs entropy) and the associated nonextensive statistical mechanics. We then present some recent applications to systems such as high-energy collisions, black holes and others. In addition, we clarify and illustrate the neat distinction that exists between Lévy distributions and q-exponential ones, a point which occasionally causes some confusion in the literature, particularly in the LHC literature
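The q → 1 limit mentioned in this abstract is easy to check numerically. A minimal sketch of the Tsallis q-exponential, e_q(x) = [1 + (1−q)x]^{1/(1−q)}, follows; this is an illustrative aside, not code from the work cited, and the function name is an arbitrary choice:

```python
import math

def q_exponential(x, q):
    """Tsallis q-exponential; reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    if base <= 0:
        return 0.0  # standard cutoff convention when the base goes negative
    return base ** (1.0 / (1.0 - q))

# As q approaches 1 the q-exponential converges to the ordinary exponential
for q in (1.5, 1.1, 1.01, 1.001):
    print(q, q_exponential(1.0, q))
print("exp(1) =", math.exp(1.0))
```

For large arguments the q-exponential decays as a power law rather than exponentially, which is the feature distinguishing it from (and occasionally confusing it with) Lévy distributions.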
On fractional spin symmetries and statistical physics
Saidi, E.H.
1995-09-01
The partition function Z and the quantum distribution of systems Σ of identical particles of fractional spin s = 1/k mod 1, k ≥ 2, generalizing the well-known Bose and Fermi ones, are derived. The generalized Sommerfeld expansion of the distribution around T = 0 K is given. A low-temperature analysis of the statistical systems Σ is carried out. Known results are recovered. (author). 26 refs, 6 figs
Statistical and particle physics: Common problems and techniques
Bowler, K.C.; Mc Kane, A.J.
1984-01-01
These proceedings contain statistical mechanical studies in condensed matter physics; interfacial problems in statistical physics; string theory; general Monte Carlo methods and their application to lattice gauge theories; topological excitations in field theory; phase transformation kinetics; and studies of chaotic systems
Statistics a guide to the use of statistical methods in the physical sciences
Barlow, Roger J
1989-01-01
The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A.C. Phillips Computing for Scienti
Safety by statistics? A critical view on statistical methods applied in health physics
Kraut, W.
2016-01-01
The only proper way to describe uncertainties in health physics is by statistical means. But statistics can never replace your personal evaluation of an effect, nor can it transmute randomness into certainty like an "uncertainty laundry". The paper discusses these problems in routine practical work.
Statistical Physics and Light-Front Quantization
Raufeisen, J
2004-08-12
Light-front quantization has important advantages for describing relativistic statistical systems, particularly systems for which boost invariance is essential, such as the fireball created in a heavy-ion collision. In this paper the authors develop light-front field theory at finite temperature and density with special attention to quantum chromodynamics. They construct the most general form of the statistical operator allowed by the Poincare algebra and show that there are no zero-mode related problems when describing phase transitions. They then demonstrate a direct connection between densities in light-front thermal field theory and the parton distributions measured in hard scattering experiments. The approach thus generalizes the concept of a parton distribution to finite temperature. In light-front quantization, the gauge-invariant Green's functions of a quark in a medium can be defined in terms of just 2-component spinors and have a much simpler spinor structure than the equal-time fermion propagator. From the Green's function, the authors introduce the new concept of a light-front density matrix, whose matrix elements are related to forward and to off-diagonal parton distributions. Furthermore, they explain how thermodynamic quantities can be calculated in discretized light-cone quantization, which is applicable at high chemical potential and is not plagued by the fermion-doubling problems.
Methods of contemporary mathematical statistical physics
2009-01-01
This volume presents a collection of courses introducing the reader to the recent progress with attention being paid to laying solid grounds and developing various basic tools. An introductory chapter on lattice spin models is useful as a background for other lectures of the collection. The topics include new results on phase transitions for gradient lattice models (with introduction to the techniques of the reflection positivity), stochastic geometry reformulation of classical and quantum Ising models, the localization/delocalization transition for directed polymers. A general rigorous framework for theory of metastability is presented and particular applications in the context of Glauber and Kawasaki dynamics of lattice models are discussed. A pedagogical account of several recently discussed topics in nonequilibrium statistical mechanics with an emphasis on general principles is followed by a discussion of kinetically constrained spin models that are reflecting important peculiar features of glassy dynamic...
Statistical physics, neural networks, brain studies
Toulouse, G.
1999-01-01
An overview of some aspects of a vast domain, located at the crossroads of physics, biology and computer science is presented: (1) During the last fifteen years, physicists advancing along various pathways have come into contact with biology (computational neurosciences) and engineering (formal neural nets). (2) This move may actually be viewed as one component in a larger picture. A prominent trend of recent years, observable over many countries, has been the establishment of interdisciplinary centers devoted to the study of: cognitive sciences; natural and artificial intelligence; brain, mind and behaviour; perception and action; learning and memory; robotics; man-machine communication, etc. What are the promising lines of development? What opportunities for physicists? An attempt will be made to address such questions and related issues
Probability and statistics for particle physics
Mana, Carlos
2017-01-01
This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a large variety of problems to be handled that are difficult to tackle by other procedures. The author also introduces a basic algorithm which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...
What can we learn from noise? - Mesoscopic nonequilibrium statistical physics.
Kobayashi, Kensuke
2016-01-01
Mesoscopic systems - small electric circuits working in the quantum regime - offer us a unique experimental stage to explore quantum transport in a tunable and precise way. The purpose of this Review is to show how they can contribute to statistical physics. We introduce the significance of fluctuation, or equivalently noise, as noise measurement enables us to address the fundamental aspects of a physical system. The significance of the fluctuation theorem (FT) in statistical physics is noted. We explain what information can be deduced from current noise measurements in mesoscopic systems. As an important application of noise measurement to statistical physics, we describe our experimental work on the current and current noise in an electron interferometer, which is the first experimental test of the FT in the quantum regime. Our attempt will shed new light on the research field of mesoscopic quantum statistical physics.
Kuo, Chi-Mei; Chien, Wu-Hsiung; Shen, Hsi-Che; Hu, Yi-Chun; Chen, Yu-Fen; Tung, Tao-Hsin
2013-01-01
To quantify the prevalence of and associated factors for chronic kidney disease (CKD) among male elderly fishing and agricultural population in Taipei, Taiwan. Subjects (n = 2,766) aged 65 years and over voluntarily admitted to a teaching hospital for a physical checkup were collected in 2010. CKD was defined as an estimated glomerular filtration rate agricultural population.
Symmetry, Invariance and Ontology in Physics and Statistics
Julio Michael Stern
2011-09-01
This paper has three main objectives: (a) discuss the formal analogy between some important symmetry-invariance arguments used in physics, probability and statistics; specifically, we will focus on Noether's theorem in physics, the maximum entropy principle in probability theory, and de Finetti-type theorems in Bayesian statistics; (b) discuss the epistemological and ontological implications of these theorems, as they are interpreted in physics and statistics; specifically, we will focus on the positivist (in physics) or subjective (in statistics) interpretations vs. objective interpretations that are suggested by symmetry and invariance arguments; (c) introduce the cognitive constructivism epistemological framework as a solution that overcomes the realism-subjectivism dilemma and its pitfalls. The work of the physicist and philosopher Max Born will be particularly important in our discussion.
Statistical physics and thermodynamics an introduction to key concepts
Rau, Jochen
2017-01-01
Statistical physics and thermodynamics describe the behaviour of systems on the macroscopic scale. Their methods are applicable to a wide range of phenomena: from refrigerators to the interior of stars, from chemical reactions to magnetism. Indeed, of all physical laws, the laws of thermodynamics are perhaps the most universal. This text provides a concise yet thorough introduction to the key concepts which underlie statistical physics and thermodynamics. It begins with a review of classical probability theory and quantum theory, as well as a careful discussion of the notions of information and entropy, prior to embarking on the development of statistical physics proper. The crucial steps leading from the microscopic to the macroscopic domain are rendered transparent. In particular, the laws of thermodynamics are shown to emerge as natural consequences of the statistical framework. While the emphasis is on clarifying the basic concepts, the text also contains many applications and classroom-tested exercises,...
1st Warsaw School of Statistical Physics - Poster Abstracts
2005-01-01
The abstracts of posters presented during the '1st Warsaw School of Statistical Physics', held in Kazimierz Dolny, Poland, are collected here. They cover different aspects of statistical processes, such as diffusion and fluid hydrodynamics, as well as modern quantum mechanical methods for their solution
Correlated randomness: Some examples of exotic statistical physics
Journal of physics, May 2005, pp. 645–660. The key idea is that scale invariance is a statement not about algebraic ... Very recently an article appeared in Phys. Rev. ... One quarter of any newspaper with a financial section is filled with economic fluctuations ...
Statistical physics of human beings in games: Controlled experiments
Liang, Yuan; Huang, Ji-Ping
2014-01-01
It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems. (topical review - statistical physics and complex systems)
A Concise Introduction to the Statistical Physics of Complex Systems
Bertin, Eric
2012-01-01
This concise primer (based on lectures given at summer schools on complex systems and on a master's degree course in complex systems modeling) will provide graduate students and newcomers to the field with the basic knowledge of the concepts and methods of statistical physics and its potential for application to interdisciplinary topics. Indeed, in recent years, statistical physics has begun to attract the interest of a broad community of researchers in the field of complex system sciences, ranging from biology to the social sciences, economics and computer science. More generally, a growing number of graduate students and researchers feel the need to learn some basic concepts and questions originating in other disciplines without necessarily having to master all of the corresponding technicalities and jargon. Generally speaking, the goals of statistical physics may be summarized as follows: on the one hand to study systems composed of a large number of interacting ‘entities’, and on the other to predict...
Statistical physics of complex systems a concise introduction
Bertin, Eric
2016-01-01
This course-tested primer provides graduate students and non-specialists with a basic understanding of the concepts and methods of statistical physics and demonstrates their wide range of applications to interdisciplinary topics in the field of complex system sciences, including selected aspects of theoretical modeling in biology and the social sciences. Generally speaking, the goals of statistical physics may be summarized as follows: on the one hand to study systems composed of a large number of interacting units, and on the other to predict the macroscopic, collective behavior of the system considered from the perspective of the microscopic laws governing the dynamics of the individual entities. These two goals are essentially also shared by what is now called 'complex systems science', and as such, systems studied in the framework of statistical physics may be considered to be among the simplest examples of complex systems – while also offering a rather well developed mathematical treatment. The second ...
Heuristic versus statistical physics approach to optimization problems
Jedrzejek, C.; Cieplinski, L.
1995-01-01
Optimization is a crucial ingredient of many calculation schemes in science and engineering. In this paper we assess several classes of methods: heuristic algorithms; methods directly relying on statistical physics, such as the mean-field method and simulated annealing; and Hopfield-type neural networks and genetic algorithms, which are partly related to statistical physics. We perform the analysis for three types of problems: (1) the Travelling Salesman Problem, (2) vector quantization, and (3) a traffic control problem in multistage interconnection networks. In general, heuristic algorithms perform better (except for genetic algorithms) and are much faster, but have to be tailored to every problem. The key to improving the performance could be to include heuristic features into general-purpose statistical physics methods. (author)
A statistical physics perspective on criticality in financial markets
Bury, Thomas
2013-01-01
Stock markets are complex systems exhibiting collective phenomena and particular features such as synchronization, fluctuations distributed as power-laws, non-random structures and similarity to neural networks. Such specific properties suggest that markets operate at a very special point. Financial markets are believed to be critical by analogy to physical systems, but little statistically founded evidence has been given. Through a data-based methodology and comparison to simulations inspired by the statistical physics of complex systems, we show that the Dow Jones and index sets are not rigorously critical. However, financial systems are closer to criticality in the crash neighborhood. (paper)
Statistical physics of human beings in games: Controlled experiments
Liang, Yuan; Huang, Ji-Ping
2014-07-01
It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems.
Non-equilibrium statistical physics with application to disordered systems
Cáceres, Manuel Osvaldo
2017-01-01
This textbook is the result of the enhancement of several courses on non-equilibrium statistics, stochastic processes, stochastic differential equations, anomalous diffusion and disorder. The target audience includes students of physics, mathematics, biology, chemistry, and engineering at undergraduate and graduate level with a grasp of the basic elements of mathematics and physics of the fourth year of a typical undergraduate course. The little-known physical and mathematical concepts are described in sections and specific exercises throughout the text, as well as in appendices. Physical-mathematical motivation is the main driving force for the development of this text. It presents the academic topics of probability theory and stochastic processes as well as new educational aspects in the presentation of non-equilibrium statistical theory and stochastic differential equations. In particular it discusses the problem of irreversibility in that context and the Fokker-Planck dynamics. An introduction on fluc...
Statistical physics of hard combinatorial optimization: Vertex cover problem
Zhao, Jin-Hua; Zhou, Hai-Jun
2014-07-01
Typical-case computation complexity is a research topic at the boundary of computer science, applied mathematics, and statistical physics. In the last twenty years, the replica-symmetry-breaking mean field theory of spin glasses and the associated message-passing algorithms have greatly deepened our understanding of typical-case computation complexity. In this paper, we use the vertex cover problem, a basic nondeterministic-polynomial (NP)-complete combinatorial optimization problem of wide application, as an example to introduce the statistical physical methods and algorithms. We do not go into the technical details but emphasize mainly the intuitive physical meanings of the message-passing equations. A reader unfamiliar with the field should be able to understand, to a large extent, the physics behind the mean field approaches and to adapt the mean field methods to solving other optimization problems.
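For readers unfamiliar with the vertex cover problem itself, here is the classic greedy 2-approximation as a minimal Python sketch; this is a standard textbook algorithm, not the message-passing approach surveyed in the paper, and the example graph is an arbitrary choice:

```python
def greedy_vertex_cover(edges):
    """Classic 2-approximation: repeatedly pick an uncovered edge
    and add both of its endpoints to the cover."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.add(u)
            cover.add(v)
    return cover

# Small example graph: a path 1-2-3-4
edges = [(1, 2), (2, 3), (3, 4)]
cover = greedy_vertex_cover(edges)
# every edge has at least one endpoint in the cover
assert all(u in cover or v in cover for u, v in edges)
```

The greedy cover is at most twice the optimal size; message-passing methods such as those discussed in the paper aim at the much harder task of characterizing typical minimum covers on random graphs.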
"Statistical Techniques for Particle Physics" (2/4)
CERN. Geneva
2009-01-01
This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
"Statistical Techniques for Particle Physics" (1/4)
CERN. Geneva
2009-01-01
This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
"Statistical Techniques for Particle Physics" (4/4)
CERN. Geneva
2009-01-01
This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches to statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
"Statistical Techniques for Particle Physics" (3/4)
CERN. Geneva
2009-01-01
This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches to statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
Statistical Physics in the Era of Big Data
Wang, Dashun
2013-01-01
With the wealth of data provided by a wide range of high-throughput measurement tools and technologies, statistical physics of complex systems is entering a new phase, impacting in a meaningful fashion a wide range of fields, from cell biology to computer science to economics. In this dissertation, by applying tools and techniques developed in…
Renormalization group in statistical physics - momentum and real spaces
Yukalov, V.I.
1988-01-01
Two variants of the renormalization group approach in statistical physics are considered: the renormalization group in momentum space and the renormalization group in real space. Common properties of these methods and their differences are clarified. A simple model for investigating the crossover between different universality classes is suggested. 27 refs
Academic Training Lecture: Statistical Methods for Particle Physics
PH Department
2012-01-01
2, 3, 4 and 5 April 2012 Academic Training Lecture Regular Programme from 11:00 to 12:00 - Bldg. 222-R-001 - Filtration Plant Statistical Methods for Particle Physics by Glen Cowan (Royal Holloway) The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena. Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties. The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.
Statistical physics a prelude and fugue for engineers
Piazza, Roberto
2017-01-01
This book provides a general introduction to the ideas and methods of statistical mechanics with the principal aim of meeting the needs of Master’s students in chemical, mechanical, and materials science engineering. Extensive introductory information is presented on many general physics topics in which students in engineering are inadequately trained, ranging from the Hamiltonian formulation of classical mechanics to basic quantum mechanics, electromagnetic fields in matter, intermolecular forces, and transport phenomena. Since engineers should be able to apply physical concepts, the book also focuses on the practical applications of statistical physics to material science and to cutting-edge technologies, with brief but informative sections on, for example, interfacial properties, disperse systems, nucleation, magnetic materials, superfluidity, and ultralow temperature technologies. The book adopts a graded approach to learning, the opening four basic-level chapters being followed by advanced “starred”...
Statistical Methods for Particle Physics (4/4)
CERN. Geneva
2012-01-01
The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena. Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties. The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.
Statistical Methods for Particle Physics (1/4)
CERN. Geneva
2012-01-01
The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena. Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties. The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.
Statistical Methods for Particle Physics (2/4)
CERN. Geneva
2012-01-01
The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena. Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties. The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.
Statistical Methods for Particle Physics (3/4)
CERN. Geneva
2012-01-01
The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena. Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties. The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.
Monte Carlo Simulation in Statistical Physics An Introduction
Binder, Kurt
2010-01-01
Monte Carlo Simulation in Statistical Physics deals with the computer simulation of many-body systems in condensed-matter physics and related fields of physics, chemistry and beyond (extending to traffic flows, stock market fluctuations, etc.). Using random numbers generated by a computer, probability distributions are calculated, allowing the estimation of the thermodynamic properties of various systems. This book describes the theoretical background to several variants of these Monte Carlo methods and gives a systematic presentation from which newcomers can learn to perform such simulations and to analyze their results. The fifth edition covers classical as well as quantum Monte Carlo methods. Furthermore, a new chapter on the sampling of free-energy landscapes has been added. To help students in their work a special web server has been installed to host programs and discussion groups (http://wwwcp.tphys.uni-heidelberg.de). Prof. Binder was awarded the Berni J. Alder CECAM Award for Computational Physics 2001 as well ...
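The random-number-driven estimation of thermodynamic properties described above can be illustrated with the standard Metropolis algorithm for the 2D Ising model. This is a generic textbook sketch, not code from the book; the lattice size, temperatures, and sweep counts are arbitrary illustrative choices:

```python
import math
import random

def metropolis_ising(L=8, T=2.5, sweeps=200, seed=1):
    """Metropolis single-spin-flip Monte Carlo for the 2D Ising model on an
    L x L periodic lattice; returns the mean |magnetization| per spin."""
    rng = random.Random(seed)
    spins = [[1] * L for _ in range(L)]  # start from the ordered state
    mags = []
    for sweep in range(sweeps):
        for _ in range(L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            # Sum of the four nearest neighbours (periodic boundaries).
            nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                  + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
            dE = 2.0 * spins[i][j] * nb  # energy cost of flipping spin (i, j)
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                spins[i][j] *= -1
        if sweep >= sweeps // 2:  # discard the first half as equilibration
            mags.append(abs(sum(sum(row) for row in spins)) / (L * L))
    return sum(mags) / len(mags)

# Below T_c (about 2.269 in these units) the lattice stays ordered;
# well above it, the magnetization fluctuates around zero.
print(metropolis_ising(T=1.5) > metropolis_ising(T=5.0))
```

The same accept/reject loop, with different moves and energy functions, underlies most of the canonical-ensemble methods the book covers.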
Askerov, Bahram M
2010-01-01
This book deals with theoretical thermodynamics and the statistical physics of electron and particle gases. While treating the laws of thermodynamics from both classical and quantum theoretical viewpoints, it posits that the basis of the statistical theory of macroscopic properties of a system is the microcanonical distribution of isolated systems, from which all canonical distributions stem. To calculate the free energy, the Gibbs method is applied to ideal and non-ideal gases, and also to a crystalline solid. Considerable attention is paid to Fermi-Dirac and Bose-Einstein quantum statistics and their application to different quantum gases, and the electron gas in both metals and semiconductors is considered in a nonequilibrium state. A separate chapter treats the statistical theory of thermodynamic properties of an electron gas in a quantizing magnetic field.
On Dobrushin's way from probability theory to statistical physics
Minlos, R A; Suhov, Yu M; Suhov, Yu
2000-01-01
R. Dobrushin worked in several branches of mathematics (probability theory, information theory), but his deepest influence was on mathematical physics. He was one of the founders of the rigorous study of statistical physics. When Dobrushin began working in that direction in the early sixties, only a few people worldwide were thinking along the same lines. Now there is an army of researchers in the field. This collection is devoted to the memory of R. L. Dobrushin. The authors who contributed to this collection knew him quite well and were his colleagues. The title, "On Dobrushin's Way", is mea
Foundations of Complex Systems Nonlinear Dynamics, Statistical Physics, and Prediction
Nicolis, Gregoire
2007-01-01
Complexity is emerging as a post-Newtonian paradigm for approaching a large body of phenomena of concern at the crossroads of physical, engineering, environmental, life and human sciences from a unifying point of view. This book outlines the foundations of modern complexity research as it arose from the cross-fertilization of ideas and tools from nonlinear science, statistical physics and numerical simulation. It is shown how these developments lead to an understanding, both qualitative and quantitative, of the complex systems encountered in nature and in everyday experience and, conversely, h
Statistical physics and computational methods for evolutionary game theory
Javarone, Marco Alberto
2018-01-01
This book presents an introduction to Evolutionary Game Theory (EGT), an emerging field in the area of complex systems attracting the attention of researchers from disparate scientific communities. EGT allows one to represent and study several complex phenomena, such as the emergence of cooperation in social systems, the role of conformity in shaping the equilibrium of a population, and the dynamics in biological and ecological systems. Since EGT models belong to the area of complex systems, statistical physics constitutes a fundamental ingredient for investigating their behavior. At the same time, the complexity of some EGT models, such as those realized by means of agent-based methods, often requires the implementation of numerical simulations. Therefore, beyond providing an introduction to EGT, this book gives a brief overview of the main statistical physics tools (such as phase transitions and the Ising model) and computational strategies for simulating evolutionary games (such as Monte Carlo algor...
STATISTICAL CHALLENGES FOR SEARCHES FOR NEW PHYSICS AT THE LHC.
CRANMER, K.
2005-09-12
Because the emphasis of the LHC is on 5σ discoveries and the LHC environment induces large systematic errors, many of the common statistical procedures used in High Energy Physics are not adequate. I review the basic ingredients of LHC searches, the sources of systematics, and the performance of several methods. Finally, I indicate the methods that seem most promising for the LHC and areas that are in need of further study.
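The 5σ discovery convention mentioned above is a statement about the tail probability of a standard normal distribution. A small sketch of the standard conversion from a significance Z to a one-sided p-value:

```python
import math

def p_value_from_z(z):
    """One-sided tail probability of a standard normal above z:
    p = 1 - Phi(z) = 0.5 * erfc(z / sqrt(2))."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# The 5-sigma discovery convention corresponds to p of about 2.9e-7.
print(p_value_from_z(5.0))
```

Most of the difficulty the abstract alludes to lies not in this conversion but in constructing a test statistic whose distribution is known well enough, in the presence of systematics, for such tiny p-values to be meaningful.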
Representative volume size: A comparison of statistical continuum mechanics and statistical physics
AIDUN,JOHN B.; TRUCANO,TIMOTHY G.; LO,CHI S.; FYE,RICHARD M.
1999-05-01
In this combination background and position paper, the authors argue that careful work is needed to develop accurate methods for relating the results of fine-scale numerical simulations of material processes to meaningful values of macroscopic properties for use in constitutive models suitable for finite element solid mechanics simulations. To provide a definite context for this discussion, the problem is couched in terms of the lack of general objective criteria for identifying the size of the representative volume (RV) of a material. The objective of this report is to lay out at least the beginnings of an approach for applying results and methods from statistical physics to develop concepts and tools necessary for determining the RV size, as well as alternatives to RV volume-averaging for situations in which the RV is unmanageably large. The background necessary to understand the pertinent issues and statistical physics concepts is presented.
Tadaki, Kohtaro
2010-01-01
The statistical mechanical interpretation of algorithmic information theory (AIT, for short) was introduced and developed by our former works [K. Tadaki, Local Proceedings of CiE 2008, pp. 425-434, 2008] and [K. Tadaki, Proceedings of LFCS'09, Springer's LNCS, vol. 5407, pp. 422-440, 2009], where we introduced the notion of thermodynamic quantities, such as partition function Z(T), free energy F(T), energy E(T), statistical mechanical entropy S(T), and specific heat C(T), into AIT. We then discovered that, in this interpretation, the temperature T equals the partial randomness of the values of all these thermodynamic quantities, where the notion of partial randomness is a stronger representation of the compression rate by means of program-size complexity. Furthermore, we showed that this situation holds for the temperature T itself, which is one of the most typical thermodynamic quantities. Namely, we showed that, for each of the thermodynamic quantities Z(T), F(T), E(T), and S(T) above, the computability of its value at temperature T gives a sufficient condition for T ∈ (0,1) to satisfy the condition that the partial randomness of T equals T. In this paper, based on a physical argument on the same level of mathematical strictness as normal statistical mechanics in physics, we develop a total statistical mechanical interpretation of AIT which actualizes a perfect correspondence to normal statistical mechanics. We do this by identifying a microcanonical ensemble in the framework of AIT. As a result, we clarify the statistical mechanical meaning of the thermodynamic quantities of AIT.
Advanced statistics to improve the physical interpretation of atomization processes
Panão, Miguel R.O.; Radu, Lucian
2013-01-01
Highlights: ► Finite pdf mixtures improve the physical interpretation of sprays. ► A Bayesian approach using an MCMC algorithm is used to find the best finite mixture. ► The statistical method identifies multiple droplet clusters in a spray. ► Multiple drop clusters are eventually associated with multiple atomization mechanisms. ► The spray is described by its full drop size distribution and not only its moments. -- Abstract: This paper reports an analysis of the physics of atomization processes using advanced statistical tools, namely finite mixtures of probability density functions, the best fit of which is found using a Bayesian approach based on a Markov chain Monte Carlo (MCMC) algorithm. This approach takes into account eventual multimodality and heterogeneities in drop size distributions. Therefore, it provides information about the complete probability density function of multimodal drop size distributions and allows the identification of subgroups in the heterogeneous data, improving the physical interpretation of atomization processes. Moreover, it overcomes the limitations induced by analyzing spray droplet characteristics through moments alone, particularly the masking of distinct mechanisms of droplet formation. Finally, the method is applied to physically interpret a case study based on multijet atomization processes
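The Bayesian/MCMC fitting of finite mixtures described in the highlights can be sketched in miniature. The example below is not the authors' code: purely for illustration, it assumes a two-component Gaussian mixture with known component locations and samples the posterior of the single mixing weight with a simple Metropolis chain:

```python
import math
import random

def log_likelihood(w, data, mu1=0.0, mu2=4.0, sigma=1.0):
    """Log-likelihood of a two-component Gaussian mixture with known
    components and unknown mixing weight w (illustrative assumption)."""
    def norm(x, mu):
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
    return sum(math.log(w * norm(x, mu1) + (1 - w) * norm(x, mu2)) for x in data)

def sample_weight(data, n_steps=2000, step=0.05, seed=3):
    """Metropolis MCMC for the posterior of w under a flat prior on (0, 1)."""
    rng = random.Random(seed)
    w, ll = 0.5, log_likelihood(0.5, data)
    chain = []
    for _ in range(n_steps):
        w_new = w + rng.gauss(0.0, step)
        if 0.0 < w_new < 1.0:
            ll_new = log_likelihood(w_new, data)
            # Metropolis acceptance with a flat prior: min(1, L_new / L_old).
            if rng.random() < math.exp(min(0.0, ll_new - ll)):
                w, ll = w_new, ll_new
        chain.append(w)
    return chain

# Synthetic "drop size"-like data: 70% from one component, 30% from the other.
rng = random.Random(0)
data = [rng.gauss(0.0, 1.0) if rng.random() < 0.7 else rng.gauss(4.0, 1.0)
        for _ in range(200)]
chain = sample_weight(data)
posterior_mean = sum(chain[500:]) / len(chain[500:])
print(round(posterior_mean, 2))  # should land near the true mixing weight 0.7
```

The paper's setting is richer (unknown number of components, unknown component parameters), but the accept/reject mechanics of the chain are the same.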
Statistical physics of networks, information and complex systems
Ecke, Robert E [Los Alamos National Laboratory
2009-01-01
In this project we explore the mathematical methods and concepts of statistical physics that are finding abundant applications across the scientific and technological spectrum, from soft condensed matter systems and bio-informatics to economic and social systems. Our approach exploits the considerable similarity of concepts between statistical physics and computer science, allowing for a powerful multi-disciplinary approach that draws its strength from cross-fertilization and multiple interactions of researchers with different backgrounds. The work on this project takes advantage of the newly appreciated connection between computer science and statistics and addresses important problems in data storage, decoding, optimization, the information processing properties of the brain, the interface between quantum and classical information science, the verification of large software programs, modeling of complex systems including disease epidemiology, resource distribution issues, and the nature of highly fluctuating complex systems. Common themes that the project has been emphasizing are (i) neural computation, (ii) network theory and its applications, and (iii) a statistical physics approach to information theory. The project's efforts focus on the general problem of optimization and variational techniques, algorithm development, and information theoretic approaches to quantum systems. These efforts are responsible for fruitful collaborations and the nucleation of science efforts that span multiple divisions such as EES, CCS, D, T, ISR and P. This project supports the DOE mission in Energy Security and Nuclear Non-Proliferation by developing novel information science tools for communication, sensing, and interacting complex networks such as the internet or the energy distribution system. The work also supports programs in Threat Reduction and Homeland Security.
4 GHz ionospheric scintillations observed at Taipei
Huang, Y.N.; Jeng, B.S.
1978-01-01
In a study of ionospheric scintillations 3950 MHz beacon signals from geostationary communication satellites Intelsat-IV-F8 and Intelsat-IV-F1 were recorded on a strip chart and magnetic tape at the Taipei Earth Station. While the strip charts were used to monitor the occurrence of the scintillation, the magnetic tape output was digitized and processed by a computerized system to yield a detailed analysis of scintillation events. It was found that diurnal variations were similar to the diurnal patterns of sporadic E at greater than 5 MHz and VHF band ionospheric scintillations during daytime as reported by Huang (1978). Eight typical scintillation events were selected for the calculation of the scintillation index, S4, and other parameters. The mean S4 index for the 8 events was found to be 0.15. Numerical and graphic results are presented for the cumulative amplitude distributions, message reliability, autocorrelation functions and power spectra
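The S4 scintillation index computed above has a standard definition as the normalized standard deviation of the received signal intensity, S4 = sqrt((<I²> − <I>²)/<I>²). A minimal sketch of that calculation (the sample values are invented for illustration):

```python
import math

def s4_index(intensity):
    """Amplitude scintillation index: S4 = sqrt((<I^2> - <I>^2) / <I>^2),
    the standard deviation of signal intensity normalized by its mean."""
    n = len(intensity)
    mean = sum(intensity) / n
    mean_sq = sum(x * x for x in intensity) / n
    var = max(mean_sq - mean ** 2, 0.0)  # guard against tiny negative rounding
    return math.sqrt(var) / mean

# A steady signal has S4 = 0; weak fluctuations give a small index.
print(s4_index([1.0, 1.0, 1.0]))                  # 0.0
print(round(s4_index([0.85, 1.0, 1.15]), 3))      # 0.122
```

A mean S4 of 0.15, as reported for the eight events, thus corresponds to intensity fluctuations of about 15% of the mean level.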
Literature in Focus: Statistical Methods in Experimental Physics
2007-01-01
Frederick James was a high-energy physicist who became the CERN "expert" on statistics and is now well-known around the world, in part for this famous text. The first edition of Statistical Methods in Experimental Physics was originally co-written with four other authors and was published in 1971 by North Holland (now an imprint of Elsevier). It became such an important text that demand for it has continued for more than 30 years. Fred has updated it and it was released in a second edition by World Scientific in 2006. It is still a top seller and there is no exaggeration in calling it «the» reference on the subject. A full review of the title appeared in the October CERN Courier. Come and meet the author to hear more about how this book has flourished during its 35-year lifetime. Frederick James Statistical Methods in Experimental Physics Monday, 26th of November, 4 p.m. Council Chamber (Bldg. 503-1-001) The author will be introduced...
Statistical methods for data analysis in particle physics
Lista, Luca
2017-01-01
This concise set of course-based notes provides the reader with the main concepts and tools needed to perform statistical analyses of experimental data, in particular in the field of high-energy physics (HEP). First, the book provides an introduction to probability theory and basic statistics, mainly intended as a refresher from readers’ advanced undergraduate studies, but also to help them clearly distinguish between the Frequentist and Bayesian approaches and interpretations in subsequent applications. More advanced concepts and applications are gradually introduced, culminating in the chapter on both discoveries and upper limits, as many applications in HEP concern hypothesis testing, where the main goal is often to provide better and better limits so as to eventually be able to distinguish between competing hypotheses, or to rule out some of them altogether. Many worked-out examples will help newcomers to the field and graduate students alike understand the pitfalls involved in applying theoretical co...
Inverse statistical physics of protein sequences: a key issues review.
Cocco, Simona; Feinauer, Christoph; Figliuzzi, Matteo; Monasson, Rémi; Weigt, Martin
2018-03-01
In the course of evolution, proteins undergo important changes in their amino acid sequences, while their three-dimensional folded structure and their biological function remain remarkably conserved. Thanks to modern sequencing techniques, sequence data accumulate at unprecedented pace. This provides large sets of so-called homologous, i.e. evolutionarily related protein sequences, to which methods of inverse statistical physics can be applied. Using sequence data as the basis for the inference of Boltzmann distributions from samples of microscopic configurations or observables, it is possible to extract information about evolutionary constraints and thus protein function and structure. Here we give an overview over some biologically important questions, and how statistical-mechanics inspired modeling approaches can help to answer them. Finally, we discuss some open questions, which we expect to be addressed over the next years.
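The inference of Boltzmann distributions from sequence samples described above can be illustrated in its simplest form: an independent-site spin model, where the maximum-likelihood local fields follow directly from observed frequencies. This is a didactic reduction (real protein models such as Potts/DCA models include pairwise couplings between sites); the spin data below are invented:

```python
import math
from collections import Counter

def infer_fields(sequences, pseudocount=1.0):
    """Minimal inverse-statistical-physics sketch: for an independent-site
    Boltzmann model P(s) ~ exp(sum_i h_i * s_i) over spins s_i in {-1, +1},
    the maximum-likelihood fields follow from observed frequencies:
    h_i = 0.5 * ln(f_i(+1) / f_i(-1)). Pseudocounts regularize rare states."""
    n_sites = len(sequences[0])
    fields = []
    for i in range(n_sites):
        counts = Counter(seq[i] for seq in sequences)
        f_up = counts[+1] + pseudocount
        f_dn = counts[-1] + pseudocount
        fields.append(0.5 * math.log(f_up / f_dn))
    return fields

# A site that is mostly +1 across the sample gets a positive field,
# mimicking an evolutionary constraint toward that state.
data = [(+1, -1), (+1, +1), (+1, -1), (-1, -1)]
print(infer_fields(data))
```

Adding pairwise coupling terms to the model, and fitting them to the observed pair correlations, is what turns this toy into the contact-prediction methods the review surveys.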
Statistical methods for data analysis in particle physics
AUTHOR|(CDS)2070643
2015-01-01
This concise set of course-based notes provides the reader with the main concepts and tools to perform statistical analysis of experimental data, in particular in the field of high-energy physics (HEP). First, an introduction to probability theory and basic statistics is given, mainly as a reminder from advanced undergraduate studies, but also to clearly distinguish between the Frequentist and Bayesian approaches and interpretations in subsequent applications. More advanced concepts and applications are gradually introduced, culminating in the chapter on upper limits, as many applications in HEP concern hypothesis testing, where often the main goal is to provide better and better limits so as eventually to be able to distinguish between competing hypotheses or to rule out some of them altogether. Many worked examples will help newcomers to the field and graduate students understand the pitfalls in applying theoretical concepts to actual data.
Topics in statistical data analysis for high-energy physics
Cowan, G.
2011-01-01
These lectures concern two topics that are becoming increasingly important in the analysis of high-energy physics data: Bayesian statistics and multivariate methods. In the Bayesian approach, we extend the interpretation of probability not only to cover the frequency of repeatable outcomes but also to include a degree of belief. In this way we are able to associate probability with a hypothesis and thus to answer directly questions that cannot be addressed easily with traditional frequentist methods. In multivariate analysis, we try to exploit as much information as possible from the characteristics that we measure for each event to distinguish between event types. In particular we will look at a method that has gained popularity in high-energy physics in recent years: the boosted decision tree. Finally, we give a brief sketch of how multivariate methods may be applied in a search for a new signal process. (author)
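The boosted decision tree mentioned above combines many weak classifiers, re-weighting misclassified events at each step. A minimal AdaBoost sketch with one-dimensional decision stumps (a toy reduction of the method, not the lectures' code; the "signal"/"background" sample is invented):

```python
import math

def train_adaboost(points, labels, rounds=5):
    """Minimal AdaBoost with 1D decision stumps: each round fits the stump
    minimizing the weighted error, then up-weights misclassified events."""
    n = len(points)
    weights = [1.0 / n] * n
    ensemble = []  # list of (alpha, threshold, polarity)
    for _ in range(rounds):
        best = None
        for thr in sorted(set(points)):
            for pol in (+1, -1):
                err = sum(w for x, y, w in zip(points, labels, weights)
                          if (pol if x > thr else -pol) != y)
                if best is None or err < best[0]:
                    best = (err, thr, pol)
        err, thr, pol = best
        err = max(err, 1e-12)  # avoid log(0) on a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, thr, pol))
        # Increase the weight of misclassified points, then renormalize.
        weights = [w * math.exp(-alpha * y * (pol if x > thr else -pol))
                   for x, y, w in zip(points, labels, weights)]
        total = sum(weights)
        weights = [w / total for w in weights]
    return ensemble

def predict(ensemble, x):
    """Sign of the alpha-weighted vote of all stumps."""
    score = sum(a * (pol if x > thr else -pol) for a, thr, pol in ensemble)
    return +1 if score > 0 else -1

# Toy "signal vs background" sample, separable at x = 0.5.
xs = [0.1, 0.2, 0.3, 0.7, 0.8, 0.9]
ys = [-1, -1, -1, +1, +1, +1]
model = train_adaboost(xs, ys)
print([predict(model, x) for x in (0.15, 0.85)])  # [-1, 1]
```

Real HEP applications use many input variables and tree-valued weak learners, but the boosting loop is the same idea.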
Introduction to statistical physics and to computer simulations
Casquilho, João Paulo
2015-01-01
Rigorous and comprehensive, this textbook introduces undergraduate students to simulation methods in statistical physics. The book covers a number of topics, including the thermodynamics of magnetic and electric systems; the quantum-mechanical basis of magnetism; ferrimagnetism, antiferromagnetism, spin waves and magnons; liquid crystals as a non-ideal system of technological relevance; and diffusion in an external potential. It also covers hot topics such as cosmic microwave background, magnetic cooling and Bose-Einstein condensation. The book provides an elementary introduction to simulation methods through algorithms in pseudocode for random walks, the 2D Ising model, and a model liquid crystal. Any formalism is kept simple and derivations are worked out in detail to ensure the material is accessible to students from subjects other than physics.
Implementation of statistical analysis methods for medical physics data
Teixeira, Marilia S.; Pinto, Nivia G.P.; Barroso, Regina C.; Oliveira, Luis F.
2009-01-01
The objective of biomedical research with radiation of different natures is to contribute to the understanding of the basic physics and biochemistry of biological systems, disease diagnostics, and the development of therapeutic techniques. The main benefits are the cure of tumors through therapy, the early detection of diseases through diagnostics, prophylactic use in blood transfusion, etc. Therefore, a better understanding of the biological interactions occurring after exposure to radiation is necessary for the optimization of therapeutic procedures and of strategies for the reduction of radioinduced effects. The applied physics group of the Physics Institute of UERJ has been working on the characterization of biological samples (human tissues, teeth, saliva, soil, plants, sediments, air, water, organic matrices, ceramics, fossil material, among others) using X-ray diffraction and X-ray fluorescence. The application of these techniques for the measurement, analysis and interpretation of the characteristics of biological tissues is attracting considerable interest in Medical and Environmental Physics. Any quantitative data analysis must begin with the calculation of descriptive statistics (means and standard deviations) in order to obtain a preliminary notion of what the analysis will reveal. It is well known that the high standard deviations found in experimental measurements of biological samples can be attributed to biological factors, due to the specific characteristics of each individual (age, gender, environment, alimentary habits, etc.). The main objective of this work is the development of a program applying specific statistical methods to the optimization of experimental data analysis. Since the specialized programs for this analysis are proprietary, another objective of this work is the implementation of a code which is free and can be shared with other research groups. As the program developed since the
A New Approach to Monte Carlo Simulations in Statistical Physics
Landau, David P.
2002-08-01
Monte Carlo simulations [1] have become a powerful tool for the study of diverse problems in statistical/condensed matter physics. Standard methods sample the probability distribution for the states of the system, most often in the canonical ensemble, and over the past several decades enormous improvements have been made in performance. Nonetheless, difficulties arise near phase transitions, due to critical slowing down near 2nd order transitions and to metastability near 1st order transitions, and these complications limit the applicability of the method. We shall describe a new Monte Carlo approach [2] that uses a random walk in energy space to determine the density of states directly. Once the density of states is known, all thermodynamic properties can be calculated. This approach can be extended to multi-dimensional parameter spaces and should be effective for systems with complex energy landscapes, e.g., spin glasses, protein folding models, etc. Generalizations should produce a broadly applicable optimization tool. 1. A Guide to Monte Carlo Simulations in Statistical Physics, D. P. Landau and K. Binder (Cambridge U. Press, Cambridge, 2000). 2. Fugao Wang and D. P. Landau, Phys. Rev. Lett. 86, 2050 (2001); Phys. Rev. E 64, 056101 (2001).
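The random walk in energy space described in the abstract is the Wang-Landau algorithm of Ref. [2]. A minimal sketch for a small 2D Ising lattice (for brevity, the flatness criterion of the full algorithm is replaced here by a fixed number of steps per refinement stage):

```python
import math
import random

def wang_landau_ising(L=4, f_final=1e-4, steps_per_stage=10000, seed=2):
    """Wang-Landau random walk in energy space for the 2D Ising model:
    estimates the log density of states ln g(E) on an L x L periodic lattice."""
    rng = random.Random(seed)
    spins = [[1] * L for _ in range(L)]

    def energy():
        e = 0
        for i in range(L):
            for j in range(L):
                e -= spins[i][j] * (spins[(i + 1) % L][j] + spins[i][(j + 1) % L])
        return e

    ln_g = {}      # running estimate of ln g(E), keyed by energy
    ln_f = 1.0     # modification factor, halved after each stage
    E = energy()
    while ln_f > f_final:
        for _ in range(steps_per_stage):
            i, j = rng.randrange(L), rng.randrange(L)
            nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                  + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
            E_new = E + 2 * spins[i][j] * nb
            # Accept with probability min(1, g(E)/g(E_new)), so the walk is
            # pushed toward energies that have been visited least often.
            d = ln_g.get(E, 0.0) - ln_g.get(E_new, 0.0)
            if d >= 0 or rng.random() < math.exp(d):
                spins[i][j] *= -1
                E = E_new
            ln_g[E] = ln_g.get(E, 0.0) + ln_f  # update at the current energy
        ln_f /= 2.0  # refine (a histogram-flatness check is omitted here)
    return ln_g

ln_g = wang_landau_ising()
# The ground states (all up / all down) sit at E = -2 * L * L = -32.
print(min(ln_g))
```

Once ln g(E) is known, canonical averages at any temperature follow by reweighting with exp(-E/T), which is what makes the method attractive near phase transitions.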
Statistical physics approach to earthquake occurrence and forecasting
Arcangelis, Lucilla de [Department of Industrial and Information Engineering, Second University of Naples, Aversa (CE) (Italy); Godano, Cataldo [Department of Mathematics and Physics, Second University of Naples, Caserta (Italy); Grasso, Jean Robert [ISTerre, IRD-CNRS-OSUG, University of Grenoble, Saint Martin d’Héres (France); Lippiello, Eugenio, E-mail: eugenio.lippiello@unina2.it [Department of Mathematics and Physics, Second University of Naples, Caserta (Italy)
2016-04-25
There is striking evidence that the dynamics of the Earth crust is controlled by a wide variety of mutually dependent mechanisms acting at different spatial and temporal scales. The interplay of these mechanisms produces instabilities in the stress field, leading to abrupt energy releases, i.e., earthquakes. As a consequence, the evolution towards instability before a single event is very difficult to monitor. On the other hand, collective behavior in stress transfer and relaxation within the Earth crust leads to emergent properties described by stable phenomenological laws for a population of many earthquakes in size, time and space domains. This observation has stimulated a statistical mechanics approach to earthquake occurrence, applying ideas and methods such as scaling laws, universality, fractal dimension, and the renormalization group to characterize the physics of earthquakes. In this review we first present a description of the phenomenological laws of earthquake occurrence which represent the frame of reference for a variety of statistical mechanical models, ranging from the spring-block to more complex fault models. Next, we discuss the problem of seismic forecasting in the general framework of stochastic processes, where seismic occurrence can be described as a branching process implementing space–time-energy correlations between earthquakes. In this context we show how correlations originate from dynamical scaling relations between time and energy, able to account for universality and provide a unifying description for the phenomenological power laws. Then we discuss how branching models can be implemented to forecast the temporal evolution of the earthquake occurrence probability and allow one to discriminate among different physical mechanisms responsible for earthquake triggering. In particular, the forecasting problem will be presented in a rigorous mathematical framework, discussing the relevance of the processes acting at different temporal scales for
New Directions in Statistical Physics: Econophysics, Bioinformatics, and Pattern Recognition
Grassberger, P
2004-01-01
This book contains 18 contributions from different authors. Its subtitle 'Econophysics, Bioinformatics, and Pattern Recognition' says more precisely what it is about: not so much about central problems of conventional statistical physics like equilibrium phase transitions and critical phenomena, but about its interdisciplinary applications. After a long period of specialization, physicists have, over the last few decades, found more and more satisfaction in breaking out of the limitations set by the traditional classification of sciences. Indeed, this classification had never been strict, and physicists in particular had always ventured into other fields. Helmholtz, in the middle of the 19th century, had considered himself a physicist when working on physiology, stressing that the physics of animate nature is as much a legitimate field of activity as the physics of inanimate nature. Later, Max Delbrueck and Francis Crick did for experimental biology what Schroedinger did for its theoretical foundation. And many of the experimental techniques used in chemistry, biology, and medicine were developed by a steady stream of talented physicists who left their proper discipline to venture out into the wider world of science. The development we have witnessed over the last thirty years or so is different. It started with neural networks, where methods could be applied which had been developed for spin glasses, but today's list includes vehicular traffic (driven lattice gases), geology (self-organized criticality), economy (fractal stochastic processes and large scale simulations), engineering (dynamical chaos), and many others. By staying in the physics departments, these activities have transformed the physics curriculum and the view physicists have of themselves. In many departments there are now courses on econophysics or on biological physics, and some universities offer degrees in the physics of traffic or in econophysics. In order to document this change of attitude
Lead Isotope Characterization of Petroleum Fuels in Taipei, Taiwan
Pei-Hsuan Yao
2015-04-01
Leaded gasoline in Taiwan was gradually phased out from 1983 to 2000. However, it is unclear whether unleaded gasoline still contributes to atmospheric lead (Pb) exposure in urban areas. In this study, Pb isotopic compositions of unleaded gasolines, with octane numbers of 92, 95, and 98, and of diesel from two local suppliers in Taipei were determined by multi-collector inductively coupled plasma mass spectrometry with a two-sigma uncertainty of ±0.02%. Lead isotopic ratios of vehicle exhaust (208Pb/207Pb: 2.427; 206Pb/207Pb: 1.148), as estimated from petroleum fuels, overlap with the reported aerosol data. This agreement indicates that local unleaded petroleum fuels, containing 10–45 ng·Pb·g−1, are merely one contributor among various sources to urban aerosol Pb. Additionally, the distinction between the products of the two companies is statistically significant in their individual 208Pb/206Pb ratios (p-value < 0.001, t test). Lead isotopic characterization appears to be applicable as a “fingerprinting” tool for tracing the sources of Pb pollution.
GPU-computing in econophysics and statistical physics
Preis, T.
2011-03-01
A recent trend in computer science and related fields is general purpose computing on graphics processing units (GPUs), which can yield impressive performance. With multiple cores connected by high memory bandwidth, today's GPUs offer resources for non-graphics parallel processing. This article provides a brief introduction into the field of GPU computing and includes examples. In particular computationally expensive analyses employed in financial market context are coded on a graphics card architecture which leads to a significant reduction of computing time. In order to demonstrate the wide range of possible applications, a standard model in statistical physics - the Ising model - is ported to a graphics card architecture as well, resulting in large speedup values.
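The Ising model mentioned above lends itself to a compact illustration. Below is a minimal CPU sketch of the single-spin Metropolis update that such GPU ports parallelize across the lattice; the lattice size, temperature, and sweep count are illustrative choices, not taken from the article.

```python
import math
import random

def ising_sweep(spins, L, beta, rng):
    """One Metropolis sweep over an L x L lattice with periodic boundaries."""
    for _ in range(L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        # Sum of the four nearest-neighbour spins (periodic boundaries)
        nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2.0 * spins[i][j] * nb  # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i][j] *= -1

def magnetization(spins):
    return sum(sum(row) for row in spins) / (len(spins) ** 2)

rng = random.Random(1)
L = 16
spins = [[1] * L for _ in range(L)]  # cold start: all spins up
beta = 1.0  # above the critical coupling beta_c ~ 0.4407, i.e. ordered phase
for _ in range(200):
    ising_sweep(spins, L, beta, rng)
m = abs(magnetization(spins))
```

On a GPU the same update is applied to many non-interacting (checkerboard) sublattice sites in parallel, which is where the speedup reported in the article comes from.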
Graphene growth process modeling: a physical-statistical approach
Wu, Jian; Huang, Qiang
2014-09-01
As a zero-bandgap semiconductor, graphene is an attractive material for a wide variety of applications such as optoelectronics. Among the various techniques developed for graphene synthesis, chemical vapor deposition on copper foils shows high potential for producing few-layer and large-area graphene. Since the fabrication of high-quality graphene sheets requires an understanding of growth mechanisms, together with methods for characterization and control of the grain size of graphene flakes, analytical modeling of the graphene growth process is essential for controlled fabrication. The graphene growth process starts with randomly nucleated islands that gradually develop into complex shapes, grow in size, and eventually connect together to cover the copper foil. To model this complex process, we develop a physical-statistical approach under the assumption of self-similarity during graphene growth. The growth kinetics is uncovered by separating island shapes from the area growth rate. We propose to characterize the area growth velocity using a confined exponential model, which not only has a clear physical explanation, but also fits the real data well. For the shape modeling, we develop a parametric shape model which can be well explained by the angular-dependent growth rate. This work can provide useful information for the control and optimization of the graphene growth process on Cu foil.
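A confined (saturating) exponential growth law of the kind referred to above is commonly written as dA/dt = k(A_max − A), i.e., A(t) = A_max(1 − e^(−kt)). A minimal sketch with illustrative parameter values (not taken from the paper):

```python
import math

def confined_exponential_area(t, a_max, k):
    """Island area at time t under confined exponential growth:
    dA/dt = k * (a_max - A)  =>  A(t) = a_max * (1 - exp(-k * t))."""
    return a_max * (1.0 - math.exp(-k * t))

# Illustrative parameters only: saturation area and rate constant
a_max, k = 100.0, 0.3
areas = [confined_exponential_area(t, a_max, k) for t in range(0, 21, 5)]
```

The growth rate is largest at nucleation and decays as the island approaches the saturation area, which is the "clear physical explanation" the abstract alludes to: growth slows as the remaining uncovered foil shrinks.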
Stochastic Spatial Models in Ecology: A Statistical Physics Approach
Pigolotti, Simone; Cencini, Massimo; Molina, Daniel; Muñoz, Miguel A.
2017-11-01
Ecosystems display a complex spatial organization. Ecologists have long tried to characterize them by looking at how different measures of biodiversity change across spatial scales. Ecological neutral theory has provided simple predictions accounting for general empirical patterns in communities of competing species. However, while neutral theory in well-mixed ecosystems is mathematically well understood, spatial models still present several open problems, limiting the quantitative understanding of spatial biodiversity. In this review, we discuss the state of the art in spatial neutral theory. We emphasize the connection between spatial ecological models and the physics of non-equilibrium phase transitions and how concepts developed in statistical physics translate in population dynamics, and vice versa. We focus on non-trivial scaling laws arising at the critical dimension D = 2 of spatial neutral models, and their relevance for biological populations inhabiting two-dimensional environments. We conclude by discussing models incorporating non-neutral effects in the form of spatial and temporal disorder, and analyze how their predictions deviate from those of purely neutral theories.
Statistical classification techniques in high energy physics (SDDT algorithm)
Bouř, Petr; Kůs, Václav; Franc, Jiří
2016-01-01
We present our proposal of the supervised binary divergence decision tree with nested separation method based on the generalized linear models. A key insight we provide is the clustering driven only by a few selected physical variables. The proper selection consists of the variables achieving the maximal divergence measure between two different classes. Further, we apply our method to Monte Carlo simulations of physics processes corresponding to a data sample of top quark-antiquark pair candidate events in the lepton+jets decay channel. The data sample is produced in pp̅ collisions at √s = 1.96 TeV. It corresponds to an integrated luminosity of 9.7 fb−1 recorded with the D0 detector during Run II of the Fermilab Tevatron Collider. Our algorithm achieves 90% AUC in separating signal from background. We also briefly deal with the modification of statistical tests applicable to weighted data sets in order to test the homogeneity of the Monte Carlo simulations and the measured data. The justification of these modified tests is proposed through the divergence tests.
The interaction of physical properties of seawater via statistical approach
Hamzah, Firdaus Mohamad; Jaafar, Othman; Sabri, Samsul Rijal Mohd; Ismail, Mohd Tahir; Jaafar, Khamisah; Arbin, Norazman
2015-09-01
It is important to determine the relationships between physical parameters in marine ecology. Models and expert opinion are needed to explore the form of the relationship between two parameters, owing to the complexity of the ecosystems, and these need justification against data observed over particular periods. Novel statistical techniques such as nonparametric regression are presented to investigate these ecological relationships. This is achieved by examining the features of pH, salinity and conductivity in the Straits of Johor. Monthly measurements from 2004 until 2013 at a chosen sampling location are examined. Testing for no effect, followed by linearity testing, is carried out for the relationships between salinity and pH, conductivity and pH, and conductivity and salinity, with the ecological objective of investigating evidence of changes in each of the above physical parameters. The findings reveal the appropriateness of a smooth function to explain the variation of pH in response to changes in salinity, whilst the changes in conductivity with regard to different concentrations of salinity can be modelled parametrically. The analysis highlights the importance of both parametric and nonparametric models for assessing the ecological response to environmental change in seawater.
Application of statistical physics approaches to complex organizations
Matia, Kaushik
The first part of this thesis studies two different kinds of financial markets, namely, the stock market and the commodity market. Stock price fluctuations display certain scale-free statistical features that are not unlike those found in strongly-interacting physical systems. The possibility that new insights can be gained using concepts and methods developed to understand scale-free physical phenomena has stimulated considerable research activity in the physics community. In the first part of this thesis a comparative study of stocks and commodities is performed in terms of the probability density function and correlations of price fluctuations. It is found that the probability density of the stock price fluctuation has a power-law functional form with an exponent 3, which is similar across different markets around the world. We present an autoregressive model to explain the origin of the power-law functional form of the probability density function of the price fluctuation. The first part also presents the discovery of unique features of the Indian economy, which we find displays a scale-dependent probability density function. In the second part of this thesis we quantify the statistical properties of fluctuations of complex systems like business firms and world scientific publications. We analyze the class size of these systems, where units agglomerate to form classes. We find that the width of the probability density function of the growth rate decays with the class size as a power law with an exponent beta which is universal in the sense that beta is independent of the system studied. We also identify two further scaling exponents, one connecting the unit size to the class size and another connecting the number of units to the class size, where products are units and firms are classes. Finally we propose a generalized preferential attachment model to describe the class size distribution. This model is successful in explaining the growth rate and class
Applications of statistical physics to the social and economic sciences
Petersen, Alexander M.
2011-12-01
This thesis applies statistical physics concepts and methods to quantitatively analyze socioeconomic systems. For each system we combine theoretical models and empirical data analysis in order to better understand the real-world system in relation to the complex interactions between the underlying human agents. This thesis is separated into three parts: (i) response dynamics in financial markets, (ii) dynamics of career trajectories, and (iii) a stochastic opinion model with quenched disorder. In Part I we quantify the response of U.S. markets to financial shocks, which perturb markets and trigger "herding behavior" among traders. We use concepts from earthquake physics to quantify the decay of volatility shocks after the "main shock." We also find, surprisingly, that we can make quantitative statements even before the main shock. In order to analyze market behavior before as well as after "anticipated news" we use Federal Reserve interest-rate announcements, which are regular events that are also scheduled in advance. In Part II we analyze the statistical physics of career longevity. We construct a stochastic model for career progress which has two main ingredients: (a) random forward progress in the career and (b) random termination of the career. We incorporate the rich-get-richer (Matthew) effect into ingredient (a), meaning that it is easier to move forward in the career the farther along one is in the career. We verify the model predictions by analyzing data on 400,000 scientific careers and 20,000 professional sports careers. Our model highlights the importance of early career development, showing that many careers are stunted by the relative disadvantage associated with inexperience. In Part III we analyze a stochastic two-state spin model which represents a system of voters embedded on a network. We investigate the role in consensus formation of "zealots", which are agents with time-independent opinion. Our main result is the unexpected finding that it is the
Younger Dryas Boundary (YDB) impact : physical and statistical impossibility.
Boslough, Mark Bruce Elrick
2010-08-01
The YDB impact hypothesis of Firestone et al. (2007) is so extremely improbable it can be considered statistically impossible in addition to being physically impossible. Comets make up only about 1% of the population of Earth-crossing objects. Broken comets are a vanishingly small fraction, and only exist as Earth-sized clusters for a very short period of time. Only a small fraction of impacts occur at angles as shallow as proposed by the YDB impact authors. Events that are exceptionally unlikely to take place in the age of the Universe are 'statistically impossible'. The size distribution of Earth-crossing asteroids is well-constrained by astronomical observations, DoD satellite bolide frequencies, and the cratering record. This distribution can be transformed to a probability density function (PDF) for the largest expected impact of the past 20,000 years. The largest impact of any kind expected over the period of interest is 250 m. Anything larger than 2 km is exceptionally unlikely (probability less than 1%). The impact hypothesis does not rely on any sound physical model. A 4-km diameter comet, even if it fragmented upon entry, would not disperse or explode in the atmosphere. It would generate a crater about 50 km in diameter with a transient cavity as deep as 10 km. There is no evidence for such a large, young crater associated with the YDB. There is no model to suggest that a comet impact of this size is capable of generating continent-wide fires or blast damage, and there is no physical mechanism that could cause a 4-km comet to explode at the optimum height of 500 km. The highest possible altitude for a cometary optimum height is about 15 km, for a 120-m diameter comet. To maximize blast and thermal damage, a 4-km comet would have to break into tens of thousands of fragments of this size and spread out over the entire continent, but that would require lateral forces that greatly exceed the drag force, and would not conserve energy. Airbursts are
Tokuyama, M.; Stanley, H.E.
2000-01-01
The main purpose of the Tohwa University International Conference on Statistical Physics is to provide an opportunity for an international group of experimentalists, theoreticians, and computational scientists who are working on various fields of statistical physics to gather together and discuss their recent advances. The conference covered six topics: complex systems, general methods of statistical physics, biological physics, cross-disciplinary physics, information science, and econophysics
Worldwide seismicity in view of non-extensive statistical physics
Chochlaki, Kaliopi; Vallianatos, Filippos; Michas, George
2014-05-01
In the present work we study the distribution of worldwide shallow seismic events that occurred from 1981 to 2011, extracted from the CMT catalog, with magnitude equal to or greater than Mw 5.0. Our analysis is based on the subdivision of the Earth's surface into seismic zones that are homogeneous with regard to seismic activity and the orientation of the predominant stress field. To this end we use the Flinn-Engdahl regionalization (Flinn and Engdahl, 1965), which consists of 50 seismic zones, as modified by Lombardi and Marzocchi (2007), who grouped the 50 FE zones into larger tectonically homogeneous ones utilizing the cumulative moment tensor method. As a result, Lombardi and Marzocchi (2007) reduce the initial 50 regions to 39, in which we apply the non-extensive statistical physics approach. Non-extensive statistical physics seems to be the most adequate and promising methodological tool for analyzing complex systems, such as the Earth's interior. In this frame, we introduce the q-exponential formulation as the expression of the probability distribution function that maximizes the Sq entropy as defined by Tsallis (1988). In the present work we analyze the interevent time distribution between successive earthquakes with a q-exponential function in each of the seismic zones defined by Lombardi and Marzocchi (2007), confirming the importance of long-range interactions and the existence of a power-law approximation in the distribution of the interevent times. Our findings support the idea of universality within the Tsallis approach to describing Earth's seismicity and present strong evidence of temporal clustering of seismic activity in each of the tectonic zones analyzed. Our analysis as applied to worldwide seismicity with magnitude equal to or greater than Mw 5.5 and 6.0 is also presented, and the dependence of our results on the cut-off magnitude is discussed. This research has been funded by the European Union (European Social Fund) and Greek national resources under the
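As a brief reminder of the formulas underlying the approach described above (these are the standard Tsallis forms, not taken from this abstract), the Sq entropy and the q-exponential that maximizes it are

```latex
S_q = k \, \frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad \lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i ,
```

```latex
p(\tau) \propto e_q^{-\tau/\tau_0},
\qquad e_q^{x} \equiv \bigl[\,1 + (1 - q)\,x\,\bigr]^{1/(1-q)} ,
```

so that for q > 1 the interevent-time distribution develops a power-law tail, while the Boltzmann–Gibbs exponential is recovered in the limit q → 1.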
Is poker a skill game? New insights from statistical physics
Javarone, Marco Alberto
2015-06-01
In recent years poker has gained considerable prestige in several countries and, besides being one of the most famous card games, it represents a modern challenge for scientists from different communities, spanning from artificial intelligence to physics and from psychology to mathematics. Unlike games such as chess, the task of classifying the nature of poker (i.e., as a “skill game” or as gambling) seems really hard, and it also constitutes an open problem whose solution has several implications. In general, gambling offers equal winning probabilities both to rational players (i.e., those that use a strategy) and to irrational ones (i.e., those without a strategy). Therefore, in order to uncover the nature of poker, a viable way is to compare the performances of rational vs. irrational players over a series of challenges. Recently, a work on this topic revealed that rationality is a fundamental ingredient for succeeding in poker tournaments. In this study we analyze a simple model of poker challenges by a statistical physics approach, with the aim of uncovering the nature of this game. As our main result we find that, under particular conditions, a few irrational players can turn poker into gambling. Therefore, although rationality is a key ingredient for success in poker, the format of the challenges also plays an important role in these dynamics, as it can strongly influence the underlying nature of the game. The importance of our results lies in the related implications, for instance in identifying the limits within which poker can be considered a “skill game” and, as a consequence, which kind of format must be chosen to devise algorithms able to face humans.
A statistical physics of stationary and metastable states
Cabo, A; González, A; Curilef, S; Cabo-Bizet, N G; Vera, C A
2011-01-01
We present a generalization of Gibbs statistical mechanics designed to describe a general class of stationary and metastable equilibrium states. It is assumed that the physical system maximizes the entropy functional S subject to the standard conditions plus an extra conserved constraint function F, imposed to force the system to remain in the metastable configuration. After requiring additivity for two quasi-independent subsystems, and the commutation of the new constraint with the density matrix ρ, it is argued that F should be a homogeneous function of ρ, at least for systems in which the spectrum is sufficiently dense to be considered as continuous. Therefore, surprisingly, the analytic form of F turns out to be of the kind F(p_i) = p_i^q, where the p_i are the eigenvalues of the density matrix and q is a real number to be determined. Thus, the discussion identifies the physical relevance of Lagrange multiplier constraints of the Tsallis kind and their q parameter, as enforced by the additivity of the constraint F which fixes the metastable state. An approximate analytic solution for the probability density is found for q close to unity. The procedure is applied to describe the results from the plasma experiment of Huang and Driscoll. For small and medium values of the radial distance, the measured density is predicted with a precision similar to that achieved by the minimal-enstrophy and Tsallis procedures. Also, the particle density is predicted at all radial positions. Thus, the discussion gives a solution to the conceptual difficulties of the two above-mentioned approaches as applied to this problem, both of which predict a non-analytic abrupt vanishing of the density above a critical radial distance.
Statistical physics of medical diagnostics: Study of a probabilistic model.
Mashaghi, Alireza; Ramezanpour, Abolfazl
2018-03-01
We study a diagnostic strategy which is based on the anticipation of the diagnostic process by simulation of the dynamical process starting from the initial findings. We show that such a strategy could result in more accurate diagnoses compared to a strategy that is solely based on the direct implications of the initial observations. We demonstrate this by employing the mean-field approximation of statistical physics to compute the posterior disease probabilities for a given subset of observed signs (symptoms) in a probabilistic model of signs and diseases. A Monte Carlo optimization algorithm is then used to maximize an objective function of the sequence of observations, which favors the more decisive observations resulting in more polarized disease probabilities. We see how the observed signs change the nature of the macroscopic (Gibbs) states of the sign and disease probability distributions. The structure of these macroscopic states in the configuration space of the variables affects the quality of any approximate inference algorithm (so the diagnostic performance) which tries to estimate the sign-disease marginal probabilities. In particular, we find that the simulation (or extrapolation) of the diagnostic process is helpful when the disease landscape is not trivial and the system undergoes a phase transition to an ordered phase.
Statistical Physics of Neural Systems with Nonadditive Dendritic Coupling
David Breuer
2014-03-01
How neurons process their inputs crucially determines the dynamics of biological and artificial neural networks. In such neural and neural-like systems, synaptic input is typically considered to be merely transmitted linearly or sublinearly by the dendritic compartments. Yet, single-neuron experiments report pronounced supralinear dendritic summation of sufficiently synchronous and spatially close-by inputs. Here, we provide a statistical physics approach to study the impact of such nonadditive dendritic processing on single-neuron responses and the performance of associative-memory tasks in artificial neural networks. First, we compute the effect of random input to a neuron incorporating nonlinear dendrites. This approach is independent of the details of the neuronal dynamics. Second, we use those results to study the impact of dendritic nonlinearities on the network dynamics in a paradigmatic model for associative memory, both numerically and analytically. We find that dendritic nonlinearities maintain network convergence and increase the robustness of memory performance against noise. Interestingly, an intermediate number of dendritic branches is optimal for memory functionality.
Numerical solution of optimal departure frequency of Taipei TMS
Young, Lih-jier; Chiu, Chin-Hsin
2016-05-01
Route Number 5 (the Bannan Line) of the Taipei Mass Rapid Transit (MRT) is the most popular line in the Taipei Metro system, especially during rush-hour periods. It has been estimated that there are more than 8,000 passengers on the ticket platform during 18:00–19:00 at Taipei Main Station. The purpose of this research is to predict a departure frequency matched to the passenger load per train. Monte Carlo simulation is used to optimize the departure frequency according to the passenger information provided by the 22 stations of Route Number 5, i.e., 22 random variables. It is worth mentioning that we used 30,000 iterations to obtain the different samples of the optimized departure frequency, i.e., 10 trains/hr, which matches the practical situation.
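The kind of Monte Carlo optimization described above can be sketched as follows. This is a hypothetical toy model, not the authors' code: per-station hourly demand is drawn from a normal distribution around an assumed mean, and the smallest departure frequency whose total capacity covers the sampled demand with 95% probability is returned. All station means and the train capacity are invented for illustration.

```python
import random

def required_frequency(station_means, train_capacity, target=0.95,
                       iterations=10_000, seed=42):
    """Smallest departure frequency (trains/hour) whose hourly capacity covers
    the simulated total demand with probability >= target."""
    rng = random.Random(seed)
    # Total hourly demand per iteration; per-station demand is modeled
    # (illustratively) as normal with 15% relative spread, truncated at zero.
    demands = [sum(max(0.0, rng.gauss(m, 0.15 * m)) for m in station_means)
               for _ in range(iterations)]
    n = 1
    while sum(d <= n * train_capacity for d in demands) / iterations < target:
        n += 1
    return n

# Hypothetical figures: 22 stations with Taipei Main Station dominating,
# and 2,000 riders of capacity per train -- all invented for illustration.
means = [8000] + [1500] * 21
trains_per_hour = required_frequency(means, train_capacity=2000)
```

With real boarding data per station and the actual train capacity in place of these invented figures, the same search would reproduce the kind of trains-per-hour estimate the abstract reports.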
Air Pollution and Daily Clinic Visits for Headache in a Subtropical City: Taipei, Taiwan
Hui-Fen Chiu
2015-02-01
This study was undertaken to determine whether there was an association between air pollutant levels and daily clinic visits for headache in Taipei, Taiwan. Daily clinic visits for headache and ambient air pollution data for Taipei were obtained for the period from 2006 to 2011. The odds ratio of clinic visits for headache was estimated using a case-crossover approach, controlling for weather variables, day of the week, seasonality, and long-term time trends. In the single-pollutant models, on warm days (>23 °C) statistically significant positive associations were found between increased rates of headache occurrence and levels of particulate matter (PM10), sulfur dioxide (SO2), nitrogen dioxide (NO2), carbon monoxide (CO), and ozone (O3). On cool days (<23 °C), all pollutants were significantly associated with increased headache visits except SO2. In the two-pollutant models, PM10, O3 and NO2 remained significant for higher rates of headache visits in combination with each of the other pollutants on cool days. On warm days, CO remained statistically significant in all two-pollutant models. This study provides evidence that higher levels of ambient air pollutants increase the risk of clinic visits for headache.
Microzonation of Seismic Hazard Potential in Taipei, Taiwan
Liu, K. S.; Lin, Y. P.
2017-12-01
The island of Taiwan lies at the boundary between the Philippine Sea plate and the Eurasian plate. Accordingly, the majority of seismic energy released near Taiwan originates from the two subduction zones. It is therefore not surprising that Taiwan has repeatedly been struck by large earthquakes, such as the 1986 Hualien, 1999 Chi-Chi, and 2002 Hualien earthquakes. Microzonation of seismic hazard potential has become necessary in Taipei City since the Central Geological Survey designated the Sanchiao active fault as Category II. In this study, a catalog of more than 2,000 shallow earthquakes that occurred from 1900 to 2015 with Mw magnitudes ranging from 5.0 to 8.2, together with 11 disastrous earthquakes that occurred from 1683 to 1899 and the nearby Sanchiao active fault, is used to estimate the seismic hazard potential in Taipei City for seismic microzonation. Furthermore, the probabilities of seismic intensity exceeding CWB intensity 5, 6, and 7 and MMI VI, VII, and VIII in 10-, 30-, and 50-year periods in the above areas are also analyzed for the microzonation. Finally, by comparison with the seismic zoning map of Taiwan in the current building code, which was revised after the 921 earthquake, the results of this study show which areas of Taipei City have higher earthquake hazard potential. They provide a valuable database for the seismic design of critical facilities, will help mitigate Taipei City's earthquake disaster losses in the future, and provide critical information for emergency response plans.
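Exceedance probabilities over 10-, 30- and 50-year windows of the kind quoted above are conventionally computed from a Poisson recurrence model, P = 1 − e^(−λt). A minimal sketch, with an illustrative annual exceedance rate rather than values from the study:

```python
import math

def exceedance_probability(annual_rate, years):
    """Probability of at least one exceedance in `years` years, assuming
    events of the given intensity arrive as a Poisson process with rate
    `annual_rate` per year: P = 1 - exp(-rate * years)."""
    return 1.0 - math.exp(-annual_rate * years)

# Illustrative rate: an intensity level exceeded on average once per 100 years
rate = 1.0 / 100.0
probs = {t: exceedance_probability(rate, t) for t in (10, 30, 50)}
```

Replacing the illustrative rate with the zone-specific rates derived from an earthquake catalog and attenuation relations yields the per-zone exceedance maps used for microzonation.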
Parallelization of the Physical-Space Statistical Analysis System (PSAS)
Larson, J. W.; Guo, J.; Lyster, P. M.
1999-01-01
Atmospheric data assimilation is a method of combining observations with model forecasts to produce a more accurate description of the atmosphere than the observations or forecast alone can provide. Data assimilation plays an increasingly important role in the study of climate and atmospheric chemistry. The NASA Data Assimilation Office (DAO) has developed the Goddard Earth Observing System Data Assimilation System (GEOS DAS) to create assimilated datasets. The core computational components of the GEOS DAS include the GEOS General Circulation Model (GCM) and the Physical-space Statistical Analysis System (PSAS). The need for timely validation of scientific enhancements to the data assimilation system poses computational demands that are best met by distributed parallel software. PSAS is implemented in Fortran 90 using object-based design principles. The analysis portions of the code solve two equations. The first of these is the "innovation" equation, which is solved on the unstructured observation grid using a preconditioned conjugate gradient (CG) method. The "analysis" equation is a transformation from the observation grid back to a structured grid, and is solved by a direct matrix-vector multiplication. Use of a factored-operator formulation reduces the computational complexity of both the CG solver and the matrix-vector multiplication, rendering the matrix-vector multiplications as a successive product of operators on a vector. Sparsity is introduced to these operators by partitioning the observations using an icosahedral decomposition scheme. PSAS builds a large (approx. 128MB) run-time database of parameters used in the calculation of these operators. Implementing a message passing parallel computing paradigm into an existing yet developing computational system as complex as PSAS is nontrivial. One of the technical challenges is balancing the requirements for computational reproducibility with the need for high performance. The problem of computational
Becchi, Carlo Maria
2016-01-01
This is the third edition of a well-received textbook on modern physics theory. This book provides an elementary but rigorous and self-contained presentation of the simplest theoretical framework that will meet the needs of undergraduate students. In addition, a number of examples of relevant applications and an appropriate list of solved problems are provided.Apart from a substantial extension of the proposed problems, the new edition provides more detailed discussion on Lorentz transformations and their group properties, a deeper treatment of quantum mechanics in a central potential, and a closer comparison of statistical mechanics in classical and in quantum physics. The first part of the book is devoted to special relativity, with a particular focus on space-time relativity and relativistic kinematics. The second part deals with Schrödinger's formulation of quantum mechanics. The presentation concerns mainly one-dimensional problems, but some three-dimensional examples are discussed in detail. The third...
Physical Research Program: research contracts and statistical summary
1975-01-01
The physical research program consists of fundamental theoretical and experimental investigations designed to support the objectives of ERDA. The program is directed toward the discovery of natural laws and new knowledge, and toward improved understanding of the physical sciences as related to the development, use, and control of energy. The ultimate goal is to develop a scientific underlay for the overall ERDA effort and the fundamental principles of natural phenomena, so that these phenomena may be understood and new principles formulated. The physical research program is organized into four functional subprograms: high-energy physics, nuclear sciences, materials sciences, and molecular sciences. Approximately four-fifths of the total physical research program costs are associated with research conducted in ERDA-owned, contractor-operated federally funded research and development centers. A little less than one-fifth of the costs are associated with the support of research conducted in other laboratories
Tenenbaum, Joel
This thesis applies statistical physics concepts and methods to quantitatively analyze complex systems. The thesis is separated into four parts: (i) characteristics of earthquake systems, (ii) memory and volatility in data time series, (iii) the application of part (ii) to world financial markets, and (iv) statistical observations on the evolution of word usage. In Part I, we observe statistical patterns in the occurrence of earthquakes. We select a 14-year earthquake catalog covering the archipelago of Japan. We find that regions traditionally thought of as being too distant from one another for causal contact display remarkably high correlations, and the networks that result have a tendency to link highly connected areas with other highly connected areas. In Part II, we introduce and apply the concept of "volatility asymmetry", the primary use of which is in financial data. We explain the relation between memory and "volatility asymmetry" in terms of an asymmetry parameter lambda. We define a litmus test for determining whether lambda is statistically significant, propose a stochastic model based on this parameter, and use the model to further explain empirical data. In Part III, we expand on volatility asymmetry. Importing the concepts of time dependence and universality from physics, we explore the aspects of emerging (or "transition") economies in Eastern Europe as they relate to asymmetry. We find that these emerging markets in some instances behave like developed markets and in other instances do not, and that the distinction is a matter both of country and of time period, with crisis periods showing different asymmetry characteristics from "healthy" periods. In Part IV, we take note of a series of findings in econophysics showing statistical growth similarities between a variety of different areas that all have in common the fact of taking place in environments that are both (i) competing and (ii) dynamic. We show that this same growth distribution can be
Effects of Asian dust storm events on daily mortality in Taipei, Taiwan
Chen, Y.-S.; Sheen, P.-C.; Chen, E.-R.; Liu, Y.-K.; Wu, T.-N.; Yang, C.-Y.
2004-01-01
In spring, windblown dust storms originating in the deserts of Mongolia and China make their way to Taipei City. These occurrences are known as Asian dust storm events. The objective of this study was to assess the possible effects of Asian dust storms on the mortality of residents in Taipei, Taiwan, during the period from 1995 to 2000. We identified 39 dust storm episodes, which were classified as index days. Daily deaths on the index days were compared with deaths on the comparison days. We selected two comparison days for each index day, 7 days before the index day and 7 days after the index day. The strongest estimated effects of dust storms were increases of 7.66% in risk for respiratory disease 1 day after the event, 4.92% for total deaths 2 days following the dust storms and 2.59% for circulatory diseases 2 days following the dust storms. However, none of these effects were statistically significant. This study found greater specificity for associations with respiratory deaths, and this increases the likelihood that the association between dust events and daily mortality represents a causal relationship
Becchi, Carlo Maria
2007-01-01
These notes are designed as a textbook for a course on modern physics theory for undergraduate students. The purpose is to provide a rigorous and self-contained presentation of the simplest theoretical framework using elementary mathematical tools. A number of examples of relevant applications and an appropriate list of exercises and answered questions are also given. The first part is devoted to special relativity, concerning in particular space-time relativity and relativistic kinematics. The second part deals with Schrödinger's formulation of quantum mechanics. The presentation concerns mainly one-dimensional problems, in particular the tunnel effect, discrete energy levels and band spectra. The third part concerns the application of Gibbs statistical methods to quantum systems, in particular to Bose and Fermi gases.
Statistical physics of black holes as quantum-mechanical systems
Giddings, Steven B.
2013-01-01
Some basic features of black-hole statistical mechanics are investigated, assuming that black holes respect the principles of quantum mechanics. Care is needed in defining an entropy S_bh corresponding to the number of microstates of a black hole, given that the black hole interacts with its surroundings. An open question is then the relationship between this entropy and the Bekenstein-Hawking entropy S_BH. For a wide class of models with interactions needed to ensure unitary quantum evolutio...
Statistical Analysis of Questionnaire on Physical Rehabilitation in Multiple Sclerosis
Martinková, Patrícia; Řasová, K.
-, č. 3 (2010), S340 ISSN 1210-7859. [Obnovené neuroimunologické a likvorologické dny. 21.05.2010-22.05.2010, Praha] R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords: questionnaire * physical rehabilitation * multiple sclerosis Subject RIV: IN - Informatics, Computer Science
Some applications of multivariate statistics to physical anthropology
van Vark, GN
This paper presents some of the results of the cooperation between the author, a physical anthropologist, and Willem Schaafsma. The subjects of study to be discussed in this paper all refer to human evolution, in particular to the process of hominisation. It is described how the interest of the
A new formalism for nonextensive physical systems: Tsallis thermostatistics
Tirnakli, U.; Bueyuekkilic, F.; Demirhan, D.
1999-01-01
Although Boltzmann-Gibbs (BG) statistics provides a suitable tool which enables us to handle a large number of physical systems satisfactorily, it has some basic restrictions. Recently a nonextensive thermostatistics has been proposed by C. Tsallis to handle nonextensive physical systems, and up to now, besides the generalization of some of the conventional concepts, the formalism has proved fruitful in a number of physical applications. In this study, our aim is to introduce Tsallis thermostatistics in some detail and to emphasize its achievements on physical systems by noting the recent developments along this line
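The core of the formalism is Tsallis's generalized entropy, which reduces to the Boltzmann-Gibbs form in the limit q → 1:

```latex
S_q = k\,\frac{1 - \sum_{i=1}^{W} p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q = -k \sum_{i=1}^{W} p_i \ln p_i = S_{\mathrm{BG}}.
```

The parameter q measures the departure from extensivity: for two independent systems A and B, the entropies compose as S_q(A+B) = S_q(A) + S_q(B) + (1-q) S_q(A) S_q(B)/k, recovering additivity only at q = 1.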
Statistical Physics of Nanoparticles in the Gas Phase
Hansen, Klavs
2013-01-01
Thermal processes are ubiquitous and an understanding of thermal phenomena is essential for a complete description of the physics of nanoparticles, both for the purpose of modeling the dynamics of the particles and for the correct interpretation of experimental data. This book has the twofold aim to present coherently the relevant results coming from the recent scientific literature and to guide the readers through the process of deriving results, enabling them to explore the limits of the mathematical approximations and test the power of the method. The book is focused on the fundamental properties of nanosystems in the gas phase. For this reason there is a strong emphasis on microcanonical physics. Each chapter is enriched with exercises and 3 Appendices provide additional useful materials.
Surveillance of Tuberculosis in Taipei: The Influence of Nontuberculous Mycobacteria
Chiang, Chen-Yuan; Yu, Ming-Chih; Yang, Shiang-Lin; Yen, Muh-Yong; Bai, Kuan-Jen
2015-01-01
Background Notification of tuberculosis (TB) but not nontuberculous mycobacteria (NTM) is mandatory in Taiwan. Partly due to the strict regulation on TB notification, several patients infected with NTM were notified as TB cases. Notification of patients infected with NTM as TB cases can trigger public health actions and impose additional burdens on the public health system. We conducted a study to assess the influence of NTM infection on surveillance of TB in Taipei. Methodology/Principal Fin...
Statistical physics of learning from examples: a brief introduction
Broeck, C. van den
1994-01-01
The problem of how one can learn from examples is illustrated for the case of a student perceptron trained by the Hebb rule on examples generated by a teacher perceptron. Two basic quantities are calculated: the training error and the generalization error. The obtained results are found to be typical. Other training rules are discussed. For the case of an Ising student with an Ising teacher, the existence of a first-order phase transition is shown. Special effects such as dilution, queries, rejection, etc. are discussed and some results for multilayer networks are reviewed. In particular, the properties of a self-similar committee machine are derived. Finally, we discuss the statistics of generalization, with a review of the Hoeffding inequality, the Dvoretzky-Kiefer-Wolfowitz theorem and the Vapnik-Chervonenkis theorem. (author). 29 refs, 6 figs
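The Hebb-rule teacher-student setup can be sketched numerically. The dimensions and sample counts below are arbitrary illustrative choices (not from the paper); the last line uses the standard perceptron result that the generalization error is arccos(R)/π, where R is the overlap between student and teacher directions.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 200, 2000                      # input dimension, number of examples

teacher = rng.standard_normal(N)      # teacher perceptron weights
xi = rng.standard_normal((P, N))      # random training examples
labels = np.sign(xi @ teacher)        # teacher's classification of each example

# Hebb rule: the student simply accumulates label-weighted examples
J = (labels[:, None] * xi).sum(axis=0)

# overlap between student and teacher directions
R = (J @ teacher) / (np.linalg.norm(J) * np.linalg.norm(teacher))

# generalization error of a perceptron: probability of disagreeing
# with the teacher on a new random input
eps = np.arccos(R) / np.pi
```

As the number of examples per weight α = P/N grows, R approaches 1 and the generalization error decays toward zero, which is the typical behavior the abstract refers to.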
Monte Carlo simulation in statistical physics an introduction
Binder, Kurt
1992-01-01
The Monte Carlo method is a computer simulation method which uses random numbers to simulate statistical fluctuations. The method is used to model complex systems with many degrees of freedom. Probability distributions for these systems are generated numerically, and the method then yields numerically exact information on the models. Such simulations may be used to see how well a model system approximates a real one, or to see how valid the assumptions are in an analytical theory. A short and systematic theoretical introduction to the method forms the first part of this book. The second part is a practical guide with plenty of examples and exercises for the student. Problems treated by simple sampling (random and self-avoiding walks, percolation clusters, etc.) are included, along with such topics as finite-size effects and guidelines for the analysis of Monte Carlo simulations. The two parts together provide an excellent introduction to the theory and practice of Monte Carlo simulations
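Simple sampling of random walks, one of the problems mentioned above, can be illustrated with a short Monte Carlo estimate of the mean squared end-to-end distance of an N-step walk on a square lattice. This is a generic sketch (not code from the book); for an ideal random walk the exact answer is ⟨R²⟩ = N.

```python
import numpy as np

rng = np.random.default_rng(42)
n_steps, n_walks = 50, 20000

# simple sampling: each walk is an independent sequence of unit steps
# in one of the four lattice directions
steps = np.array([(1, 0), (-1, 0), (0, 1), (0, -1)])
choices = rng.integers(0, 4, size=(n_walks, n_steps))
ends = steps[choices].sum(axis=1)        # end-to-end displacement vectors

# Monte Carlo estimate of <R^2>; exact value is n_steps for an ideal walk
r2 = (ends ** 2).sum(axis=1).mean()
```

The statistical error of the estimate shrinks like 1/√n_walks, which is the kind of analysis guideline the book's second part covers.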
Statistical and physical study of one-sided planetary nebulae.
Ali, A.; El-Nawawy, M. S.; Pfleiderer, J.
The authors have investigated the spatial orientation of one-sided planetary nebulae. Most of them if not all are interacting with the interstellar medium. Seventy percent of the nebulae in the sample have inclination angles larger than 45° to the Galactic plane and 30% of the inclination angles are less than 45°. Most of the selected objects are old, evolved planetary nebulae with large dimensions, and not far away from the Galactic plane. Seventy-five percent of the objects are within 160 pc from the Galactic plane. The enhanced concavity arc can be explained physically as a result of the 'planetary nebulae-interstellar matter' interaction. The authors discuss the possible effect of the interstellar magnetic field in the concavity regions.
Excel 2013 for physical sciences statistics a guide to solving practical problems
Quirk, Thomas J; Horton, Howard F
2016-01-01
This book shows the capabilities of Microsoft Excel in teaching physical sciences statistics effectively. Similar to the previously published Excel 2010 for Physical Sciences Statistics, this book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical science problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in science courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2013 for Physical Sciences Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their ...
Wang, Jeen-Hwa
Strong collision between the Eurasian and Philippine Sea Plates causes high seismicity in the Taiwan region, which is often struck by large earthquakes. Several cities, including three mega-cities, i.e., Taipei, Taichung, and Kaohsiung, have been constructed in western Taiwan, which lies on thick sediments. These cities, with a high population density, are usually regional centers of culture, economics, and politics. Historically, larger-sized earthquakes, e.g. the 1935 Hsinchu-Taichung earthquake and the 1999 Chi-Chi earthquake, often caused serious damage to the cities. Hence, urban seismology must be one of the main subjects of Taiwan's seismological community. Since 2005, a program project, sponsored by Academia Sinica, has been launched to investigate seismological problems in the Taipei Metropolitan Area. This program project was performed during the 2005-2007 period. The core research subjects are: (1) the deployment of the Taipei Down-hole Seismic Array; (2) the properties of earthquakes and active faults in the area; (3) the seismogenic-zone structures, including the 3-D velocity and Q structures, of the area; (4) the characteristics of strong motions and site effects; and (5) strong-motion prediction. In addition to academic goals, the results obtained from the program project will be useful for seismic hazard mitigation not only for the area but also for others.
Statistical Uncertainty Quantification of Physical Models during Reflood of LBLOCA
Oh, Deog Yeon; Seul, Kwang Won; Woo, Sweng Woong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)
2015-05-15
The use of best-estimate (BE) computer codes in safety analysis for loss-of-coolant accidents (LOCA) is the major trend in many countries to reduce significant conservatism. A key feature of this BE evaluation requires the licensee to quantify the uncertainty of the calculations, so it is very important how the uncertainty distribution is determined before conducting the uncertainty evaluation. Uncertainty includes those of physical models and correlations, plant operational parameters, and so forth. The quantification process is often performed mainly by subjective expert judgment or obtained from reference documents of the computer code. In this respect, more mathematical methods are needed to reasonably determine the uncertainty ranges. The first uncertainty quantification is performed with various increments for two influential uncertainty parameters to get the calculated responses and their derivatives. A different data set with two influential uncertainty parameters for the FEBA tests is chosen by applying stricter criteria for selecting responses and their derivatives, which may be considered as the user's effect in the CIRCÉ applications. Finally, three influential uncertainty parameters are considered to study the effect of the number of uncertainty parameters, given the limitation of the CIRCÉ method. With the determined uncertainty ranges, uncertainty evaluations for the FEBA tests are performed to check whether the experimental responses such as the cladding temperature or pressure drop lie inside the limits of the calculated uncertainty bounds. A confirmation step will be performed to evaluate the quality of the information in the case of the different reflooding PERICLES experiments. The uncertainty ranges of physical models in the MARS-KS thermal-hydraulic code during reflooding were quantified by the CIRCÉ method using the FEBA experiment tests, instead of expert judgment. Also, through the uncertainty evaluation for the FEBA and PERICLES tests, it was confirmed
Hayslett, H T
1991-01-01
Statistics covers the basic principles of statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses, and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population, are explained. The text the
Steam generators clogging diagnosis through physical and statistical modelling
Girard, S.
2012-01-01
Steam generators are massive heat exchangers feeding the turbines of pressurised water nuclear power plants. Internal parts of steam generators foul up with iron oxides, which gradually close some holes intended for the passage of the fluid. This phenomenon, called clogging, causes safety issues, and means to assess it are needed to optimise the maintenance strategy. The approach investigated in this thesis is the analysis of steam generators' dynamic behaviour during power transients with a one-dimensional physical model. Two improvements to the model have been implemented. One was taking into account flows orthogonal to the modelling axis; the other was introducing a slip between phases, accounting for the velocity difference between liquid water and steam. These two elements increased the model's degrees of freedom and improved the agreement of the simulation with plant data. A new calibration and validation methodology has been proposed to assess the robustness of the model. The initial inverse problem was ill-posed: different clogging spatial configurations can produce identical responses. The relative importance of clogging, depending on its localisation, has been estimated by sensitivity analysis with the Sobol' method. The dimension of the model's functional output had previously been reduced by principal component analysis. Finally, the input dimension has been reduced by a technique called sliced inverse regression. Based on this new framework, a new diagnosis methodology, more robust and better understood than the existing one, has been proposed. (author)
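The principal-component reduction of a model's functional output, as used in the thesis, can be sketched generically. The synthetic response curves below are stand-ins built from two latent modes plus noise, not steam generator data, and the two-component truncation is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# stand-in for an ensemble of simulated response curves:
# 100 model runs, each a curve sampled at 50 output points
t = np.linspace(0, 1, 50)
modes = np.vstack([np.sin(2 * np.pi * t), t])       # two latent modes
scores = rng.standard_normal((100, 2))
X = scores @ modes + 0.01 * rng.standard_normal((100, 50))

Xc = X - X.mean(axis=0)                 # center the ensemble
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

explained = s**2 / (s**2).sum()         # variance fraction per component
reduced = Xc @ Vt[:2].T                 # 2-D representation of each curve
```

With the functional output compressed to a few component scores, the downstream sensitivity analysis and inverse regression operate on a low-dimensional vector instead of the full curve.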
Excel 2016 for physical sciences statistics a guide to solving practical problems
Quirk, Thomas J; Horton, Howard F
2016-01-01
This book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical physical science problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel is an effective learning tool for quantitative analyses in environmental science courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Physical Sciences Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel 2016 to statistical techniques necessary in their courses and work. Each chapter explains statistical formulas and directs the reader to use Excel commands to solve specific, easy-to-understand physical science problems. Practice problems are provided at the end of each chapter with their s...
On estimating perturbative coefficients in quantum field theory and statistical physics
Samuel, M.A.; Stanford Univ., CA
1994-05-01
The authors present a method for estimating perturbative coefficients in quantum field theory and statistical physics. They are able to obtain reliable error bars for each estimate. The results, in all cases, are excellent
Statistical physics of non-thermal phase transitions from foundations to applications
Abaimov, Sergey G
2015-01-01
Statistical physics can be used to better understand non-thermal complex systems—phenomena such as stock-market crashes, revolutions in society and in science, fractures in engineered materials and in the Earth’s crust, catastrophes, traffic jams, petroleum clusters, polymerization, self-organized criticality and many others exhibit behaviors resembling those of thermodynamic systems. In particular, many of these systems possess phase transitions identical to critical or spinodal phenomena in statistical physics. The application of the well-developed formalism of statistical physics to non-thermal complex systems may help to predict and prevent such catastrophes as earthquakes, snow-avalanches and landslides, failure of engineering structures, or economical crises. This book addresses the issue step-by-step, from phenomenological analogies between complex systems and statistical physics to more complex aspects, such as correlations, fluctuation-dissipation theorem, susceptibility, the concept of free ener...
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
Statistical issues in searches for new phenomena in High Energy Physics
Lyons, Louis; Wardle, Nicholas
2018-03-01
Many analyses of data in High Energy Physics are concerned with searches for New Physics. We review the statistical issues that arise in such searches, and then illustrate these using the specific example of the recent successful search for the Higgs boson, produced in collisions between high energy protons at CERN’s Large Hadron Collider.
2005-01-01
For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004.) The applied energy units and conversion coefficients are shown on the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees
SERC School on Computational Statistical Physics held at the Indian Institute of Technology
Ray, Purusattam
2011-01-01
The present book is an outcome of the SERC school on Computational Statistical Physics held at the Indian Institute of Technology, Guwahati, in December 2008. Numerical experimentation has played an extremely important role in statistical physics in recent years. Lectures given at the School covered a large number of topics of current and continuing interest. Based on lectures by active researchers in the field- Bikas Chakrabarti, S Chaplot, Deepak Dhar, Sanjay Kumar, Prabal Maiti, Sanjay Puri, Purusattam Ray, Sitangshu Santra and Subir Sarkar- the nine chapters comprising the book deal with topics that range from the fundamentals of the field, to problems and questions that are at the very forefront of current research. This book aims to expose the graduate student to the basic as well as advanced techniques in computational statistical physics. Following a general introduction to statistical mechanics and critical phenomena, the various chapters cover Monte Carlo and molecular dynamics simulation methodolog...
Physics Teachers and Students: A Statistical and Historical Analysis of Women
Gregory, Amanda
2009-10-01
Historically, women have been denied an education comparable to that available to men. Since women have been allowed into institutions of higher learning, they have been studying and earning physics degrees. The aim of this poster is to discuss the statistical relationship between the number of women enrolled in university physics programs and the number of female physics faculty members. Special care has been given to examining the statistical data in the context of the social climate at the time that these women were teaching or pursuing their education.
2001-01-01
For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
1999-01-01
For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
Physics-based statistical model and simulation method of RF propagation in urban environments
Pao, Hsueh-Yuan; Dvorak, Steven L.
2010-09-14
A physics-based statistical model and simulation/modeling method and system of electromagnetic wave propagation (wireless communication) in urban environments. In particular, the model is a computationally efficient closed-form parametric model of RF propagation in an urban environment which is extracted from a physics-based statistical wireless channel simulation method and system. The simulation divides the complex urban environment into a network of interconnected urban canyon waveguides which can be analyzed individually; calculates spectral coefficients of modal fields in the waveguides excited by the propagation using a database of statistical impedance boundary conditions which incorporates the complexity of building walls in the propagation model; determines statistical parameters of the calculated modal fields; and determines a parametric propagation model based on the statistical parameters of the calculated modal fields from which predictions of communications capability may be made.
The Practicality of Statistical Physics Handout Based on KKNI and the Constructivist Approach
Sari, S. Y.; Afrizon, R.
2018-04-01
Statistical physics lectures show that: 1) the performance of lecturers, the social climate, students' competence and the soft skills needed at work are in the adequate category; 2) students find statistical physics lectures difficult to follow because the material is abstract; 3) 40.72% of students need further reinforcement in the form of repetition, practice questions and structured tasks; and 4) the depth of the statistical physics material needs to be improved gradually and in a structured way. This indicates that learning materials that accord with the Indonesian National Qualification Framework, or Kerangka Kualifikasi Nasional Indonesia (KKNI), together with an appropriate learning approach, are needed to help lecturers and students. The authors have designed statistical physics handouts that meet the very valid criterion (90.89%) according to expert judgment. In addition, the practicality of the handouts must also be considered so that they are easy to use, interesting and efficient in lectures. The purpose of this research is to determine the practicality of a statistical physics handout based on KKNI and a constructivist approach. This research is part of a research-and-development study using the 4-D model developed by Thiagarajan, and has reached the development-testing part of the Develop stage. Data were collected with a questionnaire distributed to lecturers and students and analysed with descriptive techniques in the form of percentages. The analysis of the questionnaire shows that the statistical physics handout meets the very practical criterion. The conclusion of this study is that statistical physics handouts based on KKNI and a constructivist approach are practical for use in lectures.
Multivariate statistical methods and data mining in particle physics (4/4)
CERN. Geneva
2008-01-01
The lectures will cover multivariate statistical methods and their applications in High Energy Physics. The methods will be viewed in the framework of a statistical test, as used e.g. to discriminate between signal and background events. Topics will include an introduction to the relevant statistical formalism, linear test variables, neural networks, probability density estimation (PDE) methods, kernel-based PDE, decision trees and support vector machines. The methods will be evaluated with respect to criteria relevant to HEP analyses such as statistical power, ease of computation and sensitivity to systematic effects. Simple computer examples that can be extended to more complex analyses will be presented.
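The abstract above lists linear test variables among the multivariate methods. As an illustrative sketch (not taken from the lectures, with entirely synthetic toy data), a Fisher linear discriminant on two-feature "events" shows how such a test variable discriminates signal from background:

```python
# Illustrative sketch: a Fisher linear discriminant t(x) = w.x used as a
# statistical test variable to separate "signal" from "background" events.
# The two-feature Gaussian event samples below are synthetic toy data.
import random

random.seed(1)

def sample(mean, n):
    return [(random.gauss(mean[0], 1.0), random.gauss(mean[1], 1.0)) for _ in range(n)]

signal = sample((2.0, 2.0), 500)      # hypothetical signal events
background = sample((0.0, 0.0), 500)  # hypothetical background events

def mean2(xs):
    n = len(xs)
    return (sum(x for x, _ in xs) / n, sum(y for _, y in xs) / n)

def cov2(xs, m):
    n = len(xs)
    sxx = sum((x - m[0]) ** 2 for x, _ in xs) / n
    syy = sum((y - m[1]) ** 2 for _, y in xs) / n
    sxy = sum((x - m[0]) * (y - m[1]) for x, y in xs) / n
    return (sxx, sxy, syy)

ms, mb = mean2(signal), mean2(background)
cs, cb = cov2(signal, ms), cov2(background, mb)

# Pooled within-class covariance W and Fisher weights w = W^{-1} (ms - mb),
# inverting the 2x2 matrix explicitly.
wxx, wxy, wyy = [(a + b) / 2 for a, b in zip(cs, cb)]
det = wxx * wyy - wxy * wxy
dx, dy = ms[0] - mb[0], ms[1] - mb[1]
w = ((wyy * dx - wxy * dy) / det, (wxx * dy - wxy * dx) / det)

def t(event):
    # Test statistic: larger t favours the signal hypothesis.
    return w[0] * event[0] + w[1] * event[1]

cut = (t(ms) + t(mb)) / 2.0  # simple midpoint cut on the discriminant
eff = sum(t(e) > cut for e in signal) / len(signal)           # signal efficiency
rej = sum(t(e) <= cut for e in background) / len(background)  # background rejection
print(eff, rej)
```

The same cut-on-a-scalar structure carries over to the nonlinear methods in the lecture list (neural networks, decision trees, kernel PDE), which only change how the test variable t(x) is built.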
Multivariate statistical methods and data mining in particle physics (2/4)
CERN. Geneva
2008-01-01
The lectures will cover multivariate statistical methods and their applications in High Energy Physics. The methods will be viewed in the framework of a statistical test, as used e.g. to discriminate between signal and background events. Topics will include an introduction to the relevant statistical formalism, linear test variables, neural networks, probability density estimation (PDE) methods, kernel-based PDE, decision trees and support vector machines. The methods will be evaluated with respect to criteria relevant to HEP analyses such as statistical power, ease of computation and sensitivity to systematic effects. Simple computer examples that can be extended to more complex analyses will be presented.
Multivariate statistical methods and data mining in particle physics (1/4)
CERN. Geneva
2008-01-01
The lectures will cover multivariate statistical methods and their applications in High Energy Physics. The methods will be viewed in the framework of a statistical test, as used e.g. to discriminate between signal and background events. Topics will include an introduction to the relevant statistical formalism, linear test variables, neural networks, probability density estimation (PDE) methods, kernel-based PDE, decision trees and support vector machines. The methods will be evaluated with respect to criteria relevant to HEP analyses such as statistical power, ease of computation and sensitivity to systematic effects. Simple computer examples that can be extended to more complex analyses will be presented.
2003-01-01
For the year 2002, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products
2004-01-01
For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside back cover of the Review shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
Nearing, G. S.
2014-12-01
Statistical models consistently outperform conceptual models in the short term; however, to account for a nonstationary future (or an unobserved past), scientists prefer to base predictions on unchanging and commutable properties of the universe - i.e., physics. The problem with physically based hydrology models is, of course, that they aren't really based on physics - they are based on statistical approximations of physical interactions, and we almost uniformly lack an understanding of the entropy associated with these approximations. Thermodynamics is successful precisely because entropy statistics are computable for homogeneous (well-mixed) systems, and ergodic arguments explain the success of Newton's laws in describing systems that are fundamentally quantum in nature. Unfortunately, similar arguments do not hold for systems like watersheds that are heterogeneous at a wide range of scales. Ray Solomonoff formalized the situation in 1968 by showing that, given infinite evidence, simultaneously minimizing model complexity and entropy in predictions always leads to the best possible model. The open question in hydrology is what happens when we don't have infinite evidence - for example, when the future will not look like the past, or when one watershed does not behave like another. How do we isolate stationary and commutable components of watershed behavior? I propose that one possible answer to this dilemma lies in a formal combination of physics and statistics. In this talk I outline my recent analogue (Solomonoff's theorem was digital) of Solomonoff's idea that allows us to quantify the complexity/entropy tradeoff in a way that is intuitive to physical scientists. I show how to formally combine "physical" and statistical methods for model development in a way that allows us to derive the theoretically best possible model given any given physics approximation(s) and available observations. Finally, I apply an analogue of Solomonoff's theorem to evaluate the
Anosova, Z.P.
1988-01-01
A statistical criterion is proposed for distinguishing between random and physical groupings of stars and galaxies. The criterion is applied to nearby wide multiple stars, triplets of galaxies in the list of Karachentsev, Karachentseva, and Shcherbanovskii, and double galaxies in the list of Dahari, in which the principal components are Seyfert galaxies. Systems that are almost certainly physical, probably physical, probably optical, and almost certainly optical are identified. The limiting difference between the radial velocities of the components of physical multiple galaxies is estimated
Kraut, W. [Duale Hochschule Baden-Wuerttemberg (DHBW), Karlsruhe (Germany). Studiengang Sicherheitswesen]
2016-07-01
The only proper way to describe uncertainties in health physics is by statistical means. But statistics can never replace your personal evaluation of an effect, nor can statistics transmute randomness into certainty like an 'uncertainty laundry'. The paper discusses these problems in routine practical work.
Classical Methods of Statistics With Applications in Fusion-Oriented Plasma Physics
Kardaun, Otto J W F
2005-01-01
Classical Methods of Statistics is a blend of theory and practical statistical methods written for graduate students and researchers interested in applications to plasma physics and its experimental aspects. It can also fruitfully be used by students majoring in probability theory and statistics. In the first part, the mathematical framework and some of the history of the subject are described. Many exercises help readers to understand the underlying concepts. In the second part, two case studies are presented, exemplifying discriminant analysis and multivariate profile analysis. The introductions to these case studies outline the context of magnetic plasma fusion research. In the third part, an overview of statistical software is given and, in particular, SAS and S-PLUS are discussed. In the last chapter, several datasets with guided exercises, predominantly from the ASDEX Upgrade tokamak, are included and their physical background is concisely described. The book concludes with a list of essential keyword transl...
Identification of AE Bursts by Classification of Physical and Statistical Parameters
Mieza, J.I.; Oliveto, M.E.; Lopez Pumarega, M.I.; Armeite, M.; Ruzzante, J.E.; Piotrkowski, R.
2005-01-01
Physical and statistical parameters obtained with the principal components method, extracted from acoustic emission bursts recorded during triaxial deformation tests, were analyzed. The samples came from seamless steel tubes used in the petroleum industry, and some of them were provided with a protective coating. The purpose of our work was to distinguish bursts originating in the breakage of the coating from those originating in damage mechanisms in the bulk steel matrix. The analysis was performed with statistical distributions, fractal analysis and clustering methods.
Statistical Plasma Physics in a Strong Magnetic Field: Paradigms and Problems
J.A. Krommes
2004-03-19
An overview is given of certain aspects of fundamental statistical theories as applied to strongly magnetized plasmas. Emphasis is given to the gyrokinetic formalism, the historical development of realizable Markovian closures, and recent results in the statistical theory of turbulent generation of long-wavelength flows that generalize and provide further physical insight to classic calculations of eddy viscosity. A Hamiltonian formulation of turbulent flow generation is described and argued to be very useful.
Introduction to modern theoretical physics. Volume II. Quantum theory and statistical physics
Harris, E.G.
1975-01-01
The topics discussed include the history and principles, some solvable problems, and symmetry in quantum mechanics, interference phenomena, approximation methods, some applications of nonrelativistic quantum mechanics, relativistic wave equations, quantum theory of radiation, second quantization, elementary particles and their interactions, thermodynamics, equilibrium statistical mechanics and its applications, the kinetic theory of gases, and collective phenomena
A new universality class in corpus of texts; A statistical physics study
Najafi, Elham; Darooneh, Amir H.
2018-05-01
Text can be regarded as a complex system, and several methods from statistical physics can be used to study it. In this work, by means of statistical physics methods, we reveal new universal behaviors of texts associated with the fractality values of words in a text. The fractality measure indicates the importance of words in a text by considering the distribution pattern of words throughout the text. We observe a power-law relation between the fractality of a text and its vocabulary size, for both texts and corpora. We also observe this behavior in biological data.
Information Theory - The Bridge Connecting Bounded Rational Game Theory and Statistical Physics
Wolpert, David H.
2005-01-01
A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality of all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. This paper shows that the same information theoretic mathematical structure, known as Product Distribution (PD) theory, addresses both issues. In this, PD theory not only provides a principled formulation of bounded rationality and a set of new types of mean field theory in statistical physics; it also shows that those topics are fundamentally one and the same.
Statistical physics inspired energy-efficient coded-modulation for optical communications.
Djordjevic, Ivan B; Xu, Lei; Wang, Ting
2012-04-15
Because Shannon's entropy can be obtained by Stirling's approximation of thermodynamics entropy, the statistical physics energy minimization methods are directly applicable to the signal constellation design. We demonstrate that statistical physics inspired energy-efficient (EE) signal constellation designs, in combination with large-girth low-density parity-check (LDPC) codes, significantly outperform conventional LDPC-coded polarization-division multiplexed quadrature amplitude modulation schemes. We also describe an EE signal constellation design algorithm. Finally, we propose the discrete-time implementation of D-dimensional transceiver and corresponding EE polarization-division multiplexed system. © 2012 Optical Society of America
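The link the abstract invokes between Shannon's entropy and thermodynamic entropy is the standard textbook derivation (not specific to this paper): applying Stirling's approximation $\ln N! \approx N\ln N - N$ to the multiplicity $W$ of a sequence of $N$ symbols with occupation numbers $n_i$ (where $\sum_i n_i = N$) gives

```latex
\frac{S}{k_B} = \ln W = \ln \frac{N!}{\prod_i n_i!}
\approx (N\ln N - N) - \sum_i \left(n_i \ln n_i - n_i\right)
= -N \sum_i p_i \ln p_i, \qquad p_i = \frac{n_i}{N},
```

so the thermodynamic entropy per symbol, $S/(k_B N)$, reduces to the Shannon entropy $H = -\sum_i p_i \ln p_i$, which is why thermodynamic energy minimization machinery can be carried over to constellation design.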
Surveillance of Tuberculosis in Taipei: The Influence of Nontuberculous Mycobacteria.
Chen-Yuan Chiang
Notification of tuberculosis (TB) but not nontuberculous mycobacteria (NTM) is mandatory in Taiwan. Partly due to the strict regulation on TB notification, several patients infected with NTM were notified as TB cases. Notification of patients infected with NTM as TB cases can trigger public health actions and impose additional burdens on the public health system. We conducted a study to assess the influence of NTM infection on surveillance of TB in Taipei. The study population included all individuals with a positive culture for Mycobacterium who were citizens of Taipei City and notified as TB cases in the calendar years 2007-2010. Of the 4216 notified culture-positive tuberculosis (TB) cases, 894 (21.2%) were infected with NTM. The average annual reported case rate of infection with NTM was 8.6 (95% confidence interval 7.7-9.4) per 100,000 people. The reported case rate of NTM increased with age in both males and females. The proportion of reported TB cases infected with NTM was significantly higher in females than in males (27.6% vs 17.8%, adjusted OR (adjOR) 1.93, 95% confidence interval (CI) 1.63-2.28); in smear-positive than in smear-negative (23.1% vs 19.2%, adjOR 1.26, 95% CI 1.08-1.47); and in previously treated cases than in new cases (35.7% vs 19.1%, adjOR 2.30, 95% CI 1.88-2.82). The most frequent species was M. avium complex (32.4%), followed by M. chelonae complex (17.6%), M. fortuitum complex (17.0%) and M. kansasii (9.8%). Of the 890 notified NTM cases assessed, 703 (79.0%) were treated with anti-TB drugs, and 730 (82.0%) were de-notified. The influence of NTM on surveillance of TB in Taipei was substantial. Health authorities should take action to ensure that nucleic acid amplification tests are performed in all smear-positive cases in a timely manner to reduce the misdiagnosis of patients infected with NTM as TB cases.
Surveillance of Tuberculosis in Taipei: The Influence of Nontuberculous Mycobacteria.
Chiang, Chen-Yuan; Yu, Ming-Chih; Yang, Shiang-Lin; Yen, Muh-Yong; Bai, Kuan-Jen
2015-01-01
Notification of tuberculosis (TB) but not nontuberculous mycobacteria (NTM) is mandatory in Taiwan. Partly due to the strict regulation on TB notification, several patients infected with NTM were notified as TB cases. Notification of patients infected with NTM as TB cases can trigger public health actions and impose additional burdens on the public health system. We conducted a study to assess the influence of NTM infection on surveillance of TB in Taipei. The study population included all individuals with a positive culture for Mycobacterium who were citizens of Taipei City and notified as TB cases in the calendar years 2007-2010. Of the 4216 notified culture-positive tuberculosis (TB) cases, 894 (21.2%) were infected with NTM. The average annual reported case rate of infection with NTM was 8.6 (95% confidence interval 7.7-9.4) per 100,000 people. The reported case rate of NTM increased with age in both males and females. The proportion of reported TB cases infected with NTM was significantly higher in females than in males (27.6% vs 17.8%, adjusted OR (adjOR) 1.93, 95% confidence interval (CI) 1.63-2.28); in smear-positive than in smear-negative (23.1% vs 19.2%, adjOR 1.26, 95% CI 1.08-1.47); and in previously treated cases than in new cases (35.7% vs 19.1%, adjOR 2.30, 95% CI 1.88-2.82). The most frequent species was M. avium complex (32.4%), followed by M. chelonae complex (17.6%), M. fortuitum complex (17.0%) and M. kansasii (9.8%). Of the 890 notified NTM cases assessed, 703 (79.0%) were treated with anti-TB drugs, and 730 (82.0%) were de-notified. The influence of NTM on surveillance of TB in Taipei was substantial. Health authorities should take action to ensure that nucleic acid amplification tests are performed in all smear-positive cases in a timely manner to reduce the misdiagnosis of patients infected with NTM as TB cases.
Molecular epidemiology and evolutionary genetics of Mycobacterium tuberculosis in Taipei.
Dou, Horng-Yunn; Tseng, Fan-Chen; Lin, Chih-Wei; Chang, Jia-Ru; Sun, Jun-Ren; Tsai, Wen-Shing; Lee, Shi-Yi; Su, Ih-Jen; Lu, Jang-Jih
2008-12-22
The control of tuberculosis in densely populated cities is complicated by close human-to-human contacts and potential transmission of pathogens from multiple sources. We conducted a molecular epidemiologic analysis of 356 Mycobacterium tuberculosis (MTB) isolates from patients presenting pulmonary tuberculosis in metropolitan Taipei. Classical antibiogram studies and genetic characterization, using mycobacterial interspersed repetitive-unit-variable-number tandem-repeat (MIRU-VNTR) typing and spoligotyping, were applied after culture. A total of 356 isolates were genotyped by standard spoligotyping and the strains were compared against the international spoligotyping database (SpolDB4). All isolates were also categorized using the 15-locus MIRU-VNTR typing method, combined with NTF locus and RD deletion analyses. Of the 356 isolates spoligotyped, 290 (81.4%) displayed known spoligotypes and 66 were not identified in the database. The major spoligotypes found were Beijing lineages (52.5%), followed by Haarlem lineages (13.5%) and EAI plus EAI-like lineages (11%). When MIRU-VNTR was employed, 140 patterns were identified, comprising 36 clusters of 252 isolates and 104 unique patterns; the largest cluster comprised 95 isolates from the Beijing family. The combination of spoligotyping and MIRU-VNTR revealed that 236 (67%) of the 356 isolates were clustered in 43 genotypes. Strains of the Beijing family were more likely to be modern strains and showed a higher percentage of multidrug resistance than all other families combined (P = 0.08). Patients infected with Beijing strains were younger than those with other strains (mean 58.7 vs. 64.2, p = 0.02). Moreover, 85.3% of infected persons younger than 25 years carried a modern Beijing strain, suggesting a possible recent spread of this family of TB strain in the young population in Taipei. Our data on MTB genotypes in Taipei suggest that MTB infection has not been optimally controlled. Control efforts should be reinforced in view of the
UNDERSTANDING VISITOR EXPERIENCES AND MOTIVATIONS IN SUBURBAN TAIPEI
Chiung-Tzu Lucetta TSAI
2016-09-01
This research aims to cultivate higher-qualified human resources within the tourism field and to provide planning and development direction based on an understanding of tourism features in the San-ying area. There is growing research interest in understanding individual consumers' preferences, as well as management approaches to experiences; this study therefore explores the many different facets of experiences in the tourism and hospitality business in suburban Taipei, in particular in the Sanxia and Yingge area. It examines the service quality of tourist attractions, as well as the perceptions and travel experiences of tourists who visit the Sanxia and Yingge area. Tourism and hospitality businesses in the Sanxia and Yingge area present cultural images, and this study discusses how these have influenced tourists' experiences, motivation and consumer behavior during their visit.
Peterlin, Primoz
2010-01-01
Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared.
Peterlin, Primoz
2010-01-01
Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared.
Paprotny, D.; Morales Napoles, O.; Jonkman, S.N.
2017-01-01
Flood hazard is currently being researched on continental and global scales, using models of increasing complexity. In this paper we investigate a different, simplified approach, which combines statistical and physical models in place of conventional rainfall-run-off models to carry out flood
Competitive agents in a market: Statistical physics of the minority game
Sherrington, David
2007-10-01
A brief review is presented of the minority game, a simple frustrated many-body system stimulated by considerations of a market of competitive speculative agents. Its cooperative behaviour exhibits phase transitions and both ergodic and non-ergodic regimes. It provides novel challenges to statistical physics, reminiscent of those of mean-field spin glasses.
Puzzles in physics
Hsiang-Nan Li
Pramana - Journal of Physics, Volume 67, Issue 5. Author affiliations: Institute of Physics, Academia Sinica, Taipei, Taiwan 115, Republic of China; Department of Physics, National Cheng-Kung University, Tainan, Taiwan 701, Republic of China
Methods and applications of statistics in engineering, quality control, and the physical sciences
Balakrishnan, N
2011-01-01
Inspired by the Encyclopedia of Statistical Sciences, Second Edition (ESS2e), this volume presents a concise, well-rounded focus on the statistical concepts and applications that are essential for understanding gathered data in the fields of engineering, quality control, and the physical sciences. The book successfully upholds the goals of ESS2e by combining both previously-published and newly developed contributions written by over 100 leading academics, researchers, and practitioners in a comprehensive, approachable format. The result is a succinct reference that unveils modern, cutting-edge approaches to acquiring and analyzing data across diverse subject areas within these three disciplines, including operations research, chemistry, physics, the earth sciences, electrical engineering, and quality assurance. In addition, techniques related to survey methodology, computational statistics, and operations research are discussed, where applicable. Topics of coverage include: optimal and stochastic control, arti...
Oliveira, Sérgio C.; Zêzere, José L.; Lajas, Sara; Melo, Raquel
2017-07-01
Approaches used to assess shallow slide susceptibility at the basin scale are conceptually different depending on the use of statistical or physically based methods. The former are based on the assumption that the same causes are more likely to produce the same effects, whereas the latter are based on the comparison between forces which tend to promote movement along the slope and the counteracting forces that are resistant to motion. Within this general framework, this work tests two hypotheses: (i) although conceptually and methodologically distinct, the statistical and deterministic methods generate similar shallow slide susceptibility results regarding the model's predictive capacity and spatial agreement; and (ii) the combination of shallow slide susceptibility maps obtained with statistical and physically based methods, for the same study area, generate a more reliable susceptibility model for shallow slide occurrence. These hypotheses were tested at a small test site (13.9 km2) located north of Lisbon (Portugal), using a statistical method (the information value method, IV) and a physically based method (the infinite slope method, IS). The landslide susceptibility maps produced with the statistical and deterministic methods were combined into a new landslide susceptibility map. The latter was based on a set of integration rules defined by the cross tabulation of the susceptibility classes of both maps and analysis of the corresponding contingency tables. The results demonstrate a higher predictive capacity of the new shallow slide susceptibility map, which combines the independent results obtained with statistical and physically based models. Moreover, the combination of the two models allowed the identification of areas where the results of the information value and the infinite slope methods are contradictory. Thus, these areas were classified as uncertain and deserve additional investigation at a more detailed scale.
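The combination step described above (cross tabulating the susceptibility classes of the two maps and applying integration rules) can be sketched in a few lines. This is an illustrative toy only: the class labels, the tiny maps and the integration rule below are hypothetical, not the rules used in the study.

```python
# Illustrative sketch: combining a statistical (IV) and a physically based
# (IS) susceptibility map by cross tabulation. Maps, classes and the
# integration rule are hypothetical stand-ins for the paper's own rules.
from collections import Counter

iv_map = ["low", "high", "high", "medium", "low", "high"]    # statistical (IV) classes
is_map = ["low", "high", "medium", "medium", "high", "low"]  # physically based (IS) classes

# Contingency table of class pairs across all map units
table = Counter(zip(iv_map, is_map))

RANK = {"low": 0, "medium": 1, "high": 2}

def combine(a, b):
    """Hypothetical integration rule: agreeing or adjacent classes take the
    higher class; strongly contradictory pairs are flagged as uncertain."""
    if abs(RANK[a] - RANK[b]) >= 2:
        return "uncertain"
    return a if RANK[a] >= RANK[b] else b

combined = [combine(a, b) for a, b in zip(iv_map, is_map)]
print(table)
print(combined)  # units where the two models contradict become "uncertain"
```

The "uncertain" class plays the role of the areas the authors flag for additional, more detailed investigation.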
Estimation of Lithological Classification in Taipei Basin: A Bayesian Maximum Entropy Method
Wu, Meng-Ting; Lin, Yuan-Chien; Yu, Hwa-Lung
2015-04-01
In environmental and other scientific applications, we must have a certain understanding of the lithological composition of the subsurface. Because of practical restrictions, only a limited amount of data can be acquired. To find the lithological distribution in a study area, many spatial statistical methods are used to estimate the lithological composition at unsampled points or grids. This study applied the Bayesian Maximum Entropy (BME) method, an emerging method in geological spatiotemporal statistics. The BME method can identify the spatiotemporal correlation of the data and combine not only hard data but also soft data to improve estimation. Because lithological classification data are discrete categorical data, this research applied categorical BME to establish a complete three-dimensional lithological estimation model, using the limited hard data from cores and the soft data generated from geological dating data and virtual wells to estimate the three-dimensional lithological classification in the Taipei Basin. Keywords: Categorical Bayesian Maximum Entropy method, Lithological Classification, Hydrogeological Setting
[The community-oriented experience of early intervention services in Taipei City].
Chu, Feng-Ying
2007-10-01
The purpose of this paper is to emphasize the importance of early intervention. The purpose of early intervention in Taipei City is to help child development, promote parenting skills, and reduce educational and social costs. In order to meet these goals, parenting groups and Taipei City Council have made great efforts to make early intervention work in Taipei City. In April 1995, Taipei City Government started planning and setting up the service network. To date, Taipei City has set up one reporting and referral center, six community resource centers, 22 medical assessment and intervention clinics, 12 child development centers, one early intervention training center, three non-profit foundations and more than 300 inclusion schools, such as kindergartens and day care centers. With parent participation, professional devotion and Taipei City Government's commitment, the number of assisted children has increased from 98 to 2,523 per year. By the end of 2006, Taipei had already funded 25,277 children. We estimate that Taipei City early intervention services have affected at least 75,000 persons, including developmentally delayed and disabled children and their parents, grandparents and siblings. We found that early intervention services help the children to build up self-esteem, grow their potential, learn how to socialize, and receive an education, while the most important aim is to help them to reduce their level of disability or to prevent them from getting worse. At the same time, their families get support and a diverse range of services. An integrated early intervention program should include children, families, and multidisciplinary professionals. The system should therefore be more "family-centered" and "community-oriented" to provide appropriate services to children and families through a positive and aggressive attitude.
Reflections on Gibbs: From Statistical Physics to the Amistad V3.0
Kadanoff, Leo P.
2014-07-01
This note is based upon a talk given at an APS meeting in celebration of the achievements of J. Willard Gibbs. J. Willard Gibbs, the younger, was the first American physical sciences theorist. He was one of the inventors of statistical physics. He introduced and developed the concepts of phase space, phase transitions, and thermodynamic surfaces in a remarkably correct and elegant manner. These three concepts form the basis of different areas of physics. The connection among these areas has been a subject of deep reflection from Gibbs' time to our own. This talk therefore celebrated Gibbs by describing modern ideas about how different parts of physics fit together. I finished with a more personal note. Our own J. Willard Gibbs had all his many achievements concentrated in science. His father, also J. Willard Gibbs, also a Professor at Yale, had one great non-academic achievement that remains unmatched in our day. I describe it.
Statistical panorama of female physics graduate students for 2000-2010 in Peru
Cerón Loayza, María Luisa; Bravo Cabrejos, Jorge Aurelio
2013-03-01
We report the results of a statistical study on the number of women entering the undergraduate and master's programs in physics at Universidad Nacional Mayor de San Marcos in Peru. From 2006 through 2010, 13 female students entered the master's degree program, but no females graduated with the degree. Considering that Peru is a developing country, a career in physics is not considered an attractive professional choice even for male students, because it is thought that there are no work centers in which to practice this profession. We recommend that the causes preventing female physics students from completing their studies and research work be analyzed, and that strategies be planned to help women complete their academic work. We are considering getting help from the Peruvian Physics Society (SOPERFI) in order to draw more attention to our plan.
Synthetic Earthquake Statistics From Physical Fault Models for the Lower Rhine Embayment
Brietzke, G. B.; Hainzl, S.; Zöller, G.
2012-04-01
As of today, seismic risk and hazard estimates mostly use purely empirical, stochastic models of earthquake fault systems, tuned specifically to the vulnerable areas of interest. Although such models allow for reasonable risk estimates, they fail to provide a link between the observed seismicity and the underlying physical processes. Solving a fully dynamic, state-of-the-art description of all relevant physical processes related to earthquake fault systems is likely not feasible, since it comes with a large number of degrees of freedom, poor constraints on its model parameters, and a huge computational effort. Here, quasi-static and quasi-dynamic physical fault simulators provide a compromise between physical completeness and computational affordability, and aim at providing a link between basic physical concepts and the statistics of seismicity. Within the framework of quasi-static and quasi-dynamic earthquake simulators we investigate a model of the Lower Rhine Embayment (LRE) that is based upon seismological and geological data. We present and discuss statistics of the spatio-temporal behavior of generated synthetic earthquake catalogs with respect to simplification (e.g. simple two-fault cases) as well as to complication (e.g. hidden faults, geometric complexity, heterogeneities of constitutive parameters).
Micro-foundations for macroeconomics: New set-up based on statistical physics
Yoshikawa, Hiroshi
2016-12-01
Modern macroeconomics is built on "micro foundations": the optimization problem of a micro agent, such as a consumer or a firm, is explicitly analyzed in the model. Toward this goal, the standard model presumes a "representative" consumer/firm and analyzes its behavior in detail. However, the macroeconomy consists of on the order of 10^7 consumers and 10^6 firms. For the purpose of analyzing such a macro system, it is meaningless to pursue micro behavior in detail. In this respect, there is no essential difference between economics and physics. The method of statistical physics can be usefully applied to the macroeconomy, and provides Keynesian economics with correct micro-foundations.
Perspectives and challenges in statistical physics and complex systems for the next decade
Raposo, Ernesto P; Gomes Eleutério da Luz, Marcos
2014-01-01
Statistical Physics (SP) has followed an unusual evolutionary path in science. Originally aiming to provide a fundamental basis for another important branch of Physics, namely Thermodynamics, SP gradually became an independent field of research in its own right. But despite more than a century of steady progress, there are still plenty of challenges and open questions in the SP realm. In fact, the area is still rapidly evolving, in contrast to other branches of science, which already have well defined scopes and borderlines of applicability. This difference is due to the steadily expanding num
Indoor air quality in hairdressing salons in Taipei.
Chang, C-J; Cheng, S-F; Chang, P-T; Tsai, S-W
2018-01-01
To improve indoor air quality and to protect public health, Taiwan enacted the "Indoor Air Quality Act (IAQ Act)" in 2012. For the general public, the indoor air quality in hair salons is important because they are popular locations that people often visit for hair treatments. However, only a few exposure assessments regarding air pollutants have previously been performed in hair salons. To assess the air quality of hairdressing environments in Taipei, ten hairdressing salons were included in a walk-through survey in this study. In addition, the airborne concentrations of formaldehyde, volatile organic compounds (VOCs), CO₂, and phthalate esters were determined in 5 salons. Charcoal, XAD-2, and OVS-Tenax tubes were used for the air sampling, and the samples were analyzed by gas chromatography/mass spectrometry. It was found that the products used in hair salons contained various chemicals. In fact, from the walk-through survey, a total of 387 different ingredients were found on 129 hair product labels. The hair salons were not well ventilated, with CO₂ levels of 600 to 3576 ppm. The formaldehyde concentrations determined in this study ranged from 12.40 to 1.04 × 10³ μg m⁻³, and the maximum level was above the permissible exposure limit (PEL) of the US Occupational Safety and Health Administration (US OSHA). Additionally, 83% of the samples had levels higher than the standard regulated by Taiwan's IAQ Act. The concentrations of VOCs and phthalate esters were below the occupational exposure limits (OELs), but higher than what was found in general residential environments. The hair products were considered the major source of air pollutants because significantly higher concentrations were found around the working areas. The number of perming treatments, the number of workers, and the frequency of using formaldehyde-releasing products were found to be associated with the levels of formaldehyde. This study indicates that efforts are
Examples of the Application of Nonparametric Information Geometry to Statistical Physics
Giovanni Pistone
2013-09-01
We review a nonparametric version of Amari's information geometry in which the set of positive probability densities on a given sample space is endowed with an atlas of charts to form a differentiable manifold modeled on Orlicz Banach spaces. This nonparametric setting is used to discuss typical problems in machine learning and statistical physics, such as black-box optimization, Kullback-Leibler divergence, Boltzmann-Gibbs entropy and the Boltzmann equation.
New exponential, logarithm and q-probability in the non-extensive statistical physics
Chung, Won Sang
2013-01-01
In this paper, a new exponential and logarithm related to non-extensive statistical physics are proposed by using the q-sum and q-product, which satisfy distributivity. We also discuss the q-mapping from an ordinary probability to a q-probability. The q-entropy defined through the q-probability is shown to be q-additive.
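For orientation, the standard Tsallis q-logarithm and q-exponential, the baseline that such non-extensive constructions generalize, can be sketched as follows; this is not necessarily the new definition proposed in the paper, and the parameter values are illustrative.

```python
import math

def q_log(x, q):
    """Tsallis q-logarithm; reduces to ln(x) in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def q_exp(x, q):
    """Tsallis q-exponential, the inverse of q_log on its domain."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

# The pair are mutual inverses for admissible arguments:
x, q = 2.5, 0.7
assert abs(q_exp(q_log(x, q), q) - x) < 1e-9
```

For q ≠ 1 the q-exponential decays as a power law rather than exponentially, which is what makes it useful for heavy-tailed distributions.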
Quantum Entropy and Its Applications to Quantum Communication and Statistical Physics
Masanori Ohya
2010-05-01
Quantum entropy is a fundamental concept for quantum information, recently developed in various directions. We review the mathematical aspects of quantum entropy (entropies) and discuss some applications to quantum communication and statistical physics. All topics treated here are somehow related to the quantum entropy that the present authors have studied. Many other fields recently developed in quantum information theory, such as quantum algorithms, quantum teleportation, quantum cryptography, etc., are fully discussed in the book (reference number 60).
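The central quantity here, the von Neumann entropy S(ρ) = −Tr(ρ log ρ), is easy to compute from the eigenvalues of a density matrix. A minimal sketch, assuming NumPy; the two density matrices below are standard illustrative examples, not taken from the review:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho) in nats, from the spectrum of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # convention: 0 * log 0 = 0
    return float(-np.sum(evals * np.log(evals)))

# A pure state has zero entropy; the maximally mixed qubit has log 2.
pure = np.array([[1.0, 0.0], [0.0, 0.0]])
mixed = np.eye(2) / 2.0
```

The same routine applies to any finite-dimensional density matrix, which is why entropy calculations in quantum communication usually reduce to an eigenvalue problem.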
Beach, Shaun E.; Semkow, Thomas M.; Remling, David J.; Bradt, Clayton J.
2017-07-01
We have developed accessible methods to demonstrate fundamental statistics in several phenomena, in the context of teaching electronic signal processing in a physics-based college-level curriculum. A relationship between the exponential time-interval distribution and Poisson counting distribution for a Markov process with constant rate is derived in a novel way and demonstrated using nuclear counting. Negative binomial statistics is demonstrated as a model for overdispersion and justified by the effect of electronic noise in nuclear counting. The statistics of digital packets on a computer network are shown to be compatible with the fractal-point stochastic process leading to a power-law as well as generalized inverse Gaussian density distributions of time intervals between packets.
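The exponential-interval/Poisson-count relationship described above can be checked numerically with the standard library alone; the rate and window count below are illustrative, not from the paper.

```python
import random

random.seed(1)
rate, n_windows = 5.0, 20000     # events per unit time; unit-length windows

# A constant-rate Markov process: exponential inter-arrival times,
# accumulated into absolute event times.
t, events = 0.0, []
while t < n_windows:
    t += random.expovariate(rate)
    events.append(t)

# Counting events per unit window should give Poisson statistics,
# whose signature is a variance-to-mean ratio of 1.
counts = [0] * n_windows
for e in events:
    if e < n_windows:
        counts[int(e)] += 1

mean = sum(counts) / n_windows
var = sum((c - mean) ** 2 for c in counts) / n_windows
```

Overdispersion (variance/mean noticeably above 1), as produced by electronic noise in the paper's nuclear-counting demonstration, would show up directly in this ratio.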
Hart, Carl R; Reznicek, Nathan J; Wilson, D Keith; Pettit, Chris L; Nykaza, Edward T
2016-05-01
Many outdoor sound propagation models exist, ranging from highly complex physics-based simulations to simplified engineering calculations, and more recently, highly flexible statistical learning methods. Several engineering and statistical learning models are evaluated by using a particular physics-based model, namely, a Crank-Nicolson parabolic equation (CNPE), as a benchmark. Narrowband transmission loss values predicted with the CNPE, based upon a simulated data set of meteorological, boundary, and source conditions, act as simulated observations. In the simulated data set, sound propagation conditions span from downward refracting to upward refracting, for acoustically hard and soft boundaries, and low frequencies. Engineering models used in the comparisons include the ISO 9613-2 method, Harmonoise, and Nord2000 propagation models. Statistical learning methods used in the comparisons include bagged decision tree regression, random forest regression, boosting regression, and artificial neural network models. Computed skill scores are relative to sound propagation in a homogeneous atmosphere over a rigid ground. Overall skill scores for the engineering noise models are 0.6%, -7.1%, and 83.8% for the ISO 9613-2, Harmonoise, and Nord2000 models, respectively. Overall skill scores for the statistical learning models are 99.5%, 99.5%, 99.6%, and 99.6% for bagged decision tree, random forest, boosting, and artificial neural network regression models, respectively.
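The abstract does not spell out its skill-score definition; a common mean-squared-error-based form, assumed here for illustration and not necessarily the authors' exact metric, is the fractional improvement over the reference prediction:

```python
def skill_score(pred, obs, ref):
    """1 - MSE(pred)/MSE(ref): 1.0 is a perfect model, 0 matches the
    reference, and negative values are worse than the reference."""
    mse = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
    return 1.0 - mse(pred, obs) / mse(ref, obs)

obs = [80.0, 75.0, 90.0, 60.0]    # hypothetical transmission losses (dB)
ref = [70.0, 70.0, 70.0, 70.0]    # homogeneous-atmosphere baseline
good = [79.0, 76.0, 89.0, 61.0]   # a model tracking the observations
```

On this reading, the near-100% scores of the learning models mean their residual error is a tiny fraction of the baseline's, while Harmonoise's -7.1% means it does slightly worse than the homogeneous-atmosphere reference.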
Chung-Jung FU
2015-10-01
Background: Infection by Toxocara spp. is known to be significantly associated with partial epilepsy. It has become popular for people to raise dogs/cats as pets and consume roasted meat/viscera, and the status of Toxocara spp. infection, epilepsy awareness, and associated risk factors among the general population are currently unknown in Taiwan. Methods: A seroepidemiological investigation among 203 college students (CSs), consisting of 110 males and 93 females with an average age of 21.5 ± 1.2 years, was conducted in 2009 in Taipei City. A Western blot analysis based on excretory-secretory antigens derived from Toxocara canis larvae (TcESs) was applied to determine the positivity of serum immunoglobulin G antibodies. A self-administered questionnaire was also given to obtain information about demographic characteristics, epilepsy awareness, and risk factors. A logistic regression model was applied for the statistical analysis using SPSS software. Results: The overall seropositive rate of Toxocara spp. infection was 8.4% (17/203). As to epilepsy awareness, a non-significantly higher seroprevalence was found in CSs who claimed to "know" about epilepsy compared to those who did not know (P > 0.05). Conclusions: It appears that appropriate educational programs are urgently needed to provide correct knowledge related to the prevention and control measures against Toxocara spp. infections to avoid potential threats by this parasite to the general population in Taiwan.
Cooling effect of rivers on metropolitan Taipei using remote sensing.
Chen, Yen-Chang; Tan, Chih-Hung; Wei, Chiang; Su, Zi-Wen
2014-01-23
This study applied remote sensing technology to analyze how rivers in the urban environment affect the surface temperature of their ambient areas. While surface meteorological stations can supply accurate data points in the city, remote sensing can provide such data in a two-dimensional (2-D) manner. The goal of this paper is to apply the remote sensing technique to further our understanding of the relationship between the surface temperature and rivers in urban areas. The 2-D surface temperature data was retrieved from Landsat-7 thermal infrared images, while data collected by Formosat-2 was used to categorize the land uses in the urban area. The land surface temperature distribution is simulated by a sigmoid function with nonlinear regression analysis. Combining the aforementioned data, the range of effect on the surface temperature from rivers can be derived. With the remote sensing data collected for the Taipei Metropolitan area, factors affecting the surface temperature were explored. The results indicated that the effect on developed areas was less significant than on the ambient natural zones; moreover, the size of the buffer zone between the river and the city, such as wetlands or flood plain, was found to correlate with the distance over which the river affects surface temperature.
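A sigmoid temperature-distance profile like the one fitted in the study can be sketched as follows; all parameter values are hypothetical, chosen only to show how a "range of effect" falls out of a fitted curve.

```python
import math

def surface_temp(d, t_river=24.0, t_urban=30.0, d0=150.0, s=40.0):
    """Hypothetical sigmoid profile of land-surface temperature (deg C)
    versus distance d (m) from the river bank."""
    return t_river + (t_urban - t_river) / (1.0 + math.exp(-(d - d0) / s))

def affected_distance(frac=0.95, t_river=24.0, t_urban=30.0,
                      lo=0.0, hi=2000.0):
    """Distance at which temperature has recovered to a fraction `frac`
    of the urban-river contrast, found by bisection on the sigmoid."""
    target = t_river + frac * (t_urban - t_river)
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if surface_temp(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Beyond `affected_distance(0.95)` the river's cooling deficit is under 5% of its riverside value, which is one natural way to quantify the cooling range the abstract refers to.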
Factors affecting yearly and monthly visits to Taipei Zoo
Su, Ai-Tsen; Lin, Yann-Jou
2018-02-01
This study investigated factors affecting yearly and monthly numbers of visits to Taipei Zoo. Both linear and nonlinear regression models were used to estimate yearly visits. The results of both models showed that the "opening effect" and "animal star effect" had a significantly positive effect on yearly visits, while a SARS outbreak had a negative effect. The number of years had a significant influence on yearly visits. The results showed that the nonlinear model had better explanatory power and fitted the variations in visits better. The results of the monthly model showed that monthly visits were significantly influenced by time fluctuations, weather conditions, and the animal star effect. Chinese New Year, summer vacation, the number of holidays, and animal star exhibitions increased the number of monthly visits, while the number of days with temperatures at or below 15 °C, the number of days with temperatures at or above 30 °C, and the number of rainy days had significantly negative effects. Furthermore, the model of monthly visits showed that the animal star effect could last for over two quarters. The results of this study clarify the factors affecting visits to an outdoor recreation site and confirm the importance of meteorological factors to recreation use.
Roessler, U.
1992-01-01
This volume contains a selection of plenary and invited lectures of the Solid State Division spring meeting of the DPG (Deutsche Physikalische Gesellschaft) 1992 in Regensburg. The contributions come mainly from five fields of the physics of condensed matter: doped fullerenes and high-Tc superconductors, surfaces, time-resolved and nonlinear optics, polymer melts, and low-dimensional semiconductor systems. (orig.)
Elżbieta Biernat
2014-12-01
Background: The aim of this paper is to assess whether basic descriptive statistics is sufficient to interpret the data on physical activity of Poles within the occupational domain of life. Material and Methods: The study group consisted of 964 randomly selected Polish working professionals. The long version of the International Physical Activity Questionnaire (IPAQ) was used. Descriptive statistics included characteristics of variables using: mean (M), median (Me), maximal and minimal values (max–min), standard deviation (SD) and percentile values. Statistical inference was based on the comparison of variables with the significance level of 0.05 (Kruskal-Wallis and Pearson's Chi² tests). Results: Occupational physical activity (OPA) was declared by 46.4% of respondents (vigorous – 23.5%, moderate – 30.2%, walking – 39.5%). The total OPA amounted to 2751.1 MET-min/week (Metabolic Equivalent of Task) with a very high standard deviation (SD = 5302.8) and max = 35,511 MET-min/week. It concerned different types of activities. Approximately 10% (above the 90th percentile) overstated the average. However, there was no significant difference depending on the character of the profession or the type of activity. The average time of sitting was 256 min/day. As many as 39% of the respondents met the World Health Organization standards only due to OPA (42.5% of white-collar workers, 38% of administrative and technical employees and only 37.9% of physical workers). Conclusions: In the data analysis it is necessary to define quantiles to provide a fuller picture of the distributions of OPA in MET-min/week. It is also crucial to update the guidelines for data processing and analysis of the long version of the IPAQ. It seems that 16 h of activity/day is not a sufficient criterion for excluding results from further analysis. Med Pr 2014;65(6):743–753
PHYSICS OF NON-GAUSSIAN FIELDS AND THE COSMOLOGICAL GENUS STATISTIC
James, J. Berian
2012-01-01
We report a technique to calculate the impact of distinct physical processes inducing non-Gaussianity on the cosmological density field. A natural decomposition of the cosmic genus statistic into an orthogonal polynomial sequence allows complete expression of the scale-dependent evolution of the topology of large-scale structure, in which effects including galaxy bias, nonlinear gravitational evolution, and primordial non-Gaussianity may be delineated. The relationship of this decomposition to previous methods for analyzing the genus statistic is briefly considered and the following applications are made: (1) the expression of certain systematics affecting topological measurements, (2) the quantification of broad deformations from Gaussianity that appear in the genus statistic as measured in the Horizon Run simulation, and (3) the study of the evolution of the genus curve for simulations with primordial non-Gaussianity. These advances improve the treatment of flux-limited galaxy catalogs for use with this measurement and further the use of the genus statistic as a tool for exploring non-Gaussianity.
Nonlinear Fluctuation Behavior of Financial Time Series Model by Statistical Physics System
Wuyang Cheng
2014-01-01
We develop a random financial time series model of the stock market based on a statistical physics system, the stochastic contact interacting system. The contact process is a continuous-time Markov process; one interpretation of this model is as a model for the spread of an infection, where the epidemic spreading mimics the interplay of local infections and recovery of individuals. From this financial model, we study the statistical behaviors of return time series, and the corresponding behaviors of returns for the Shanghai Stock Exchange Composite Index (SSECI) and the Hang Seng Index (HSI) are also comparatively studied. Further, we investigate the Zipf distribution and multifractal phenomenon of returns and price changes. Zipf analysis and MF-DFA analysis are applied to investigate the nature of the fluctuations of the stock market.
Data analysis in high energy physics. A practical guide to statistical methods
Behnke, Olaf; Schoerner-Sadenius, Thomas; Kroeninger, Kevin; Schott, Gregory
2013-01-01
This practical guide covers the essential tasks in statistical data analysis encountered in high energy physics and provides comprehensive advice for typical questions and problems. The basic methods for inferring results from data are presented as well as tools for advanced tasks such as improving the signal-to-background ratio, correcting detector effects, determining systematics and many others. Concrete applications are discussed in analysis walkthroughs. Each chapter is supplemented by numerous examples and exercises and by a list of literature and relevant links. The book targets a broad readership at all career levels - from students to senior researchers.
Assaraf, Roland
2014-12-01
We show that the recently proposed correlated sampling without reweighting procedure extends the locality (asymptotic independence of the system size) of a physical property to the statistical fluctuations of its estimator. This makes the approach potentially vastly more efficient for computing space-localized properties in large systems compared with standard correlated methods. A proof is given for a large collection of noninteracting fragments. Calculations on hydrogen chains suggest that this behavior holds not only for systems displaying short-range correlations, but also for systems with long-range correlations.
Geant4 electromagnetic physics for high statistic simulation of LHC experiments
Allison, J; Bagulya, A; Champion, C; Elles, S; Garay, F; Grichine, V; Howard, A; Incerti, S; Ivanchenko, V; Jacquemier, J; Maire, M; Mantero, A; Nieminen, P; Pandola, L; Santin, G; Sawkey, D; Schalicke, A; Urban, L
2012-01-01
An overview of the current status of electromagnetic physics (EM) of the Geant4 toolkit is presented. Recent improvements are focused on the performance of large scale production for LHC and on the precision of simulation results over a wide energy range. Significant efforts have been made to improve the accuracy without compromising CPU speed for EM particle transport. New biasing options have been introduced, which are applicable to any EM process. These include algorithms to enhance and suppress processes, force interactions, or split secondary particles. It is shown that the performance of the EM sub-package is improved. We will report extensions of the testing suite allowing high statistics validation of EM physics. It includes validation of multiple scattering, bremsstrahlung and other models. Cross checks between standard and low-energy EM models have been performed using evaluated data libraries and reference benchmark results.
Wegner, Franz
2016-01-01
This text presents the mathematical concepts of Grassmann variables and the method of supersymmetry to a broad audience of physicists interested in applying these tools to disordered and critical systems, as well as related topics in statistical physics. Based on many courses and seminars held by the author, one of the pioneers in this field, the reader is given a systematic and tutorial introduction to the subject matter. The algebra and analysis of Grassmann variables is presented in part I. The mathematics of these variables is applied to a random matrix model, path integrals for fermions, dimer models and the Ising model in two dimensions. Supermathematics - the use of commuting and anticommuting variables on an equal footing - is the subject of part II. The properties of supervectors and supermatrices, which contain both commuting and Grassmann components, are treated in great detail, including the derivation of integral theorems. In part III, supersymmetric physical models are considered. While supersym...
Tropical limit and a micro-macro correspondence in statistical physics
Angelelli, Mario
2017-10-01
Tropical mathematics is used to establish a correspondence between certain microscopic and macroscopic objects in statistical models. Tropical algebra gives a common framework for macrosystems (subsets) and their elementary constituents (elements) that is well-behaved with respect to composition. This kind of connection is studied with maps that preserve a monoid structure. The approach highlights an underlying order relation that is explored through the concepts of filter and ideal. Particular attention is paid to asymmetry and duality between max- and min-criteria. Physical implementations are presented through simple examples in thermodynamics and non-equilibrium physics. The phenomenon of ultrametricity, the notion of tropical equilibrium and the role of ground energy in non-equilibrium models are discussed. Tropical symmetry, i.e. idempotence, is investigated.
Data analysis in high energy physics a practical guide to statistical methods
Behnke, Olaf; Kröninger, Kevin; Schott, Grégory; Schörner-Sadenius, Thomas
2013-01-01
This practical guide covers the most essential statistics-related tasks and problems encountered in high-energy physics data analyses. It addresses both advanced students entering the field of particle physics as well as researchers looking for a reliable source on optimal separation of signal and background, determining signals or estimating upper limits, correcting the data for detector effects and evaluating systematic uncertainties. Each chapter is dedicated to a single topic and supplemented by a substantial number of both paper and computer exercises related to real experiments, with the solutions provided at the end of the book along with references. A special feature of the book is the analysis walk-throughs used to illustrate the application of the methods discussed beforehand. The authors give examples of data analysis, referring to real problems in HEP, and display the different stages of data analysis in a descriptive manner. The accompanying website provides more algorithms as well as up-to-date...
Fundamental properties of fracture and seismicity in a non-extensive statistical physics framework.
Vallianatos, Filippos
2010-05-01
A fundamental challenge in many scientific disciplines concerns upscaling, that is, determining the regularities and laws of evolution at some large scale from those known at a lower scale. Earthquake physics is no exception, with the challenge of understanding the transition from the laboratory scale to the scale of fault networks and large earthquakes. In this context, statistical physics has a remarkably successful record in addressing the upscaling problem in physics. It is natural then to consider that the physics of many earthquakes has to be studied with a different approach than the physics of one earthquake, and in this sense we can consider the use of statistical physics not only appropriate but necessary to understand the collective properties of earthquakes [see Corral 2004, 2005a,b,c]. A significant attempt is given in a series of works [Main 1996; Rundle et al., 1997; Main et al., 2000; Main and Al-Kindy, 2002; Rundle et al., 2003; Vallianatos and Triantis, 2008a] that uses classical statistical physics to describe seismicity. Then a natural question arises. What type of statistical physics is appropriate to commonly describe effects from the fracture level to the seismicity scale? The application of non-extensive statistical physics offers a consistent theoretical framework, based on a generalization of entropy, to analyze the behavior of natural systems with fractal or multi-fractal distribution of their elements. Such natural systems, where long-range interactions or intermittency are important, lead to power law behavior. We note that this is consistent with a classical thermodynamic approach to natural systems that rapidly attain equilibrium, leading to exponential-law behavior. In the framework of the non-extensive statistical physics approach, the probability function p(X) is calculated using the maximum entropy formulation of Tsallis entropy, which involves the introduction of at least two constraints (Tsallis et al., 1998). The first one is the
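The Tsallis construction referred to above can be sketched in its standard form (the abstract's statement of the specific constraints is truncated, so only the generic maximum-entropy result is shown):

```latex
% Tsallis entropy for a discrete distribution {p_i}:
S_q \;=\; k \,\frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q \;=\; -k \sum_i p_i \ln p_i
\quad \text{(Boltzmann--Gibbs)}.

% Maximizing S_q under normalization and a generalized mean-value
% constraint on X yields a q-exponential distribution:
p(X) \;\propto\; \bigl[\,1 - (1-q)\,\beta_q X\,\bigr]^{\frac{1}{1-q}}
       \;\equiv\; \exp_q(-\beta_q X),
% which decays as a power law for q \neq 1 and reduces to the
% exponential (Boltzmann) form as q \to 1.
```

The power-law tail of the q-exponential for q ≠ 1 is what connects this formalism to the fractal scaling and long-range interactions mentioned in the abstract.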
Chu, Tu-Bin; Liao, Chien-Wei; Nara, Takeshi; Huang, Ying-Chie; Chou, Chia-Mei; Liu, Yu-Hsin; Fan, Chia-Kwung
2012-10-01
Whether Enterobius vermicularis (pinworm) infections among preschool children in Taipei City had truly declined was investigated. A total of 6,661 preschool children from 28 nurseries were randomly selected from 4 major geographic districts in Taipei City to examine the status of pinworm infection by using the adhesive thin cellophane tape swab method. The overall prevalence of pinworm infection was 0.5% (30/6,661). Boys (0.6%; 21/3,524) had a higher prevalence than girls (0.3%; 9/3,137) (p=0.06). The Southern district (0.6%; 10/1,789) showed an insignificantly higher prevalence than the Western district (0.2%; 1/606) (p=0.22). A pinworm screening program remains necessary for some parts of Taipei City.
Tu-Bin Chu
2012-10-01
INTRODUCTION: Whether Enterobius vermicularis (pinworm) infections among preschool children in Taipei City had truly declined was investigated. METHODS: A total of 6,661 preschool children from 28 nurseries were randomly selected from 4 major geographic districts in Taipei City to examine the status of pinworm infection by using the adhesive thin cellophane tape swab method. RESULTS: The overall prevalence of pinworm infection was 0.5% (30/6,661). Boys (0.6%; 21/3,524) had a higher prevalence than girls (0.3%; 9/3,137) (p=0.06). The Southern district (0.6%; 10/1,789) showed an insignificantly higher prevalence than the Western district (0.2%; 1/606) (p=0.22). CONCLUSIONS: A pinworm screening program remains necessary for some parts of Taipei City.
The Impacts of the Mass Rapid Transit System on Household Car Ownership in Taipei
Wen-Hsiu Huang
2014-06-01
This paper investigates the impacts of the Taipei Mass Rapid Transit (MRT) system on household car ownership and analyses how socioeconomic characteristics affect household car ownership. We employ a difference-in-difference (DID) strategy integrated with generalized Poisson regression models to examine the effects of the MRT. The results are as follows: first, the establishment of the Taipei MRT significantly reduced the level of household car ownership; expanding the MRT network can be a feasible policy to control car ownership. Second, the levels of household car ownership are related to households' socioeconomic characteristics. Third, households with high dependence on public transport own fewer cars after the Taipei MRT began operation. Hence, the traffic authority should adopt more effective methods to encourage public transit use in order to decrease household car ownership.
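The DID-with-Poisson-counts logic can be sketched on simulated data; all rates and effect sizes below are hypothetical, and the simple ratio-of-ratios estimator stands in for the paper's generalized Poisson regression (with a log link the two coincide for this saturated two-by-two design).

```python
import math, random

random.seed(7)

def poisson_draw(lam):
    """One Poisson draw via Knuth's multiplicative method
    (adequate for small means like household car counts)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

n = 20000
base, trend, mrt_effect = 1.5, 1.10, 0.80   # hypothetical cars/household rates

rates = {
    ("control", "before"): base,
    ("control", "after"):  base * trend,              # common time trend only
    ("treated", "before"): base,
    ("treated", "after"):  base * trend * mrt_effect, # trend plus MRT effect
}
means = {key: sum(poisson_draw(lam) for _ in range(n)) / n
         for key, lam in rates.items()}

# DID on the log scale: the common trend cancels, isolating the MRT effect.
did = (math.log(means[("treated", "after")] / means[("treated", "before")])
       - math.log(means[("control", "after")] / means[("control", "before")]))
```

Here `did` recovers log(0.80), i.e. roughly a 20% reduction in household car ownership attributable to the MRT, even though both groups share the upward time trend.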
Statistical physics of community ecology: a cavity solution to MacArthur’s consumer resource model
Advani, Madhu; Bunin, Guy; Mehta, Pankaj
2018-03-01
A central question in ecology is to understand the ecological processes that shape community structure. Niche-based theories have emphasized the important role played by competition for maintaining species diversity. Many of these insights have been derived using MacArthur’s consumer resource model (MCRM) or its generalizations. Most theoretical work on the MCRM has focused on small ecosystems with a few species and resources. However, theoretical insights derived from small ecosystems may not scale up to large ecosystems with many resources and species, because large systems with many interacting components often display new emergent behaviors that cannot be understood or deduced from analyzing smaller systems. To address these shortcomings, we develop a statistical physics inspired cavity method to analyze the MCRM when both the number of species and the number of resources is large. Unlike previous work in this limit, our theory addresses resource dynamics and resource depletion and demonstrates that species generically and consistently perturb their environments and significantly modify available ecological niches. We show how our cavity approach naturally generalizes niche theory to large ecosystems by accounting for the effect of collective phenomena on species invasion and ecological stability. Our theory suggests that such phenomena are a generic feature of large, natural ecosystems and must be taken into account when analyzing and interpreting community structure. It also highlights the important role that statistical-physics inspired approaches can play in furthering our understanding of ecology.
Quantifying fluctuations in economic systems by adapting methods of statistical physics
Stanley, H. E.; Gopikrishnan, P.; Plerou, V.; Amaral, L. A. N.
2000-12-01
The emerging subfield of econophysics explores the degree to which certain concepts and methods from statistical physics can be appropriately modified and adapted to provide new insights into questions that have been the focus of interest in the economics community. Here we give a brief overview of two examples of research topics that are receiving recent attention. A first topic is the characterization of the dynamics of stock price fluctuations. For example, we investigate the relation between trading activity - measured by the number of transactions N_Δt - and the price change G_Δt for a given stock, over a time interval [t, t+Δt]. We relate the time-dependent standard deviation of price fluctuations - volatility - to two microscopic quantities: the number of transactions N_Δt in Δt and the variance W_Δt^2 of the price changes for all transactions in Δt. Our work indicates that while the pronounced tails in the distribution of price fluctuations arise from W_Δt, the long-range correlations found in |G_Δt| are largely due to N_Δt. We also investigate the relation between price fluctuations and the number of shares Q_Δt traded in Δt. We find that the distribution of Q_Δt is consistent with a stable Lévy distribution, suggesting a Lévy scaling relationship between Q_Δt and N_Δt, which would provide one explanation for volume-volatility co-movement. A second topic concerns cross-correlations between the price fluctuations of different stocks. We adapt a conceptual framework, random matrix theory (RMT), first used in physics to interpret statistical properties of nuclear energy spectra. RMT makes predictions for the statistical properties of matrices that are universal, that is, do not depend on the interactions between the elements comprising the system. In physical systems, deviations from the predictions of RMT provide clues regarding the mechanisms controlling the dynamics of a given system, so this framework can be of potential value if
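The decomposition of volatility into N_Δt and W_Δt^2 can be checked on a toy compound-Poisson model: if the interval price change is a sum of N_Δt i.i.d. per-transaction changes, then var(G_Δt) = E[N_Δt]·W^2. The distributions below are illustrative, not fits to market data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy check of var(G_dt) = E[N_dt] * W^2 for mean-zero per-transaction jumps.
W = 0.5                                    # per-transaction std dev
intervals = 20_000
N = rng.poisson(50, intervals)             # transactions per interval
G = np.array([rng.normal(0.0, W, k).sum() for k in N])

empirical = G.var()
predicted = N.mean() * W**2
print(f"var(G) = {empirical:.3f}, E[N]*W^2 = {predicted:.3f}")
```

In this idealized setting the two quantities agree to within sampling error; the paper's point is that real data additionally show heavy tails from W_Δt and long memory from N_Δt, neither of which this toy model reproduces.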
The Physical Models and Statistical Procedures Used in the RACER Monte Carlo Code
Sutton, T.M.; Brown, F.B.; Bischoff, F.G.; MacMillan, D.B.; Ellis, C.L.; Ward, J.T.; Ballinger, C.T.; Kelly, D.J.; Schindler, L.
1999-01-01
This report describes the MCV (Monte Carlo - Vectorized) Monte Carlo neutron transport code [Brown, 1982, 1983; Brown and Mendelson, 1984a]. MCV is a module in the RACER system of codes that is used for Monte Carlo reactor physics analysis. The MCV module contains all of the neutron transport and statistical analysis functions of the system, while other modules perform various input-related functions such as geometry description, material assignment, output edit specification, etc. MCV is very closely related to the 05R neutron Monte Carlo code [Irving et al., 1965] developed at Oak Ridge National Laboratory. 05R evolved into the 05RR module of the STEMB system, which was the forerunner of the RACER system. Much of the overall logic and physics treatment of 05RR has been retained and, indeed, the original verification of MCV was achieved through comparison with STEMB results. MCV has been designed to be very computationally efficient [Brown, 1981; Brown and Martin, 1984b; Brown, 1986]. It was originally programmed to make use of vector-computing architectures such as those of the CDC Cyber-205 and Cray X-MP. MCV was the first full-scale production Monte Carlo code to effectively utilize vector-processing capabilities. Subsequently, MCV was modified to utilize both distributed-memory [Sutton and Brown, 1994] and shared-memory parallelism. The code has been compiled and run on platforms ranging from 32-bit UNIX workstations to clusters of 64-bit vector-parallel supercomputers. The computational efficiency of the code allows the analyst to perform calculations using many more neutron histories than is practical with most other Monte Carlo codes, thereby yielding results with smaller statistical uncertainties. MCV also utilizes variance reduction techniques such as survival biasing, splitting, and rouletting to permit additional reduction in uncertainties. While a general-purpose neutron Monte Carlo code, MCV is optimized for reactor physics calculations. It has the
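The rouletting technique mentioned above can be illustrated in isolation. The sketch below (not MCV code; thresholds and weight distributions are invented) shows the defining property of Russian roulette: low-weight histories are killed with probability 1 - w/w_survive and survivors are promoted to weight w_survive, so the expected weight is unchanged while the number of tracked histories shrinks:

```python
import numpy as np

rng = np.random.default_rng(3)

# Russian roulette on a population of particle weights.
weights = rng.uniform(0.0, 1.0, 100_000)
cutoff, w_survive = 0.2, 0.5

low = weights < cutoff                                  # candidates for roulette
survive = rng.uniform(0.0, 1.0, weights.size) < weights / w_survive
rouletted = np.where(low, np.where(survive, w_survive, 0.0), weights)

print(f"mean weight before: {weights.mean():.4f}  after: {rouletted.mean():.4f}")
print(f"live histories: {(weights > 0).sum()} -> {(rouletted > 0).sum()}")
```

The mean weight is preserved (the estimator stays unbiased) at the cost of slightly increased variance per history, which is repaid by not wasting time tracking negligible-weight particles.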
New Hybrid Monte Carlo methods for efficient sampling. From physics to biology and statistics
Akhmatskaya, Elena; Reich, Sebastian
2011-01-01
We introduce a class of novel hybrid methods for detailed simulations of large complex systems in physics, biology, materials science and statistics. These generalized shadow Hybrid Monte Carlo (GSHMC) methods combine the advantages of stochastic and deterministic simulation techniques. They utilize a partial momentum update to retain some of the dynamical information, employ modified Hamiltonians to overcome exponential performance degradation with the system's size, and make use of the multi-scale nature of complex systems. Variants of GSHMC were developed for atomistic simulation, particle simulation and statistics: GSHMC (thermodynamically consistent implementation of constant-temperature molecular dynamics), MTS-GSHMC (multiple-time-stepping GSHMC), meso-GSHMC (Metropolis-corrected dissipative particle dynamics (DPD) method), and a generalized shadow Hamiltonian Monte Carlo, GSHmMC (a GSHMC for statistical simulations). All of these are compatible with other enhanced sampling techniques and suitable for massively parallel computing, allowing for a range of multi-level parallel strategies. A brief description of the GSHMC approach, examples of its application on high performance computers and comparison with other existing techniques are given. Our approach is shown to resolve such problems as resonance instabilities of the MTS methods and non-preservation of thermodynamic equilibrium properties in DPD, and to outperform known methods in sampling efficiency by an order of magnitude. (author)
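The partial momentum update that distinguishes this family from ordinary HMC can be sketched on a one-dimensional toy target. This is generalized HMC only; the modified/shadow-Hamiltonian machinery of GSHMC is not reproduced, and all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

# Generalized (partial-momentum-update) hybrid Monte Carlo for a 1-D
# standard normal target: U(x) = x^2/2, so the target is N(0, 1).
def grad_U(x):
    return x

eps, L, phi = 0.2, 10, 0.3        # leapfrog step, steps per trajectory, mixing angle
x, p = 0.0, rng.normal()
samples = np.empty(20_000)
for t in range(samples.size):
    # partial momentum refresh: phi = pi/2 would recover ordinary HMC
    p = np.cos(phi) * p + np.sin(phi) * rng.normal()
    H_old = 0.5 * p**2 + 0.5 * x**2
    xn, pn = x, p
    pn -= 0.5 * eps * grad_U(xn)   # leapfrog integration
    for i in range(L):
        xn += eps * pn
        if i < L - 1:
            pn -= eps * grad_U(xn)
    pn -= 0.5 * eps * grad_U(xn)
    pn = -pn                       # momentum flip makes the proposal reversible
    H_new = 0.5 * pn**2 + 0.5 * xn**2
    if rng.uniform() < np.exp(min(0.0, H_old - H_new)):
        x, p = xn, pn
    else:
        p = -p                     # flip on rejection (GHMC convention)
    samples[t] = x
print(f"mean = {samples.mean():.3f}, var = {samples.var():.3f}")
```

Because only part of the momentum is refreshed, consecutive trajectories retain dynamical information, which is the property GSHMC exploits for molecular dynamics.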
Analysis of Ground Displacements in Taipei Area by Using High Resolution X-band SAR Interferometry
Tung, H.; Chen, H. Y.; Hu, J. C.
2014-12-01
Located in the northern part of Taiwan, Taipei is the most densely populated city and the political, economic, and cultural center of the island. North of the Taipei basin, the active Tatun volcano group, with the eruptive potential to devastate the entire city, is only 15 km away from the capital. Furthermore, the active Shanchiao fault is located at the western margin of the Taipei basin. Better understanding the assessment and mitigation of geological hazard in metropolitan Taipei is therefore not only an interesting scientific topic but also an issue of strong social impact. In this study, we use 12 high-resolution X-band SAR images from the new-generation COSMO-SkyMed (CSK) constellation, together with leveling and GPS data, to monitor surface deformation around the Shanchiao fault and the Tatun volcano group. The stripmap mode of CSK SAR images provides a spatial resolution of 3 m x 3 m, which is one order of magnitude better than previously available satellite SAR data. Furthermore, the more frequent revisits of the same Area of Interest (AOI) by the present X-band missions provide massive datasets that avoid the baseline limitation and temporal decorrelation and improve the temporal resolution of the deformation time series. After transferring the GPS vectors and leveling data to the LOS direction by referring to continuous GPS station BANC, the R-squared value between PS velocities and GPS velocities is approximately 0.9, which indicates the high reliability of our PSInSAR result. In addition, the well-fitting profiles between the leveling data and the PSInSAR result along two leveling routes both demonstrate that the significant deformation gradient mainly occurs along the Shanchiao fault. The severe land subsidence area is located in the western part of the Taipei basin just next to the Shanchiao fault, with a maximum SRD rate of 30 mm/yr. However, the severe subsidence area, Wuku, is also an industrial area in Taipei, which could be attributed to anthropogenic
Tayurskii, Dmitrii; Abe, Sumiyoshi; Alexandre Wang, Q.
2012-11-01
The 3rd International Workshop on Statistical Physics and Mathematics for Complex Systems (SPMCS2012) was held between 25-30 August at Kazan (Volga Region) Federal University, Kazan, Russian Federation. This workshop was jointly organized by Kazan Federal University and Institut Supérieur des Matériaux et Mécaniques Avancées (ISMANS), France. The series of SPMCS workshops was created in 2008 with the aim of being an interdisciplinary incubator for the worldwide exchange of innovative ideas and information about the latest results. The first workshop was held at ISMANS, Le Mans (France) in 2008, and the second at Huazhong Normal University, Wuhan (China) in 2010. At SPMCS2012, we wished to bring together a broad community of researchers from the different branches of the rapidly developing complexity science to discuss the fundamental theoretical challenges (geometry/topology, number theory, statistical physics, dynamical systems, etc) as well as experimental and applied aspects of many practical problems (condensed matter, disordered systems, financial markets, chemistry, biology, geoscience, etc). The program of SPMCS2012 was prepared based on three categories: (i) physical and mathematical studies (quantum mechanics, generalized nonequilibrium thermodynamics, nonlinear dynamics, condensed matter physics, nanoscience); (ii) natural complex systems (physical, geophysical, chemical and biological); (iii) social, economical, political agent systems and man-made complex systems. The conference attracted 64 participants from 10 countries. There were 10 invited lectures, 12 invited talks and 28 regular oral talks in the morning and afternoon sessions. The book of Abstracts is available from the conference website (http://www.ksu.ru/conf/spmcs2012/?id=3). A round table was also held, the topic of which was 'Recent and Anticipated Future Progress in Science of Complexity', discussing a variety of questions and opinions important for the understanding of the concept of
Grassberger, P.
2004-10-01
This book contains 18 contributions from different authors. Its subtitle `Econophysics, Bioinformatics, and Pattern Recognition' says more precisely what it is about: not so much about central problems of conventional statistical physics like equilibrium phase transitions and critical phenomena, but about its interdisciplinary applications. After a long period of specialization, physicists have, over the last few decades, found more and more satisfaction in breaking out of the limitations set by the traditional classification of sciences. Indeed, this classification had never been strict, and physicists in particular had always ventured into other fields. Helmholtz, in the middle of the 19th century, had considered himself a physicist when working on physiology, stressing that the physics of animate nature is as much a legitimate field of activity as the physics of inanimate nature. Later, Max Delbrück and Francis Crick did for experimental biology what Schrödinger did for its theoretical foundation. And many of the experimental techniques used in chemistry, biology, and medicine were developed by a steady stream of talented physicists who left their proper discipline to venture out into the wider world of science. The development we have witnessed over the last thirty years or so is different. It started with neural networks, where methods could be applied which had been developed for spin glasses, but today's list includes vehicular traffic (driven lattice gases), geology (self-organized criticality), economy (fractal stochastic processes and large scale simulations), engineering (dynamical chaos), and many others. Carried out within physics departments, these activities have transformed the physics curriculum and the view physicists have of themselves. In many departments there are now courses on econophysics or on biological physics, and some universities offer degrees in the physics of traffic or in econophysics. In order to document this change of attitude
Ricci-Tersenghi, Federico; Zdeborova, Lenka; Zecchina, Riccardo; Tramel, Eric W; Cugliandolo, Leticia F
2015-01-01
This book contains a collection of the presentations that were given in October 2013 at the Les Houches Autumn School on statistical physics, optimization, inference, and message-passing algorithms. In the last decade, there has been increasing convergence of interest and methods between theoretical physics and fields as diverse as probability, machine learning, optimization, and inference problems. In particular, much theoretical and applied work in statistical physics and computer science has relied on the use of message-passing algorithms and their connection to the statistical physics of glasses and spin glasses. For example, both the replica and cavity methods have led to recent advances in compressed sensing, sparse estimation, and random constraint satisfaction, to name a few. This book’s detailed pedagogical lectures on statistical inference, computational complexity, the replica and cavity methods, and belief propagation are aimed particularly at PhD students, post-docs, and young researchers desir...
Kushnirenko, A.N.
1989-01-01
An attempt was made to substantiate statistical physics from the viewpoint of many-body quantum mechanics in the occupation-number representation. This approach enabled the development of a variational method for the solution of stationary and nonstationary nonequilibrium problems.
Physics colloquium: Single-electron counting in quantum metrology and in statistical mechanics
Geneva University
2011-01-01
GENEVA UNIVERSITY Ecole de physique Département de physique nucléaire et corpusculaire 24, quai Ernest-Ansermet 1211 Genève 4 Tel.: (022) 379 62 73 Fax: (022) 379 69 92 Monday 17 October 2011, 17:00 - Ecole de Physique, Auditoire Stueckelberg PHYSICS COLLOQUIUM « Single-electron counting in quantum metrology and in statistical mechanics » Prof. Jukka Pekola Low Temperature Laboratory, Aalto University Helsinki, Finland First I discuss the basics of single-electron tunneling and its potential applications in metrology. My main focus is in developing an accurate source of single-electron current for the realization of the unit ampere. I discuss the principle and the present status of the so-called single-electron turnstile. Investigation of errors in transporting electrons one by one has revealed a wealth of observations on fundamental phenomena in mesoscopic superconductivity, including individual Andreev...
Holcman, David
2018-01-01
This is a monograph on the emerging branch of mathematical biophysics combining asymptotic analysis with numerical and stochastic methods to analyze partial differential equations arising in biological and physical sciences. In more detail, the book presents the analytic methods and tools for approximating solutions of mixed boundary value problems, with particular emphasis on the narrow escape problem. Informed throughout by real-world applications, the book includes topics such as the Fokker-Planck equation, boundary layer analysis, WKB approximation, applications of spectral theory, as well as recent results in narrow escape theory. Numerical and stochastic aspects, including mean first passage time and extreme statistics, are discussed in detail and relevant applications are presented in parallel with the theory. Including background on the classical asymptotic theory of differential equations, this book is written for scientists of various backgrounds interested in deriving solutions to real-world proble...
Solving Large-Scale Computational Problems Using Insights from Statistical Physics
Selman, Bart [Cornell University]
2012-02-29
Many challenging problems in computer science and related fields can be formulated as constraint satisfaction problems. Such problems consist of a set of discrete variables and a set of constraints between those variables, and represent a general class of so-called NP-complete problems. The goal is to find a value assignment to the variables that satisfies all constraints, generally requiring a search through an exponentially large space of variable-value assignments. Models for disordered systems, as studied in statistical physics, can provide important new insights into the nature of constraint satisfaction problems. Recently, work in this area has resulted in the discovery of a new method for solving such problems, called the survey propagation (SP) method. With SP, we can solve problems with millions of variables and constraints, an improvement of two orders of magnitude over previous methods.
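To make the constraint-satisfaction setting concrete, here is a tiny planted random 3-SAT instance solved by exhaustive search. Survey propagation replaces this brute force with message passing and scales to millions of variables; none of that machinery is sketched here, and the instance parameters are arbitrary:

```python
import itertools
import numpy as np

rng = np.random.default_rng(5)

# Planted random 3-SAT: each clause is forced to contain at least one
# literal satisfied by a hidden assignment, so a solution is guaranteed.
n_vars, n_clauses = 12, 40
planted = rng.integers(0, 2, n_vars)        # hidden satisfying assignment
clauses = []
for _ in range(n_clauses):
    vs = rng.choice(n_vars, size=3, replace=False)
    signs = rng.integers(0, 2, size=3)      # literal satisfied when value == sign
    j = rng.integers(3)
    signs[j] = planted[vs[j]]               # force one literal true under `planted`
    clauses.append(list(zip(vs, signs)))

def satisfies(assign, clauses):
    return all(any(assign[v] == s for v, s in c) for c in clauses)

# Brute force over all 2^12 assignments (feasible only at toy scale).
solution = next(a for a in itertools.product([0, 1], repeat=n_vars)
                if satisfies(a, clauses))
print("found satisfying assignment:", solution)
```

The exponential cost of this search (2^n assignments) is exactly what motivates statistical-physics methods such as SP for large instances.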
Sellaoui, Lotfi; Mechi, Nesrine; Lima, Éder Cláudio; Dotto, Guilherme Luiz; Ben Lamine, Abdelmottaleb
2017-10-01
Based on statistical physics elements, the equilibrium adsorption of diclofenac (DFC) and nimesulide (NM) on activated carbon was analyzed by a multilayer model with saturation. The paper aimed to describe the adsorption process experimentally and theoretically and to study the effect of adsorbate size using the model parameters. From numerical simulation, the number of molecules per site showed that the adsorbate molecules (DFC and NM) were mostly anchored on both sides of the pore walls. An increase in receptor-site density suggested that additional sites appeared during the process to participate in DFC and NM adsorption. The description of the adsorption energy behavior indicated that the process was physisorption. Finally, by a correlation of model parameters, the size effect of the adsorbate was deduced, indicating that the molecule dimension has a negligible effect on DFC and NM adsorption.
Occupational hand dermatitis in a tertiary referral dermatology clinic in Taipei.
Sun, C C; Guo, Y L; Lin, R S
1995-12-01
Occupational skin disease is one of the most common occupational diseases. The hand is the most frequent site of involvement in occupational skin disease. We interviewed and examined patients seen in the Contact Dermatitis Clinic of the National Taiwan University Medical Center, a tertiary referral center in Taipei City. For patients suspected of having allergic skin diseases, patch testing was carried out using the European standard series and suspected allergens. Occupational hand dermatitis (OHD) was diagnosed according to medical history, work exposure, physical examination, and patch test findings. 36% of patients seen were diagnosed as having OHD. Electronics, hairdressing, medical, chemical, and construction were the most important industries causing OHD. In the 164 patients with OHD, 58.5% had irritant contact dermatitis (ICD) and 41.5% allergic contact dermatitis (ACD). Dorsal fingers, nail folds, and dorsal hands were most frequently involved in patients with ACD; dorsal fingers, volar fingers and fingertips were most frequently involved in those with ICD. Using logistic regression analysis, we were able to identify the most important clinical presentations that predicted the types of OHD, ACD versus ICD. Patients with atopic history and palm involvement were more likely to have ICD, and those with nail fold involvement more likely to have ACD. In patients with ACD, the most important allergens were dichromate, nickel, cobalt, fragrance mix, epoxy resin, thiuram mix, and p-phenylenediamine. In this study, we identified the important industries and causal agents for OHD. Future preventive measures focused on these industries and agents to reduce OHD will be warranted.
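The logistic-regression step described above (predicting ACD versus ICD from clinical presentations) can be sketched on synthetic data. The features, coefficients, and labels below are invented for illustration, not the study's data, and gradient descent stands in for the standard IRLS fit:

```python
import numpy as np

rng = np.random.default_rng(6)

# Minimal logistic regression by gradient descent on synthetic data.
n = 2000
X = rng.normal(size=(n, 3))            # e.g. atopy, palm, nail-fold indicator scores
true_w = np.array([-1.5, 2.0, 0.8])    # planted coefficients (hypothetical)
p = 1.0 / (1.0 + np.exp(-(X @ true_w)))
y = (rng.uniform(size=n) < p).astype(float)   # 1 = ACD, 0 = ICD (say)

w = np.zeros(3)
for _ in range(2000):
    grad = X.T @ (1.0 / (1.0 + np.exp(-(X @ w))) - y) / n
    w -= 0.5 * grad                    # fixed-step gradient descent
acc = (((X @ w) > 0) == (y > 0.5)).mean()
print("estimated coefficients:", np.round(w, 2), " accuracy:", round(acc, 3))
```

The fitted signs recover the planted ones, mirroring how the study reads off which presentations raise or lower the odds of ACD.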
Sexual knowledge, attitudes and activity of older people in Taipei, Taiwan.
Wang, Tze-Fang; Lu, Chwen-Hwa; Chen, I-Ju; Yu, Shu
2008-02-01
We examined sexual activity and predictive factors among older people in Taipei, Taiwan. We aimed to characterize the older population engaged in sexual activity and determine influencing factors, exploring aspects of sexuality that may influence elders' health and quality of life (QOL). Studies of sexual attitudes and behaviour have found that sexual difficulties are common among mature adults worldwide, influenced in men and women by physical health, ageing, psychosocial and cultural factors. We conducted a community-based retrospective study involving a random sample of 412 men and 204 women over age 65. A questionnaire on demographics and social situations was administered, along with a Sexuality Knowledge and Attitudes Scale; 34 questions evaluated sexual knowledge and 18 evaluated sexual attitudes. Two-hundred and twenty participants were sexually active (35.7%), 185 mainly with spouses (84.1%); frequency was 21.4 (SD 16.9) times per year (range: 1-120). Multiple logistic regressions identified five significant predictors of sexual activity: gender, age, being with spouse, sexual knowledge and sexual attitudes. Sexual activity was significantly associated with higher education levels, lower stress and more self-reported daily activities. Our results agreed with Western studies linking sexual activity with better health and higher QOL in older adults. Older peoples' stress and daily activity levels are recognized quality-of-life measures; lower stress and more daily activities among sexually active older people suggests a connection between sexual activity and higher QOL. Increasing knowledge and improving attitudes about sexuality may help older people build healthier relationships and enhance health and QOL. Relevance to clinical practice. If healthcare professionals possess greater understanding of older peoples' sexuality, healthcare systems may find ways to increase sexual knowledge and foster healthier attitudes and relationships to improve older peoples
Harlim, John; Mahdi, Adam; Majda, Andrew J.
2014-01-01
A central issue in contemporary science is the development of nonlinear data driven statistical–dynamical models for time series of noisy partial observations from nature or a complex model. It has been established recently that ad-hoc quadratic multi-level regression models can have finite-time blow-up of statistical solutions and/or pathological behavior of their invariant measure. Recently, a new class of physics constrained nonlinear regression models were developed to ameliorate this pathological behavior. Here a new finite ensemble Kalman filtering algorithm is developed for estimating the state, the linear and nonlinear model coefficients, the model and the observation noise covariances from available partial noisy observations of the state. Several stringent tests and applications of the method are developed here. In the most complex application, the perfect model has 57 degrees of freedom involving a zonal (east–west) jet, two topographic Rossby waves, and 54 nonlinearly interacting Rossby waves; the perfect model has significant non-Gaussian statistics in the zonal jet with blocked and unblocked regimes and a non-Gaussian skewed distribution due to interaction with the other 56 modes. We only observe the zonal jet contaminated by noise and apply the ensemble filter algorithm for estimation. Numerically, we find that a three dimensional nonlinear stochastic model with one level of memory mimics the statistical effect of the other 56 modes on the zonal jet in an accurate fashion, including the skew non-Gaussian distribution and autocorrelation decay. On the other hand, a similar stochastic model with zero memory levels fails to capture the crucial non-Gaussian behavior of the zonal jet from the perfect 57-mode model
Theoretical approaches to the steady-state statistical physics of interacting dissipative units
Bertin, Eric
2017-02-01
The aim of this review is to provide a concise overview of some of the generic approaches that have been developed to deal with the statistical description of large systems of interacting dissipative 'units'. The latter notion includes, e.g. inelastic grains, active or self-propelled particles, bubbles in a foam, low-dimensional dynamical systems like driven oscillators, or even spatially extended modes like Fourier modes of the velocity field in a fluid. We first review methods based on the statistical properties of a single unit, starting with elementary mean-field approximations, either static or dynamic, that describe a unit embedded in a 'self-consistent' environment. We then discuss how this basic mean-field approach can be extended to account for spatial dependences, in the form of space-dependent mean-field Fokker-Planck equations, for example. We also briefly review the use of kinetic theory in the framework of the Boltzmann equation, which is an appropriate description for dilute systems. We then turn to descriptions in terms of the full N-body distribution, starting from exact solutions of one-dimensional models, using a matrix-product ansatz method when correlations are present. Since exactly solvable models are scarce, we also present some approximation methods which can be used to determine the N-body distribution in a large system of dissipative units. These methods include the Edwards approach for dense granular matter and the approximate treatment of multiparticle Langevin equations with colored noise, which models systems of self-propelled particles. Throughout this review, emphasis is put on methodological aspects of the statistical modeling and on formal similarities between different physical problems, rather than on the specific behavior of a given system.
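The single-unit Langevin picture that the review starts from can be illustrated with the simplest case, an Ornstein-Uhlenbeck process, whose stationary variance D/k is known exactly. Parameter values below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(7)

# Euler-Maruyama simulation of dx = -k x dt + sqrt(2 D) dW for an
# ensemble of independent "units"; stationary variance should be D/k.
k, D, dt = 1.0, 0.5, 0.001
n_units, steps = 2000, 5000          # total time 5 = 5 relaxation times
x = np.zeros(n_units)
for _ in range(steps):
    x += -k * x * dt + np.sqrt(2.0 * D * dt) * rng.normal(size=n_units)

print(f"empirical var = {x.var():.3f}, theory D/k = {D / k:.3f}")
```

Checking a simulated stationary distribution against the analytic Fokker-Planck prediction like this is the elementary version of the self-consistency arguments the review develops for interacting units.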
Chen, Yu-Fen; Hu, Yi-Chun; Shen, Hsi-Che; Chang, Hui-Te; Tung, Tao-Hsin
2014-01-01
To discuss the prevalence and associated factors related to an elevated serum alanine aminotransferase (ALT) level among the elderly agricultural and fishing population. A total of 6542 (3989 males and 2553 females) healthy adults voluntarily admitted to a teaching hospital for a physical checkup in 2010 in Taipei, Taiwan. Fasting blood samples were drawn via venipuncture, and clinical nurses interviewed the study participants using a structured questionnaire from. The overall prevalence of an elevated serum ALT level was 18.2% and revealed a statistically significant decrease with increasing age (P < 0.001). The men exhibited a higher prevalence than the women (19.7% vs 15.9%; P < 0.001). Male sex; younger age; and presence of obesity, hypertension, hyperuricemia, and hypoalbuminemia were significantly associated with an elevated serum ALT level. Sex-related differences were also revealed. For the men, type 2 diabetes (odds ratio [OR], 1.23; 95% confidence interval [CI], 1.02-1.57), hypercholesterolemia (OR, 1.78; 95% CI, 1.22-2.83), hypertriglyceridemia (OR, 1.32; 95% CI, 1.04-1.73), and low high-density lipoprotein (OR, 1.26; 95% CI, 1.05-1.51) were significantly related to an elevated serum ALT level, but this was not so for the women. The disparity of ALT in age groups was revealed. Several sex-related differences were indicated pertaining to the prevalence of an elevated serum ALT level among elderly specific occupational population.
Riandry, M. A.; Ismet, I.; Akhsan, H.
2017-09-01
This study aims to produce a valid and practical statistical physics course handout on distribution function materials based on STEM. Rowntree development model is used to produce this handout. The model consists of three stages: planning, development and evaluation stages. In this study, the evaluation stage used Tessmer formative evaluation. It consists of 5 stages: self-evaluation, expert review, one-to-one evaluation, small group evaluation and field test stages. However, the handout is limited to be tested on validity and practicality aspects, so the field test stage is not implemented. The data collection technique used walkthroughs and questionnaires. Subjects of this study are students of 6th and 8th semester of academic year 2016/2017 Physics Education Study Program of Sriwijaya University. The average result of expert review is 87.31% (very valid category). One-to-one evaluation obtained the average result is 89.42%. The result of small group evaluation is 85.92%. From one-to-one and small group evaluation stages, averagestudent response to this handout is 87,67% (very practical category). Based on the results of the study, it can be concluded that the handout is valid and practical.
A Comparative Study of Music Education in Shanghai and Taipei: Westernization and Nationalization
Wai-Chung, Ho
2004-01-01
This study compares the music taught and its associated cultural values in Shanghai and Taipei primary and secondary schools. Both owe their cultural ascendancy to traditional Chinese music and western musicology. How do the music education systems of these two Chinese communities reflect their respective public cultures and political ideologies?…
Yue, Ziao Dong; Rudowicz, Elisabeth
2002-01-01
A survey of 489 undergraduates in Beijing, Guangzhou, Hong Kong, and Taipei, found politicians were nominated by all four samples as being the most creative individuals in the past and at present. Scientists and inventors ranked second in position. Artists, musicians, and businessmen were rarely nominated. (Contains references.) (Author/CR)
A Study of Fifth Graders' Environmental Learning Outcomes in Taipei
Lai, Ching-San
2018-01-01
Environmental education has recently received much more attention than before among elementary school students' science learning in Taiwan. The major purpose of this study is to explore the learning outcomes on environmental education for 5th graders in Taipei. A quasi-experimental design with a single group was used in this study. Students in the…
Reeves, Mark
2014-03-01
Entropy changes underlie the physics that dominates biological interactions. Indeed, introductory biology courses often begin with an exploration of the qualities of water that are important to living systems. However, one idea that is not explicitly addressed in most introductory physics or biology textbooks is dominant contribution of the entropy in driving important biological processes towards equilibrium. From diffusion to cell-membrane formation, to electrostatic binding in protein folding, to the functioning of nerve cells, entropic effects often act to counterbalance deterministic forces such as electrostatic attraction and in so doing, allow for effective molecular signaling. A small group of biology, biophysics and computer science faculty have worked together for the past five years to develop curricular modules (based on SCALEUP pedagogy) that enable students to create models of stochastic and deterministic processes. Our students are first-year engineering and science students in the calculus-based physics course and they are not expected to know biology beyond the high-school level. In our class, they learn to reduce seemingly complex biological processes and structures to be described by tractable models that include deterministic processes and simple probabilistic inference. The students test these models in simulations and in laboratory experiments that are biologically relevant. The students are challenged to bridge the gap between statistical parameterization of their data (mean and standard deviation) and simple model-building by inference. This allows the students to quantitatively describe realistic cellular processes such as diffusion, ionic transport, and ligand-receptor binding. Moreover, the students confront ``random'' forces and traditional forces in problems, simulations, and in laboratory exploration throughout the year-long course as they move from traditional kinematics through thermodynamics to electrostatic interactions. This talk
Applications of modern statistical methods to analysis of data in physical science
Wicker, James Eric
Modern methods of statistical and computational analysis offer solutions to dilemmas confronting researchers in physical science. Although the ideas behind modern statistical and computational analysis methods were originally introduced in the 1970's, most scientists still rely on methods written during the early era of computing. These researchers, who analyze increasingly voluminous and multivariate data sets, need modern analysis methods to extract the best results from their studies. The first section of this work showcases applications of modern linear regression. Since the 1960's, many researchers in spectroscopy have used classical stepwise regression techniques to derive molecular constants. However, problems with thresholds of entry and exit for model variables plague this analysis method. Other criticisms of this kind of stepwise procedure include its inefficient searching method, the order in which variables enter or leave the model, and problems with overfitting data. We implement an information scoring technique that overcomes the assumptions inherent in the stepwise regression process to calculate molecular model parameters. We believe that this kind of information-based model evaluation can be applied to more general analysis situations in physical science. The second section proposes new methods of multivariate cluster analysis. The K-means algorithm and the EM algorithm, introduced in the 1960's and 1970's respectively, formed the basis of multivariate cluster analysis methodology for many years. However, several shortcomings of these methods include strong dependence on initial seed values and inaccurate results when the data seriously depart from hypersphericity. We propose new cluster analysis methods based on genetic algorithms that overcome the strong dependence on initial seed values. In addition, we propose a generalization of the Genetic K-means algorithm which can accurately identify clusters with complex hyperellipsoidal covariance
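The seed dependence of classical K-means referred to above is easy to exhibit. The sketch below runs plain Lloyd's algorithm from two different seeds on easy, well-separated synthetic data (where the runs agree); on overlapping or elongated clusters the same procedure can converge to different local optima, which is the shortcoming the genetic-algorithm methods target:

```python
import numpy as np

def kmeans(X, k, seed, iters=50):
    """Plain Lloyd's algorithm with random initial centers drawn from the data."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1).argmin(1)
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])   # keep empty clusters
    return centers

rng = np.random.default_rng(8)
# Two well-separated Gaussian blobs (illustrative data, not spectroscopy).
X = np.vstack([rng.normal(0.0, 1.0, (300, 2)),
               rng.normal(8.0, 1.0, (300, 2))])
c1 = kmeans(X, 2, seed=1)
c2 = kmeans(X, 2, seed=2)
print("centers (sorted):", np.round(np.sort(c1, axis=0), 2))
```

Both runs recover centers near (0, 0) and (8, 8) here; the dissertation's point is that this agreement is not guaranteed in general.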
A statistical methodology for quantification of uncertainty in best estimate code physical models
Vinai, Paolo; Macian-Juan, Rafael; Chawla, Rakesh
2007-01-01
A novel uncertainty assessment methodology, based on a statistical non-parametric approach, is presented in this paper. It achieves quantification of code physical model uncertainty by making use of model performance information obtained from studies of appropriate separate-effect tests. Uncertainties are quantified in the form of estimated probability density functions (pdf's), calculated with a newly developed non-parametric estimator. The new estimator objectively predicts the probability distribution of the model's 'error' (its uncertainty) from databases reflecting the model's accuracy on the basis of available experiments. The methodology is completed by applying a novel multi-dimensional clustering technique based on the comparison of model error samples with the Kruskal-Wallis test. This takes into account the fact that a model's uncertainty depends on system conditions, since a best estimate code can give predictions for which the accuracy is affected by the regions of the physical space in which the experiments occur. The final result is an objective, rigorous and accurate manner of assigning uncertainty to code physical models, i.e. the input information needed by code uncertainty propagation methodologies used for assessing the accuracy of best estimate codes in nuclear systems analysis. The new methodology has been applied to the quantification of the uncertainty in the RETRAN-3D void model and then used in the analysis of an independent separate-effect experiment. This has clearly demonstrated the basic feasibility of the approach, as well as its advantages in yielding narrower uncertainty bands in quantifying the code's accuracy for void fraction predictions.
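The Kruskal-Wallis comparison used to decide whether error samples belong together can be sketched as follows; the error samples and the decision threshold are hypothetical, and the H statistic omits the tie correction for brevity:

```python
def kruskal_wallis_h(*samples):
    """Kruskal-Wallis H statistic (no tie correction): rank the pooled
    values, then compare per-sample rank sums. A large H means the
    samples are unlikely to share one location."""
    pooled = sorted(x for s in samples for x in s)
    rank = {x: i + 1 for i, x in enumerate(pooled)}  # assumes distinct values
    n = len(pooled)
    s_term = sum(sum(rank[x] for x in s) ** 2 / len(s) for s in samples)
    return 12.0 / (n * (n + 1)) * s_term - 3.0 * (n + 1)

# Hypothetical void-fraction error samples from two regions of the flow map.
low_flow = [0.01, 0.03, 0.02, 0.04, 0.05]
high_flow = [0.11, 0.13, 0.12, 0.14, 0.15]
h = kruskal_wallis_h(low_flow, high_flow)
# With two groups, H is compared against the chi-square threshold 3.84
# (1 dof, 5% level) to decide whether the error populations stay separate.
```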
Tsunami vs Infragravity Surge: Statistics and Physical Character of Extreme Runup
Lynett, P. J.; Montoya, L. H.
2017-12-01
Motivated by recent observations of energetic and impulsive infragravity (IG) flooding events - also known as sneaker waves - we will present recent work on the relative probabilities and dynamics of extreme flooding events from tsunamis and long period wind wave events. The discussion will be founded on videos and records of coastal flooding by both recent tsunamis and IG, such as those in the Philippines during Typhoon Haiyan. From these observations, it is evident that IG surges may approach the coast as breaking bores with periods of minutes; a very tsunami-like character. Numerical simulations will be used to estimate flow elevations and speeds from potential IG surges, and these will be compared with similar values from tsunamis, over a range of different beach profiles. We will examine the relative rareness of each type of flooding event, which for large values of IG runup is a particularly challenging topic. For example, for a given runup elevation or flooding speed, the related tsunami return period may be longer than that associated with IG, implying that deposit information associated with such elevations or speeds is more likely to be caused by IG. Our purpose is to provide a statistical and physical discriminant between tsunami and IG, such that in areas exposed to both, a proper interpretation of overland transport, deposition, and damage is possible.
Statistical and physical content of low-energy photons in nuclear medicine imaging
Gagnon, D.; Pouliot, N.; Laperriere, L.; Harel, F.; Gregoire, J.; Arsenault, A.
1990-01-01
Limits in the energy resolution of present gamma camera technology prevent a total rejection of Compton events: inclusion of bad photons in the image is inescapable. Various methods acquiring data over a large portion of the spectrum have already been described. This paper investigates the usefulness of low energy photons using statistical and physical models. Holospectral Imaging, for instance, exploits correlation between energy frames to build an information-related transformation optimizing the primary photon image. One can also use computer simulation to show that a portion of low energy photons is detected at the same location (pixel) as pure primary photons. These events are, for instance: photons undergoing a scatter interaction in the crystal; photons undergoing a small angle backscatter or forwardscatter interaction in the medium; photons backscattered by the Pyrex into the crystal. For a 140 keV source in 10 cm of water and a 1/4 inch thick crystal, more than 6% of all the photons detected do not have the primary energy and still are located in the right 4 mm pixel. Similarly, it is possible to show that more than 5% of all the photons detected at 140 keV deposit their energy in more than one pixel. These results give additional support to techniques considering low energy photons and more sophisticated ways to discriminate between good and bad events.
Statistical physics of fracture: scientific discovery through high-performance computing
Nukala, Phani Kumar V V; Simunovic, Srdan; Mills, Richard T
2006-01-01
The paper presents state-of-the-art algorithmic developments for simulating the fracture of disordered quasi-brittle materials using discrete lattice systems. Large scale simulations are often required to obtain accurate scaling laws; however, due to computational complexity, simulations using the traditional algorithms were limited to small system sizes. We have developed two algorithms: a multiple sparse Cholesky downdating scheme for simulating 2D random fuse model systems, and a block-circulant preconditioner for simulating 3D random fuse model systems. Using these algorithms, we were able to simulate fracture of the largest ever lattice system sizes (L = 1024 in 2D, and L = 64 in 3D) with extensive statistical sampling. Our recent simulations on 1024 processors of the Cray-XT3 and IBM Blue-Gene/L have further enabled us to explore fracture of 3D lattice systems of size L = 200, which is a significant computational achievement. These largest ever numerical simulations have enhanced our understanding of the physics of fracture; in particular, we analyze damage localization and its deviation from percolation behavior, scaling laws for damage density, universality of the fracture strength distribution, the size effect on the mean fracture strength, and finally the scaling of crack surface roughness.
Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures
Udey, Ruth Norma [Michigan State Univ., East Lansing, MI (United States)]
2013-01-01
Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.
Chen, Xinguang; Lau, Maggie; Kan, Ming Yue; Chiang, I-Chyun; Hu, Yih-Jin; Gong, Jie; Li, Lue; Ngok, King-Lun
2016-10-01
This study aimed at assessing the differences in prevalence rates of common health behaviors among adolescents in five Chinese cities and the influential factors at the contextual and individual levels. We compared the standardized rates of three lifestyle behaviors (sedentary, dietary, and physical activity) and three addictive behaviors (cigarette smoking, alcohol consumption, and participation in gambling) among a sample of 13,950 adolescents. The sample was randomly selected from five cities: Hong Kong, Macau, Taipei, Zhuhai, and Wuhan. Population size, GDP per capita, and literacy at the city level, as well as parental monitoring and school performance at the student level, were assessed. Multi-level mixed effect models were used to examine the interaction of individual-level factors with study sites. The six health behaviors differed significantly across sites, with the highest rates of alcohol consumption in Hong Kong (39.5 %), of cigarette smoking in Macau (9.8 %), and of gambling in Taipei (37.1 %) and Hong Kong (35.9 %). The city-level measures were associated with only a few behavioral measures. Relative to Hong Kong, parental monitoring had a stronger association with the three addictive behaviors in the other sites. Findings suggest that although the study sites share a similar Chinese culture, students in the five cities differed from each other with regard to levels of health behaviors. Relative to the broad socioeconomic development, differences in parental monitoring played a significant role in explaining the observed differences.
Lan, Ganhui; Tu, Yuhai
2016-01-01
preserving information, it does not reveal the underlying mechanism that leads to the observed input-output relationship, nor does it tell us much about which information is important for the organism and how biological systems use information to carry out specific functions. To do that, we need to develop models of the biological machineries, e.g. biochemical networks and neural networks, to understand the dynamics of biological information processes. This is a much more difficult task. It requires deep knowledge of the underlying biological network—the main players (nodes) and their interactions (links)—in sufficient detail to build a model with predictive power, as well as quantitative input-output measurements of the system under different perturbations (both genetic variations and different external conditions) to test the model predictions to guide further development of the model. Due to the recent growth of biological knowledge thanks in part to high throughput methods (sequencing, gene expression microarray, etc.) and the development of quantitative in vivo techniques such as various fluorescence technologies, these requirements are starting to be realized in different biological systems. The possible close interaction between quantitative experimentation and theoretical modeling has made systems biology an attractive field for physicists interested in quantitative biology. In this review, we describe some of the recent work in developing a quantitative predictive model of bacterial chemotaxis, which can be considered as the hydrogen atom of systems biology. Using statistical physics approaches, such as the Ising model and Langevin equation, we study how bacteria, such as E. coli, sense and amplify external signals, how they keep a working memory of the stimuli, and how they use these data to compute the chemical gradient. In particular, we will describe how E. coli cells avoid cross-talk in a heterogeneous receptor cluster to keep a ligand-specific memory.
We will also
Juang, J. Y.; Sun, C. H.; Jiang, J. A.; Wen, T. H.
2017-12-01
The urban heat island (UHI) effect caused by regional-to-global environmental changes, dramatic urbanization, and shifts in land-use composition has become an important environmental issue in recent years. In the past century, the coverage of urban area in the Taipei Basin has increased dramatically, by tenfold. The strengthening of the UHI effect significantly enhances the frequency of warm nights and strongly influences the thermal environment of residents in the Greater Taipei Metropolitan Area. In addition, urban expansion driven by dramatic increases in urban population and traffic loading significantly impacts air quality and causes health issues in Taipei. In this study, the main objective is to quantify and characterize the temporal and spatial distributions of the thermal environment and air quality in the Greater Taipei Metropolitan Area by using monitoring data from the Central Weather Bureau and the Environmental Protection Administration. In addition, we conduct an analysis of the distribution of physiological equivalent temperature at the micro scale in the metropolitan area by using observation data and quantitative simulation to investigate how the thermal environment is influenced under different conditions. Furthermore, we establish a real-time mobile monitoring system using a wireless sensor network to investigate the correlation between the thermal environment, air quality, and other environmental factors, and propose to develop an early warning system for heat stress and air quality in the metropolitan area. The results from this study can be integrated into management and planning systems, and provide sufficient and important background information for the development of a smart city in the metropolitan area in the future.
Molecular epidemiology and evolutionary genetics of
Su Ih-Jen; Lee Shi-Yi; Tsai Wen-Shing; Sun Jun-Ren; Chang Jia-Ru; Lin Chih-Wei; Tseng Fan-Chen; Dou Horng-Yunn; Lu Jang-Jih
2008-01-01
Abstract Background The control of tuberculosis in densely populated cities is complicated by close human-to-human contacts and potential transmission of pathogens from multiple sources. We conducted a molecular epidemiologic analysis of 356 Mycobacterium tuberculosis (MTB) isolates from patients presenting pulmonary tuberculosis in metropolitan Taipei. Classical antibiogram studies and genetic characterization, using mycobacterial interspersed repetitive-unit-variable-number tandem-repeat (M...
Visualisation and globalisation in the Asia-Pacific region: the Taipei Biennial 1996-2008
Turner, Ming
2009-01-01
Whilst globalisation, urbanisation and explosive expansion of urban spaces are the most dynamic and challenging issues in the Asia-Pacific region today, modernisation and cultural re-interpretation are also taking place at a rapid speed. Asia-Pacific metropolises, combining most of their nations’ population and resources, are at the centre of its globalisation process and intend to create their own characters whilst information and fashion have been moving between territories. Taipei, being t...
Impacts of Typhoon Soudelor (2015) on the water quality of Taipei, Taiwan
Hoda Fakour; Shang-Lien Lo; Tsair-Fuh Lin
2016-01-01
Typhoon Soudelor was one of the strongest storms in the world in 2015. The category 5 hurricane made landfall in Taiwan on August 8, causing extensive damage and severe impacts on the environment. This paper describes the changes of trihalomethane (THM) concentrations in tap and drinking fountain water in selected typhoon-affected areas in Taipei before and after the typhoon. Samples were taken from water transmission mains at various distances from the local water treatment plant. The result...
Ying-Ming Su; Mei-Shu Huang
2015-01-01
To mitigate the urban heat island effect has become a global issue as we face the challenge of climate change. Through literature review, plant photosynthesis can reduce carbon dioxide and mitigate the urban heat island effect to a degree. Because there are not enough open spaces and parks, green roofs have become an important policy in Taiwan. We selected elementary school buildings in northern New Taipei City as research subjects since elementary schools ar...
Chang, Tso-Kang; Liao, Chien-Wei; Huang, Ying-Chieh; Chang, Chun-Chao; Chou, Chia-Mei; Tsay, Hsin-Chieh; Huang, Alice; Guu, Shu-Fen; Kao, Ting-Chang; Fan, Chia-Kwung
2009-06-01
The prevalence of Enterobius vermicularis infection among preschool children was reported to be low based on a 5-year screening program in Taipei City, Taiwan. The Taipei City government intended to terminate the E. vermicularis screening program among preschool children. Thus, we were entrusted with confirming whether pinworm infections among preschool children in Taipei City had truly declined. From each of 12 administrative districts, 2-3 kindergartens were randomly selected for investigation. In total, 4,349 children were examined, of which 2,537 were boys and 1,812 were girls. The cellophane-tape method, with the tape adhered to a glass slide, was used, and all examinations were done by certified medical technologists. Results indicated that the overall prevalence rate of pinworm infections was 0.62% (27/4,349). Although the infection rate was higher among boys (0.67%, 17/2,537) than in girls (0.55%, 10/1,812), no significant difference was found (χ2 = 0.399, P = 0.62). According to administrative district, the infection rate ranged from no positive cases of E. vermicularis infection in the Xinyi, Zhongzhen, and Wanhua Districts (0%; 0/299, 0/165, and 0/358, respectively), to 0.26% (1/131) in Songshan District, with the highest rate of 1.88% (7/373) in Wenshan District. Because the overall infection rate (0.62%, 27/4,349) in the present study was unchanged compared to that (0.40%, 197/49,541) previously reported in 2005, we propose that regular pinworm screening and treatment programs should be continued in some parts of Taipei City.
Yi-Ming Kuo
2011-06-01
Fine airborne particulate matter (PM2.5) has adverse effects on human health. Assessing the long-term effects of PM2.5 exposure on human health and ecology is often limited by a lack of reliable PM2.5 measurements. In Taipei, PM2.5 levels were not systematically measured until August, 2005. Due to the popularity of geographic information systems (GIS), the landuse regression method has been widely used in the spatial estimation of PM concentrations. This method accounts for the potential contributing factors of the local environment, such as traffic volume. Geostatistical methods, on the other hand, account for the spatiotemporal dependence among the observations of ambient pollutants. This study assesses the performance of the landuse regression model for the spatiotemporal estimation of PM2.5 in the Taipei area. Specifically, this study integrates the landuse regression model with the geostatistical approach within the framework of the Bayesian maximum entropy (BME) method. The resulting epistemic framework can assimilate knowledge bases including: (a) empirical-based spatial trends of PM concentration based on landuse regression, (b) the spatio-temporal dependence among PM observation information, and (c) site-specific PM observations. The proposed approach performs the spatiotemporal estimation of PM2.5 levels in the Taipei area (Taiwan) from 2005-2007.
Yu, Hwa-Lung; Wang, Chih-Hsih; Liu, Ming-Che; Kuo, Yi-Ming
2011-06-01
Fine airborne particulate matter (PM2.5) has adverse effects on human health. Assessing the long-term effects of PM2.5 exposure on human health and ecology is often limited by a lack of reliable PM2.5 measurements. In Taipei, PM2.5 levels were not systematically measured until August, 2005. Due to the popularity of geographic information systems (GIS), the landuse regression method has been widely used in the spatial estimation of PM concentrations. This method accounts for the potential contributing factors of the local environment, such as traffic volume. Geostatistical methods, on the other hand, account for the spatiotemporal dependence among the observations of ambient pollutants. This study assesses the performance of the landuse regression model for the spatiotemporal estimation of PM2.5 in the Taipei area. Specifically, this study integrates the landuse regression model with the geostatistical approach within the framework of the Bayesian maximum entropy (BME) method. The resulting epistemic framework can assimilate knowledge bases including: (a) empirical-based spatial trends of PM concentration based on landuse regression, (b) the spatio-temporal dependence among PM observation information, and (c) site-specific PM observations. The proposed approach performs the spatiotemporal estimation of PM2.5 levels in the Taipei area (Taiwan) from 2005-2007.
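The landuse-regression trend that the BME framework assimilates as knowledge base (a) can be sketched as an ordinary least-squares fit; the covariates, coefficients, and monitoring data below are entirely synthetic, and a real application would combine this trend with the space-time covariance of the residuals rather than use it alone:

```python
import numpy as np

# Synthetic monitoring stations: [traffic volume (1000 veh/day), green-space
# fraction] as landuse covariates, with PM2.5 generated from a known trend
# (5 + 1.0*traffic - 20*green) so the fit below recovers it exactly.
covariates = np.array([
    [20.0, 0.10],
    [35.0, 0.05],
    [10.0, 0.40],
    [50.0, 0.02],
    [25.0, 0.20],
    [15.0, 0.35],
])
pm25 = np.array([23.0, 39.0, 7.0, 54.6, 26.0, 13.0])

# Landuse regression: ordinary least squares with an intercept.
A = np.column_stack([np.ones(len(pm25)), covariates])
intercept, b_traffic, b_green = np.linalg.lstsq(A, pm25, rcond=None)[0]

# Spatial trend at an unmonitored site (30k veh/day, 15% green space).
trend = intercept + b_traffic * 30.0 + b_green * 0.15
```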
Improved consistency in dosing anti-tuberculosis drugs in Taipei, Taiwan.
Chiang, Chen-Yuan; Yu, Ming-Chih; Shih, Hsiu-Chen; Yen, Muh-Yong; Hsu, Yu-Ling; Yang, Shiang-Lin; Lin, Tao-Ping; Bai, Kuan-Jen
2012-01-01
It was reported that 35.5% of tuberculosis (TB) cases reported in 2003 in Taipei City had no recorded pre-treatment body weight and that, among those who had, inconsistent dosing of anti-TB drugs was frequent. The Taiwan Centers for Disease Control (CDC) have taken actions to strengthen dosing of anti-TB drugs among general practitioners. Prescribing practices of anti-TB drugs in Taipei City in 2007-2010 were investigated to assess whether the interventions on dosing were effective. Lists of all notified culture-positive TB cases in 2007-2010 were obtained from the National TB Registry at Taiwan CDC. A medical audit of TB case management files was performed to collect pre-treatment body weight and the regimens prescribed at commencement of treatment. Dosages prescribed were compared with dosages recommended. The proportion of patients with recorded pre-treatment body weight was 64.5% in 2003, which increased to 96.5% in 2007-2010. Consistency in dosing of anti-TB drugs in Taipei City has remarkably improved after health authorities implemented a series of interventions.
PIRVU DANIELA
2016-04-01
This paper proposes a framework for exploring the main research approaches to financial markets conducted in past years by statistical physics specialists. It also presents the global financial developments of the last few years, as well as a review of the most important steps in the development of the physical and mathematical modelling of socioeconomic phenomena. In this regard, we analysed research findings published in notable international journals. Our research demonstrated that the econophysical models developed in the past few years for the description of financial phenomena and processes do not provide satisfactory results for the construction of complete solutions able to answer today's financial challenges. We believe that the research instrumentation of statistical physics has developed significantly lately, and that research approaches in this field should continue and be enhanced.
Global regionalized seismicity in view of Non-Extensive Statistical Physics
Chochlaki, Kalliopi; Vallianatos, Filippos; Michas, Georgios
2018-03-01
In the present work we study the distribution of the Earth's shallow seismicity in different seismic zones, as it occurred from 1981 to 2011 and extracted from the Centroid Moment Tensor (CMT) catalog. Our analysis is based on the subdivision of the Earth's surface into seismic zones that are homogeneous with regard to seismic activity and the orientation of the predominant stress field. For this, we use the Flinn-Engdahl regionalization (FE) (Flinn and Engdahl, 1965), which consists of fifty seismic zones, as modified by Lombardi and Marzocchi (2007). The latter authors grouped the 50 FE zones into larger tectonically homogeneous ones, utilizing the cumulative moment tensor method, resulting in thirty-nine seismic zones. In each one of these seismic zones we study the distribution of seismicity in terms of the frequency-magnitude distribution and the inter-event time distribution between successive earthquakes, a task that is essential for hazard assessments and for better understanding the global and regional geodynamics. In our analysis we use non-extensive statistical physics (NESP), which seems to be one of the most adequate and promising methodological tools for analyzing complex systems, such as the Earth's seismicity, introducing the q-exponential formulation as the expression of the probability distribution function that maximizes the Sq entropy as defined by Tsallis (1988). The qE parameter is significantly greater than one for all the seismic regions analyzed, with values ranging from 1.294 to 1.504, indicating that magnitude correlations are particularly strong. Furthermore, the qT parameter shows some temporal correlations, but variations with cut-off magnitude show greater temporal correlations when the smaller magnitude earthquakes are included. The qT for earthquakes with magnitude greater than 5 takes values from 1.043 to 1.353, and as we increase the cut-off magnitude to 5.5 and 6 the qT value ranges from 1.001 to 1.242 and from 1.001 to 1.181 respectively, presenting
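The q-exponential underlying these fits can be written in a few lines; this is a generic sketch of the Tsallis function, not the authors' fitting code:

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential e_q(x) = [1 + (1-q)x]^(1/(1-q)) for q != 1,
    with the usual cutoff e_q(x) = 0 when the bracket turns negative;
    it reduces to the ordinary exponential as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

# For q > 1 the q-exponential decays as a power law at large |x|, which is
# why q values above one signal long-range correlations in the catalog.
```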
Gross, D.H.E.
2006-01-01
Heat can flow from cold to hot at any phase separation, even in macroscopic systems. Therefore Lynden-Bell's famous gravo-thermal catastrophe must also be reconsidered. In contrast to traditional canonical Boltzmann-Gibbs statistics, this is correctly described only by microcanonical statistics. Systems studied in chemical thermodynamics (ChTh) by using canonical statistics consist of several homogeneous macroscopic phases. Evidently, macroscopic statistics as in chemistry cannot and should not be applied to non-extensive or inhomogeneous systems like nuclei or galaxies. Nuclei are small and inhomogeneous. Multifragmented nuclei are even more inhomogeneous and the fragments even smaller. Phase transitions of first order, and especially phase separations, therefore cannot be described by a (homogeneous) canonical ensemble. Taking this seriously, fascinating perspectives open for statistical nuclear fragmentation as a test ground for the basic principles of statistical mechanics, especially of phase transitions, without the use of the thermodynamic limit. Moreover, there is also a lot of similarity between the accessible phase space of fragmenting nuclei and inhomogeneous multistellar systems. This underlines the fundamental significance for statistical physics in general. (orig.)
Thompson, John
2015-04-01
As the Physical Review Focused Collection demonstrates, recent frontiers in physics education research include systematic investigations at the upper division. As part of a collaborative project, we have examined student understanding of several topics in upper-division thermal and statistical physics. A fruitful context for research is the Boltzmann factor in statistical mechanics: the standard derivation involves several physically justified mathematical steps as well as the invocation of a Taylor series expansion. We have investigated student understanding of the physical significance of the Boltzmann factor as well as its utility in various circumstances, and identified various lines of student reasoning related to the use of the Boltzmann factor. Results from written data as well as teaching interviews suggest that many students do not use the Boltzmann factor when answering questions related to probability in applicable physical situations, even after lecture instruction. We designed an inquiry-based tutorial activity to guide students through a derivation of the Boltzmann factor and to encourage deep connections between the physical quantities involved and the mathematics. Observations of students working through the tutorial suggest that many students at this level can recognize and interpret Taylor series expansions, but they often lack fluency in creating and using Taylor series appropriately, despite previous exposure in both calculus and physics courses. Our findings also suggest that tutorial participation not only increases the prevalence of relevant invocation of the Boltzmann factor, but also helps students gain an appreciation of the physical implications and meaning of the mathematical formalism behind the formula. Supported in part by NSF Grants DUE-0817282, DUE-0837214, and DUE-1323426.
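For reference, the derivation the tutorial targets can be compressed to one line: a small system exchanging energy with a reservoir has occupation probability proportional to the reservoir's multiplicity, whose logarithm is Taylor-expanded to first order in the system energy.

```latex
P(E_i) \;\propto\; \Omega_R(E_{\mathrm{tot}} - E_i), \qquad
\ln \Omega_R(E_{\mathrm{tot}} - E_i) \;\approx\;
\ln \Omega_R(E_{\mathrm{tot}})
- E_i \left.\frac{\partial \ln \Omega_R}{\partial E}\right|_{E_{\mathrm{tot}}}
= \mathrm{const} - \frac{E_i}{k_B T}
\;\;\Longrightarrow\;\;
P(E_i) \;\propto\; e^{-E_i / k_B T}.
```

The truncation of the Taylor series is physically justified because the reservoir is much larger than the system, so higher-order terms are suppressed.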
Zhang, Jinzhao; Segurado, Jacobo; Schneidesch, Christophe
2013-01-01
Since the 1980's, Tractebel Engineering (TE) has been developing and applying a multi-physical modelling and safety analysis capability, based on a code package consisting of the best estimate 3D neutronic (PANTHER), system thermal hydraulic (RELAP5), core sub-channel thermal hydraulic (COBRA-3C), and fuel thermal mechanic (FRAPCON/FRAPTRAN) codes. A series of methodologies have been developed to perform and to license the reactor safety analysis and core reload design, based on the deterministic bounding approach. Following the recent trends in research and development as well as in industrial applications, TE has been working since 2010 towards the application of statistical sensitivity and uncertainty analysis methods to the multi-physical modelling and licensing safety analyses. In this paper, the TE multi-physical modelling and safety analysis capability is first described, followed by the proposed TE best estimate plus statistical uncertainty analysis method (BESUAM). The chosen statistical sensitivity and uncertainty analysis methods (non-parametric order statistic method or bootstrap) and tool (DAKOTA) are then presented, followed by some preliminary results of their application to FRAPCON/FRAPTRAN simulation of the OECD RIA fuel rod codes benchmark and RELAP5/MOD3.3 simulation of THTF tests. (authors)
A statistical law in the perception of risks and physical quantities in traffic
Elvik, Rune
2015-01-01
This paper suggests that a universal psychophysical law influences the perception of risks and physical quantities in traffic. This law states that there will be a tendency to overestimate low probabilities or small quantities, while high probabilities or large quantities may be underestimated. Studies of the perception of risk and physical quantities in traffic have found a highly consistent pattern.
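The over/underestimation pattern described above can be sketched with a power-function psychophysical mapping; the exponent and constant below are invented for illustration, not values from the paper:

```python
# Sketch of a psychophysical power law: perceived magnitude as a power
# function of actual magnitude. Parameters a and b are hypothetical.
def perceived(actual, a=1.0, b=0.5):
    """Perceived magnitude for a given actual magnitude."""
    return a * actual ** b

# With an exponent b < 1 the curve crosses the identity line at actual = 1
# (in these units): small quantities are overestimated, large ones
# underestimated, matching the qualitative pattern in the abstract.
for actual in (0.01, 0.1, 1.0, 10.0, 100.0):
    print(actual, perceived(actual), perceived(actual) > actual)
```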
Statistical Learning Is Not Affected by a Prior Bout of Physical Exercise.
Stevens, David J; Arciuli, Joanne; Anderson, David I
2016-05-01
This study examined the effect of a prior bout of exercise on implicit cognition. Specifically, we examined whether a prior bout of moderate intensity exercise affected performance on a statistical learning task in healthy adults. A total of 42 participants were allocated to one of three conditions-a control group, a group that exercised for 15 min prior to the statistical learning task, and a group that exercised for 30 min prior to the statistical learning task. The participants in the exercise groups cycled at 60% of their respective V˙O2 max. Each group demonstrated significant statistical learning, with similar levels of learning among the three groups. Contrary to previous research that has shown that a prior bout of exercise can affect performance on explicit cognitive tasks, the results of the current study suggest that the physiological stress induced by moderate-intensity exercise does not affect implicit cognition as measured by statistical learning. Copyright © 2015 Cognitive Science Society, Inc.
DbAccess: Interactive Statistics and Graphics for Plasma Physics Databases
Davis, W.; Mastrovito, D.
2003-01-01
DbAccess is an X-windows application, written in IDL®, meeting many specialized statistical and graphical needs of NSTX [National Spherical Torus Experiment] plasma physicists, such as regression statistics and the analysis of variance. Flexible "views" and "joins," which include options for complex SQL expressions, facilitate mixing data from different database tables. General Atomics Plot Objects add extensive graphical and interactive capabilities. An example is included for plasma confinement-time scaling analysis using a multiple linear regression least-squares power fit.
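The confinement-time scaling analysis mentioned above is a multiple linear regression in log space; a minimal sketch with synthetic, illustrative data (not NSTX measurements, and not the DbAccess implementation) is:

```python
import numpy as np

# Fit a power-law scaling tau = C * Ip**a * ne**b by ordinary least squares
# on log-transformed data. Variable names and data are illustrative.
rng = np.random.default_rng(0)
Ip = rng.uniform(0.5, 2.0, 50)   # "plasma current" (arbitrary units)
ne = rng.uniform(1.0, 5.0, 50)   # "density" (arbitrary units)
tau = 0.1 * Ip**0.9 * ne**0.4    # synthetic, noise-free confinement times

# Design matrix [1, log Ip, log ne]; solve for intercept and exponents.
X = np.column_stack([np.ones_like(Ip), np.log(Ip), np.log(ne)])
coef, *_ = np.linalg.lstsq(X, np.log(tau), rcond=None)
logC, a, b = coef
print(np.exp(logC), a, b)  # recovers 0.1, 0.9, 0.4 on noise-free data
```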
Statistical analysis of morphometric indicators and physical readiness variability of students
R.A. Gainullin
2017-10-01
Aim: To evaluate the interaction of morphometric characteristics with the reactions of the cardiorespiratory system and the indices of physical fitness during physical exercise training at the university. Material: First-year students (n = 91, aged 17-18) took part in the survey. The students were divided into 6 groups. All students were engaged in physical training. All the studied indicators were conditionally divided into two groups: the first group comprised indicators of physical fitness, and the second group comprised morphofunctional indices. Results: The indicators of the students' physical fitness demonstrate a wide range and heterogeneity, which should be taken into account when forming training groups. When the technique for developing local regional muscular endurance is used, the values of the orthostatic test and the Skibinski index show significant variability. High and significant correlations are also shown by manual dynamometry, strength endurance, and the Skibinski index. In the orthostatic test, the same effect was observed for age, body length, and heart rate. A similar analysis of the morphofunctional indices shows significant correlations between the Skibinski index and the orthostatic test, between age and the Skibinski index, and between weight and body length. Conclusions: From the point of view of physical fitness, the sports training group (group 2) and the hypertensive group (group 5) proved to be the most stable. The volunteer group turned out to be the most stable with respect to the morphofunctional indicators.
2013-01-01
This book offers a comprehensive picture of nonequilibrium phenomena in nanoscale systems. Written by internationally recognized experts in the field, this book strikes a balance between theory and experiment, and includes in-depth introductions to nonequilibrium fluctuation relations, nonlinear dynamics and transport, single molecule experiments, and molecular diffusion in nanopores. The authors explore the application of these concepts to nano- and biosystems by cross-linking key methods and ideas from nonequilibrium statistical physics, thermodynamics, stochastic theory, and dynamical s
João Henrique Gomes
2017-05-01
AIMS: This study aimed to verify the relationship of anthropometric and physical performance variables with game-related statistics in professional elite basketball players during a competition. METHODS: Eleven male basketball players were evaluated during 10 weeks at two distinct moments (regular season and playoffs). Overall, 11 variables of physical fitness and 13 variables of game-related statistics were analysed. RESULTS: The following significant Pearson's correlations were found in the regular season: percentage of fat mass with assists (r = -0.62) and steals (r = -0.63); height (r = 0.68), lean mass (r = 0.64), and maximum strength (r = 0.67) with blocks; squat jump with steals (r = 0.63); and time in the T-test with successful two-point field goals (r = -0.65), successful free throws (r = -0.61), and steals (r = -0.62). In the playoffs, however, only stature and lean mass maintained these correlations (p ≤ 0.05). CONCLUSIONS: The anthropometric and physical characteristics of the players showed few correlations with the game-related statistics in the regular season, and these correlations were even lower in the playoff games of a professional elite championship; they are therefore not good predictors of technical performance.
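The correlations reported above are plain Pearson coefficients; a minimal self-contained implementation follows (the sample data are invented for illustration, not the study's measurements):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical example: taller players recording more blocks.
heights = [1.90, 1.98, 2.05, 2.10, 1.85]
blocks = [0.2, 0.5, 0.9, 1.1, 0.1]
print(round(pearson_r(heights, blocks), 2))
```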
Liu, Lu; Wei, Jianrong; Zhang, Huishu; Xin, Jianhong; Huang, Jiping
2013-01-01
Because classical music has greatly affected our life and culture in its long history, it has attracted extensive attention from researchers seeking to understand the laws behind it. Based on statistical physics, here we use a different method to investigate classical music, namely, by analyzing cumulative distribution functions (CDFs) and autocorrelation functions of pitch fluctuations in compositions. We analyze 1,876 compositions of five representative classical music composers across 164 years, from Bach, to Mozart, to Beethoven, to Mendelssohn, and to Chopin. We report that the biggest pitch fluctuations of a composer gradually increase as time evolves from Bach's time to Mendelssohn's/Chopin's time. In particular, for the compositions of a composer, the positive and negative tails of a CDF of pitch fluctuations are distributed not only in power laws (with the scale-free property), but also in symmetry (namely, the probability of a treble following a bass and that of a bass following a treble are basically the same for each composer). The power-law exponent decreases as time elapses. Further, we also calculate the autocorrelation function of the pitch fluctuation. The autocorrelation function shows a power-law distribution for each composer. Especially, the power-law exponents vary with the composers, indicating their different levels of long-range correlation of notes. This work not only suggests a way to understand and develop music from a viewpoint of statistical physics, but also enriches the realm of traditional statistical physics by analyzing music.
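The two quantities analysed above, the CDF of pitch fluctuations and their autocorrelation function, can be computed for any pitch series; the MIDI-like toy data below are random numbers, not real scores:

```python
import numpy as np

# Toy pitch sequence (random MIDI-like note numbers, for illustration only).
rng = np.random.default_rng(1)
pitches = rng.integers(40, 90, size=1000)
fluct = np.diff(pitches)                     # pitch fluctuation series

# Empirical complementary CDF of the positive tail, P(fluctuation >= x);
# in the paper this tail follows a power law for each composer.
pos = np.sort(fluct[fluct > 0])
ccdf = 1.0 - np.arange(len(pos)) / len(pos)

def autocorr(x, k):
    """Normalized autocorrelation of sequence x at lag k."""
    x = x - x.mean()
    return float(np.dot(x[:-k], x[k:]) / np.dot(x, x))

print(ccdf[0], autocorr(fluct, 1))
```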
Ali, Abebe Mohammed; Darvishzadeh, R.; Skidmore, Andrew K.
2017-01-01
One of the key traits in the assessment of ecosystem functions is specific leaf area (SLA). The main aim of this study was to examine the potential of new-generation satellite images, such as Landsat-8 imagery, for the retrieval of SLA at regional and global scales. Therefore, both statistical and…
Brownian ratchets from statistical physics to bio and nano-motors
Cubero, David
2016-01-01
Illustrating the development of Brownian ratchets, from their foundations, to their role in the description of life at the molecular scale and in the design of artificial nano-machinery, this text will appeal to both advanced graduates and researchers entering the field. Providing a self-contained introduction to Brownian ratchets, devices which rectify microscopic fluctuations, Part I avoids technicalities and sets out the broad range of physical systems where the concept of ratchets is relevant. Part II supplies a single source for a complete and modern theoretical analysis of ratchets in regimes such as classical vs quantum and stochastic vs deterministic, and in Part III readers are guided through experimental developments in different physical systems, each highlighting a specific unique feature of ratchets. The thorough and systematic approach to the topic ensures that this book provides a complete guide to Brownian ratchets for newcomers and established researchers in physics, biology and biochemistry.
Chang, Shih-Yu; Lee, Chung-Te; Chou, Charles C.-K.; Liu, Shaw-Chen; Wen, Tian-Xue
The characteristics of ambient aerosols, affected by solar radiation, relative humidity, wind speed, wind direction, and gas-aerosol interactions, change rapidly at different spatial and temporal scales. In Taipei Basin, dense traffic emissions and sufficient solar radiation on typical summer days favor the formation of secondary aerosols. In winter, the air quality in Taipei Basin is usually affected by Asian continental outflows due to the long-range transport of pollutants carried by the winter monsoon. The conventional filter-based method needs a long time for collecting aerosols and analyzing their composition, and so cannot provide the high time-resolution data needed to investigate aerosol sources, atmospheric transformation processes, and health effects. In this work, an in situ ion chromatograph (IC) system was developed to provide 15-min time-resolution data for nine soluble inorganic species (Cl⁻, NO₂⁻, NO₃⁻, SO₄²⁻, Na⁺, NH₄⁺, K⁺, Mg²⁺ and Ca²⁺). Over 89% of all particles larger than approximately 0.056 μm were collected by the in situ IC system. The system is estimated to have a limit of detection below 0.3 μg m⁻³ for the various ambient ionic components. Based on the hourly measurements, the pollution events with high aerosol concentrations in Taipei Basin were associated with local traffic emissions during rush hour, the accumulation of pollutants in a stagnant atmosphere, the emission of industrial pollutants from nearby factories, photochemical secondary aerosol formation, and the long-range transport of pollutants from Asian outflows.
Statistical Modelling of Synaptic Vesicles Distribution and Analysing their Physical Characteristics
Khanmohammadi, Mahdieh
This Ph.D. thesis deals with mathematical and statistical modeling of synaptic vesicle distribution, shape, orientation and interactions. The first major part of the thesis treats the problem of determining the effect of stress on synaptic vesicle distribution and interactions. Serial section transmission electron microscopy is used to acquire images from two experimental groups of rats: 1) rats subjected to a behavioral model of stress and 2) rats subjected to sham stress as the control group. The synaptic vesicle distribution and interactions are modeled by employing a point process approach, based on differences of statistical measures within a section and the same measures between sections. Three-dimensional (3D) datasets are reconstructed by using image registration techniques and estimated thicknesses. We distinguish the effect of stress by estimating the synaptic vesicle densities and modeling…
WE-E-201-01: Use and Abuse of Common Statistics in Radiological Physics
Labby, Z. [University of Wisconsin (United States)]
2015-01-01
Physicists are often expected to have a solid grounding in experimental design and statistical analysis, sometimes filling in when biostatisticians or other experts are not available for consultation. Unfortunately, graduate education on these topics is seldom emphasized and few opportunities for continuing education exist. Clinical physicists incorporate new technology and methods into their practice based on published literature. A poor understanding of experimental design and analysis could result in inappropriate use of new techniques. Clinical physicists also improve current practice through quality initiatives that require sound experimental design and analysis. Academic physicists with a poor understanding of design and analysis may produce ambiguous (or misleading) results. This can result in unnecessary rewrites, publication rejection, and experimental redesign (wasting time, money, and effort). This symposium will provide a practical review of error and uncertainty, common study designs, and statistical tests. Instruction will primarily focus on practical implementation through examples, answering questions such as: where would you typically apply the test or design, and where is it typically misapplied (i.e., common pitfalls)? An analysis of error and uncertainty will also be explored using biological studies and associated modeling as a specific use case. Learning Objectives: (1) understand common experimental testing and clinical trial designs, what questions they can answer, and how to interpret the results; (2) determine where specific statistical tests are appropriate and identify common pitfalls; (3) understand how uncertainty and error are addressed in biological testing and associated biological modeling.
Lin, Jin-Ding; Lin, Ya-Wen; Yen, Chia-Feng; Loh, Ching-Hui; Chwo, Miao-Ju
2009-01-01
The purposes of the present study are to provide the first data on utilization, understanding and satisfaction of the National Health Insurance (NHI) premium subsidy for families of children with disabilities in Taipei. Data from the 2001 Taipei Early Intervention Utilization and Evaluation Survey for Aged 0-6 Children with Disabilities were…
Applications of statistical physics and information theory to the analysis of DNA sequences
Grosse, Ivo
2000-10-01
DNA carries the genetic information of most living organisms, and the goal of genome projects is to uncover that genetic information. One basic task in the analysis of DNA sequences is the recognition of protein-coding genes. Powerful computer programs for gene recognition have been developed, but most of them are based on statistical patterns that vary from species to species. In this thesis I address the question of whether there exist universal statistical patterns that are different in coding and noncoding DNA of all living species, regardless of their phylogenetic origin. In search of such species-independent patterns I study the mutual information function of genomic DNA sequences, and find that it shows persistent period-three oscillations. To understand the biological origin of the observed period-three oscillations, I compare the mutual information function of genomic DNA sequences to the mutual information function of stochastic model sequences. I find that the pseudo-exon model is able to reproduce the mutual information function of genomic DNA sequences. Moreover, I find that a generalization of the pseudo-exon model can connect the existence and the functional form of long-range correlations to the presence and the length distributions of coding and noncoding regions. Based on these theoretical studies I am able to find an information-theoretical quantity, the average mutual information (AMI), whose probability distributions are significantly different in coding and noncoding DNA, while they are almost identical in all studied species. These findings show that there exist universal statistical patterns that are different in coding and noncoding DNA of all studied species, and they suggest that the AMI may be used to identify genes in different living species, irrespective of their taxonomic origin.
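The mutual information function described above can be estimated directly from a symbol sequence; the codon-position biases in the toy sequence below are invented for illustration (not taken from real genomes) but reproduce the period-three mechanism, namely position-dependent base composition:

```python
import math
import random
from collections import Counter

def mutual_information(seq, k):
    """Plug-in estimate, in bits, of the mutual information between
    symbols k positions apart, pooled over all positions."""
    pairs = list(zip(seq, seq[k:]))
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(a for a, _ in pairs)
    py = Counter(b for _, b in pairs)
    return sum((c / n) * math.log2(c * n / (px[a] * py[b]))
               for (a, b), c in pxy.items())

random.seed(0)
# Synthetic "coding" sequence: each codon position has its own base bias,
# which yields elevated I(k) at k = 3, 6, ... (the period-three signal).
seq = "".join(random.choice("GGGA") + random.choice("ACGT") + random.choice("TTTC")
              for _ in range(2000))
print([round(mutual_information(seq, k), 3) for k in range(1, 7)])
```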
A generalization of random matrix theory and its application to statistical physics.
Wang, Duan; Zhang, Xin; Horvatic, Davor; Podobnik, Boris; Eugene Stanley, H
2017-02-01
To study the statistical structure of cross-correlations in empirical data, we generalize random matrix theory and propose a new method of cross-correlation analysis, known as autoregressive random matrix theory (ARRMT). ARRMT takes into account the influence of auto-correlations in the study of cross-correlations in multiple time series. We first analytically and numerically determine how auto-correlations affect the eigenvalue distribution of the correlation matrix. Then we introduce ARRMT with a detailed procedure for how to implement the method. Finally, we illustrate the method using two examples taken from inflation rates and air pressure data for 95 US cities.
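The effect ARRMT corrects for can be demonstrated numerically: auto-correlation in each series widens the eigenvalue distribution of the empirical correlation matrix even when the series are mutually independent. The parameters below are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
N, T, phi = 50, 1000, 0.9          # number of series, length, AR(1) coefficient

def eigvals_of_corr(x):
    """Eigenvalues of the N x N empirical correlation matrix of the rows of x."""
    return np.linalg.eigvalsh(np.corrcoef(x))

# Mutually independent white-noise series vs. independent AR(1) series.
white = rng.standard_normal((N, T))
ar1 = np.empty((N, T))
ar1[:, 0] = rng.standard_normal(N)
for t in range(1, T):
    ar1[:, t] = phi * ar1[:, t - 1] + rng.standard_normal(N)

print(eigvals_of_corr(white).max(), eigvals_of_corr(ar1).max())
# The largest eigenvalue of the AR(1) ensemble typically exceeds the
# white-noise (Marchenko-Pastur) edge, mimicking spurious cross-correlation.
```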
Dorfman, Kevin D
2018-02-01
The development of bright bisintercalating dyes for deoxyribonucleic acid (DNA) in the 1990s, most notably YOYO-1, revolutionized the field of polymer physics in the ensuing years. These dyes, in conjunction with modern molecular biology techniques, permit the facile observation of polymer dynamics via fluorescence microscopy and thus direct tests of different theories of polymer dynamics. At the same time, they have played a key role in advancing an emerging next-generation method known as genome mapping in nanochannels. The effect of intercalation on the bending energy of DNA as embodied by a change in its statistical segment length (or, alternatively, its persistence length) has been the subject of significant controversy. The precise value of the statistical segment length is critical for the proper interpretation of polymer physics experiments and controls the phenomena underlying the aforementioned genomics technology. In this perspective, we briefly review the model of DNA as a wormlike chain and a trio of methods (light scattering, optical or magnetic tweezers, and atomic force microscopy (AFM)) that have been used to determine the statistical segment length of DNA. We then outline the disagreement in the literature over the role of bisintercalation on the bending energy of DNA, and how a multiscale biomechanical approach could provide an important model for this scientifically and technologically relevant problem.
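For reference, the wormlike-chain quantities mentioned above are related by standard formulas: for a chain of contour length $L$ and persistence length $\ell_p$, the statistical (Kuhn) segment length is $b = 2\ell_p$, and the mean-square end-to-end distance is

```latex
\langle R^2 \rangle \;=\; 2\,\ell_p L \;-\; 2\,\ell_p^2\left(1 - e^{-L/\ell_p}\right)
\;\longrightarrow\; 2\,\ell_p L \;=\; b\,L \qquad (L \gg \ell_p),
```

which is why a change in $\ell_p$ upon intercalation directly shifts the apparent chain statistics measured in the experiments cited above.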
Statistical Analysis Methods for Physics Models Verification and Validation
De Luca, Silvia
2017-01-01
The validation and verification process is a fundamental step for any software, like Geant4 and GeantV, which aims to perform data simulation using physics models and Monte Carlo techniques. As experimental physicists, we have to face the problem of comparing the results obtained using simulations with what the experiments actually observed. One way to solve the problem is to perform a consistency test. Within the Geant group, we developed a compact C++ library which will be added to the automated validation process on the Geant Validation Portal.
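A minimal sketch of the kind of consistency test described above is Pearson's chi-square comparison of a simulated histogram against observed counts; the bin counts below are invented for illustration and are not from the Geant validation suite:

```python
# Compare an "observed" histogram with a simulation prediction using
# Pearson's chi-square statistic (totals must match for this form).
observed = [102, 98, 110, 95, 95]      # hypothetical experimental counts
expected = [100, 100, 100, 100, 100]   # hypothetical simulation prediction

stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
dof = len(observed) - 1
print(stat, dof)
# Compare with the critical value chi^2_{0.95, dof=4} ~= 9.49: here
# stat = 1.58 < 9.49, so simulation and data are statistically consistent.
```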
Population aging and its impacts: strategies of the health-care system in Taipei.
Lin, Ming-Hsien; Chou, Ming-Yueh; Liang, Chih-Kuang; Peng, Li-Ning; Chen, Liang-Kung
2010-11-01
Taiwan is one of the fastest aging countries in the world. As such, the government has developed various strategies to promote an age-friendly health-care system. Health services are supported by National Health Insurance (NHI), which insures over 97% of citizens and over 99% of health-care institutes. The current health-care system has difficulties in caring for older patients with multiple comorbidities, complex care needs, functional impairments, and post-acute care needs. Taipei, an international metropolis with a well-preserved tradition of filial piety in Chinese societies, has developed various strategies to overcome the aforementioned barriers to an age-friendly health-care system. These include an emphasis on general medical care and a holistic approach in all specialties, development of a geriatrics specialty training program, development of post-acute services, and strengthening of linkages between health and social care services. Despite achievements thus far, challenges still include creating a more extensive integration between medical specialties, promotion of an interdisciplinary care model across specialties and health-care settings, and integration of health and social care services. The experiences of Taipei in developing an age-friendly health-care service system may be a culturally appropriate model for other Chinese and Asian communities. Copyright © 2010 Elsevier B.V. All rights reserved.
Depressive symptoms of elderly Chinese in Guangzhou, Hong Kong, and Taipei.
Lai, Daniel W L
2009-09-01
Understanding the socio-cultural context is an important pre-requisite for understanding global aging and mental health. This study aimed to examine the variation in the types of depressive symptoms of aging Chinese in three ethnic Chinese societies. Data were based on a mixed purposive and random sample of aging Chinese in Guangzhou, Hong Kong, and Taipei. The 891 Chinese participants of 65 years or older were included. Depressive symptoms were measured by a Chinese 15-item Geriatric Depression Scale. Factor analysis was used to identify the factor structure of the scale when used with elderly Chinese in the three cities. There are 'within-ethnic group' differences in manifestation of depressive symptoms. Symptoms of the elderly Chinese in Guangzhou and Hong Kong were similarly related to items that indicate uncertainty and disinterest in living. The ones in Taipei expressed symptoms indicating disinterest and a negative mood. These differences were probably due to the variations in the socio-cultural, demographic, and structural characteristics among the three cities. Depressive symptoms can be culturally related and manifested differently by people sharing a similar ethnicity. The same ethnicity does not mean homogeneity. The findings should be useful for mental health practitioners in Western societies working with older Chinese immigrants. Knowing the mental health characteristics of these client groups will facilitate the designing of appropriate assessment and intervention tools to fit the culturally unique mental health needs of different subgroups in these ethno-cultural communities.
Risk factors for unfavorable outcome of pulmonary tuberculosis in adults in Taipei, Taiwan.
Yen, Yung-Feng; Yen, Muh-Yong; Shih, Hsiu-Chen; Deng, Chung-Yeh
2012-05-01
This study was undertaken to identify factors associated with unfavorable outcomes in patients with pulmonary tuberculosis (PTB) in Taipei, Taiwan in 2007-2008. Taiwanese adults with culture-positive PTB diagnosed in Taipei during the study period were included in this retrospective cohort study. Unfavorable outcomes were classified as treatment default, death, treatment failure, or transfer. Of 1616 eligible patients, 22.6% (365) had unfavorable outcomes, mainly death. After controlling for patient sociodemographic factors, clinical findings, and underlying disease, independent risk factors for unfavorable outcomes included advanced age, unemployment, end-stage renal disease requiring dialysis, malignancy, acid-fast bacillus smear positivity, multidrug-resistant TB, and notification from an ordinary ward or intensive care unit. In contrast, patients receiving directly observed treatment, and those with a high school or higher education, were significantly less likely to have unfavorable outcomes. This study advanced our understanding by revealing that a high school or higher education might lower the risk of an unfavorable outcome. Our results also confirmed the risk factors for unfavorable outcomes shown in previous research. Future TB control programmes in Taiwan should particularly target high-risk patients, including those with lower educational levels. Copyright © 2012 Royal Society of Tropical Medicine and Hygiene. Published by Elsevier Ltd. All rights reserved.
Wang, Yunn-Jinn; Chen, Chi-Feng; Lin, Jen-Yang
2013-10-16
Pollutants deposited on road surfaces and distributed in the environment are a source of nonpoint pollution. Field data are traditionally hard to collect from roads because of constant traffic. In this study, in cooperation with the traffic administration, the dry deposition on and road runoff from urban roads were measured in Taipei City and New Taipei City, Taiwan. The results showed that the dry deposition is 2.01-5.14 g/m² per day and 78-87% of these solids are in the 75-300 µm size range. The heavy metals in the dry-deposited particles are mainly Fe, Zn, and Na, with average concentrations of 34,978, 1,519 and 1,502 ppm, respectively. Elevated express roads show the highest heavy metal concentrations. Not only the number of vehicles, but also the speed of the traffic should be considered as factors that influence road pollution, as high speeds may accelerate vehicle wear and deposit more heavy metals on road surfaces. In addition to dry deposition, the runoff and water quality were analyzed every five minutes during the first two hours of storm events to capture the properties of the first-flush road runoff. The sample mean concentration (SMC) from three roads demonstrated that the first-flush runoff had a high pollution content, notably for suspended solids (SS), chemical oxygen demand (COD), oil and grease, Pb, and Zn. Regular sweeping and onsite water treatment facilities are suggested to minimize the pollution from urban roads.
Presbycusis among older Chinese people in Taipei, Taiwan: a community-based study.
Chang, Hsin-Pin; Chou, Pesus
2007-12-01
The purpose of this study was to estimate the prevalence and severity of presbycusis in older Chinese people in Taipei, Taiwan. Pure-tone audiometry and a questionnaire were administered to a randomly recruited cohort of people > 65 years old (n=1221) from a community in Taipei. The study cohort showed pure-tone thresholds worsening, especially at frequencies > 2 kHz, with increasing age. The mean pure-tone average at speech frequencies (0.5, 1, and 2 kHz) of the better ear of subjects, stratified by five-year age groups, ranged from 34.9 dB hearing level (HL) to 46.4 dB HL. The pure-tone average at speech frequencies in women was slightly higher than that in men in all age groups. The prevalence of presbycusis (M3 ≥ 55 dB HL) was 1.6% (65-69 years), 3.2% (70-74 years), 7.5% (75-79 years), and 14.9% (≥ 80 years). Persistent tinnitus was present in 13.9% of subjects, and 18.8% of subjects had a history of vertigo. Of subjects with a clinically evident hearing impairment (M3 ≥ 55 dB HL), 18.4% used hearing aids. These data provide estimates of the prevalence and severity of presbycusis in community-dwelling older persons in Taiwan.
The Web 2.0 concept of urban disaster information in Taipei city: Mobile application development
Tsai, Yuan-Fan; Chan, Chun-Hsiang; Wang, Han; Pan, Yun-Xing; Lin, Gine-Jie
2014-05-01
In recent years, due to global warming and climate anomalies, disasters such as floods and debris flows have become more frequent. These disasters cause loss of life and property. At the same time, the smartphone makes our lives more convenient by delivering large amounts of information instantly. This study uses Eclipse as the development platform and designs an urban disaster information mobile application (APP) for debris flow and flood in the Taipei city area. In this study, an urban disaster information APP, Taipei Let You Know, has been successfully developed under the Android development environment, combining disaster indicators and disaster warning values. To mitigate the delay of official information, this APP not only shows official information, but also offers a Web 2.0 platform for public users to upload disaster information instantly. As a result, losses of life and property can decrease, and disaster information delivery can become faster and more accurate by utilizing this APP in the future.
Comparison of the landslide susceptibility models in Taipei Water Source Domain, Taiwan
WU, C. Y.; Yeh, Y. C.; Chou, T. H.
2017-12-01
The Taipei Water Source Domain, located to the southeast of the Taipei metropolis, is the main water source for the region. Recently, downstream turbidity has often soared during typhoons because of upstream landslides. Landslide susceptibility should therefore be analysed to assess the zones affected by different rainfall events and to ensure that the domain can continue to supply sufficient, high-quality water. Generally, landslide susceptibility models can be established from either a long-term landslide inventory or a specified landslide event. Because some areas lack a long-term inventory, event-based susceptibility models are widely used. However, inventory-based and event-based models may yield dissimilar susceptibility maps for the same area. The purposes of this study were therefore to compare landslide susceptibility maps derived from inventory-based and event-based models, and to determine how to select a representative event for inclusion in the susceptibility model. Landslide inventories from Typhoon Tim (July 1994) and Typhoon Soudelor (August 2015) were collected and used to establish the inventory-based model. The landslides caused by Typhoon Nari, together with rainfall data, were used to establish the event-based model. The results indicated that the high-susceptibility slope units were located in the middle to upstream Nan-Shih Stream basin.
Thermal transport in low dimensions from statistical physics to nanoscale heat transfer
2016-01-01
Understanding the non-equilibrium properties of classical and quantum many-particle systems is one of the goals of contemporary statistical mechanics. Besides its intrinsic interest for the theoretical foundations of irreversible thermodynamics (e.g., Fourier's law of heat conduction), this topic is also relevant for developing innovative ideas for nanoscale thermal management, with possible future applications to nanotechnologies and effective energy resources. The first part of the volume (Chapters 1-6) describes the basic models, the phenomenology, and the various theoretical approaches to understanding heat transport in low-dimensional lattices (1D and 2D). The methods described include equilibrium and non-equilibrium molecular dynamics simulations, hydrodynamic and kinetic approaches, and the solution of stochastic models. The second part (Chapters 7-10) deals with applications to nano- and microscale heat transfer, such as phononic transport in carbon-based nanomaterials, including the prominent case of na...
Meng-Hsuan Cheng
2016-03-01
Many studies have examined the effects of air pollution on daily mortality over the past two decades. However, information on the relationship between levels of coarse particles (PM2.5–10) and daily mortality is relatively sparse due to the limited availability of monitoring data, and the results are inconsistent. In the current study, the association between coarse particle levels and daily mortality in Taipei, Taiwan's largest city, which has a subtropical climate, was examined for the period 2006–2008 using a time-stratified case-crossover analysis. In the single-pollutant model (without adjustment for other pollutants), PM2.5–10 showed a statistically significant association with total mortality on both warm and cool days, with an interquartile range increase associated with an 11% (95% CI = 6%–17%) and 4% (95% CI = 1%–7%) rise in the number of total deaths, respectively. In two-pollutant models, PM2.5–10 retained significant effects on total mortality after the inclusion of SO2 and O3 on both warm and cool days. We observed no significant associations between PM2.5–10 and daily mortality from respiratory diseases on either warm or cool days. For daily mortality from circulatory diseases, the effect of PM2.5–10 remained significant when SO2 or O3 was added to the regression model on both warm and cool days. Future studies of this type in cities with varying climates and cultures are needed.
Important contributions of M.C. Wang and C.S. Wang Chang to non-equilibrium statistical physics
Liu Jixing
2004-01-01
In the middle of the 20th century, two Chinese women physicists, Ming-Chen Wang and Cheng-Shu Wang Chang, made great contributions to statistical physics. The famous review article 'On the theory of the Brownian motion II' by Ming-Chen Wang and G.E. Uhlenbeck, published in Rev. Mod. Phys. in 1945, provided a complete scientific classification of stochastic processes which is still adopted by the scientific community as the standard classification. The Wang-Chang-Uhlenbeck (WCU) equation, proposed jointly by C.S. Wang-Chang and Uhlenbeck, became the fundamental kinetic equation in the physics literature for the treatment of transport properties of polyatomic gases with internal degrees of freedom. These important scientific contributions are analyzed and reviewed here.
Farrell, Brian F.; Ioannou, Petros J.
2017-08-01
This paper describes a study of the self-sustaining process in wall turbulence. The study is based on a second order statistical state dynamics model of Couette flow in which the state variables are the streamwise mean flow (first cumulant) and perturbation covariance (second cumulant). This statistical state dynamics model is closed by either setting the third cumulant to zero or by replacing it with a stochastic parametrization. Statistical state dynamics models with this form are referred to as S3T models. S3T models have been shown to self-sustain turbulence with a mean flow and second order perturbation structure similar to that obtained by direct numerical simulation of the equations of motion. The use of a statistical state dynamics model to study the physical mechanisms underlying turbulence has important advantages over the traditional approach of studying the dynamics of individual realizations of turbulence. One advantage is that the analytical structure of S3T statistical state dynamics models isolates the interaction between the mean flow and the perturbation components of the turbulence. Isolation of the interaction between these components reveals how this interaction underlies both the maintenance of the turbulence variance by transfer of energy from the externally driven flow to the perturbation components as well as the enforcement of the observed statistical mean turbulent state by feedback regulation between the mean and perturbation fields. Another advantage of studying turbulence using statistical state dynamics models of S3T form is that the analytical structure of S3T turbulence can be completely characterized. For example, the perturbation component of turbulence in the S3T system is demonstrably maintained by a parametric perturbation growth mechanism in which fluctuation of the mean flow maintains the perturbation field which in turn maintains the mean flow fluctuations in a synergistic interaction. Furthermore, the equilibrium
A Statistical Study of Socio-economic and Physical Risk Factors of Myocardial Infarction
M. Alamgir
2005-07-01
A sample of 506 patients from various hospitals in Peshawar was examined to determine significant socio-economic and physical risk factors for myocardial infarction (heart attack). The factors examined were smoking (S), hypertension (H), cholesterol (C), diabetes (D), family history (F), residence (R), owning a house (OH), number of dependents (ND), household income (I), obesity, and lack of exercise (E). The response variable, MI, was binary; therefore, logistic regression was applied (using the GLIM and SPSS packages) to analyze the data and to select a parsimonious model. Logistic regression models indicating significant risk factors were obtained for both sexes combined and for males and females separately. The best-selected model for both sexes is of factors S, F, D, H and C. The best-selected model for males is of factors CIFH, S, H, D, C and F, while the best-selected model for females is of factors D, H, C and F.
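The study above fits binary-response logistic regressions and reports significant risk factors. As a minimal sketch of that technique (not the authors' GLIM/SPSS code: the cohort, predictors, and coefficients below are synthetic), a logistic model with two binary risk factors can be fitted by gradient ascent on the log-likelihood:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Fit a logistic regression by gradient ascent on the log-likelihood."""
    n, p = len(X), len(X[0])
    w = [0.0] * (p + 1)                       # w[0] is the intercept
    for _ in range(epochs):
        grad = [0.0] * (p + 1)
        for xi, yi in zip(X, y):
            err = yi - sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj + lr * gj / n for wj, gj in zip(w, grad)]
    return w

# Synthetic cohort: smoking (S) and hypertension (H) are planted as risk
# factors; the planted coefficients are illustrative, not the study's values.
random.seed(0)
X, y = [], []
for _ in range(500):
    s, h = random.randint(0, 1), random.randint(0, 1)
    X.append([s, h])
    y.append(1 if random.random() < sigmoid(-2.0 + 1.5 * s + 1.0 * h) else 0)

w = fit_logistic(X, y)
odds_ratio_smoking = math.exp(w[1])           # exp(beta) estimates the odds ratio
```

The exponential of a fitted coefficient estimates the odds ratio for that factor, which is the usual form in which "significant risk factors" such as smoking are reported.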
Teixeira, Marilia S.; Pinto, Nivia G.P.; Barroso, Regina C.; Oliveira, Luis F., E-mail: mariliasilvat@gmail.co, E-mail: lfolive@oi.com.b, E-mail: cely_barroso@hotmail.co, E-mail: nitatag@gmail.co [Universidade do Estado do Rio de Janeiro (UERJ), Rio de Janeiro, RJ (Brazil). Inst. de Fisica
2009-07-01
The objective of biomedical research with radiation of different natures is to contribute to the understanding of the basic physics and biochemistry of biological systems, disease diagnosis, and the development of therapeutic techniques. The main benefits are the cure of tumors through therapy, the early detection of disease through diagnosis, use as a prophylactic measure for blood transfusion, etc. A better understanding of the biological interactions occurring after exposure to radiation is therefore necessary for the optimization of therapeutic procedures and for strategies to reduce radio-induced effects. The applied physics group of the Physics Institute of UERJ has been working on the characterization of biological samples (human tissues, teeth, saliva, soil, plants, sediments, air, water, organic matrices, ceramics, fossil material, among others) using X-ray diffraction and X-ray fluorescence. The application of these techniques to the measurement, analysis, and interpretation of the characteristics of biological tissues is attracting considerable interest in medical and environmental physics. All quantitative data analysis should begin with the calculation of descriptive statistics (means and standard deviations) in order to obtain a preliminary notion of what the analysis will reveal. It is well known that the high standard deviations found in experimental measurements of biological samples can be attributed to biological factors, due to the specific characteristics of each individual (age, gender, environment, alimentary habits, etc.). The main objective of this work is the development of a program applying specific statistical methods for the optimization of experimental data analysis. As the specialized programs for this analysis are proprietary, another objective of this work is the implementation of a code which is free and can be shared by other research groups. As the program developed since the
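The descriptive first step that the abstract recommends (means and standard deviations before any further analysis) is straightforward to implement. This is a generic standard-library sketch with made-up numbers, not the group's own program:

```python
import math

def describe(values):
    """Mean and sample standard deviation -- the descriptive first step
    recommended before any further quantitative analysis."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)   # sample variance
    return mean, math.sqrt(var)

# Illustrative measurements (arbitrary units), not data from the study.
mean, sd = describe([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
```

A large `sd` relative to `mean` is exactly the signature the abstract attributes to inter-individual biological variability.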
Statistical Modelling of Global Tectonic Activity and some Physical Consequences of its Results
Konstantin Statnikov
2015-02-01
Based on the analysis of a global earthquake databank for the last thirty years, a global tectonic activity indicator was proposed, comprising a weekly, globally averaged mean earthquake magnitude. It was shown that 84% of the indicator's variability is a harmonic oscillation with a fundamental period of 37.2 years, twice the maximum period in the tidal oscillation spectrum (18.6 years). From this observation, a conclusion was drawn that parametric resonance (PR) exists between global tectonic activity and low-frequency tides. The conclusion was also confirmed by the existence of a statistically significant PR response at the second-lowest tidal frequency, i.e., 182.6 days. It was shown that the global earthquake flow, with a coefficient of determination of 93%, is a sum of two Gaussian streams, nearly equally intense, with mean values of 23 and 83 events per week and standard deviations of 9 and 30 events per week, respectively. The ratios of the Earth's periphery to the mean time interval between earthquakes in the first and second flow modes described above match, by order of magnitude, the sound velocity in fluid (~1500 m/s) and in elastic media (5500 m/s).
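The two-Gaussian decomposition of the weekly earthquake flow quoted above can be reproduced numerically. This standard-library sketch takes the means and standard deviations from the abstract and assumes equal weights for the "nearly equally intense" streams (the weights are an assumption, not a reported value); it evaluates and samples the mixture density:

```python
import math
import random

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def mixture_pdf(x, comps):
    """Density of a weighted sum of Gaussian components (weight, mean, sd)."""
    return sum(w * gauss_pdf(x, mu, s) for w, mu, s in comps)

# Means 23 and 83 events/week, standard deviations 9 and 30 events/week
# (from the abstract); equal weights 0.5/0.5 are assumed.
comps = [(0.5, 23.0, 9.0), (0.5, 83.0, 30.0)]

def sample_mixture(comps, rng):
    """Draw one value: pick a component by weight, then sample it."""
    w, mu, s = rng.choices(comps, weights=[c[0] for c in comps])[0]
    return rng.gauss(mu, s)

rng = random.Random(42)
draws = [sample_mixture(comps, rng) for _ in range(10000)]
mean = sum(draws) / len(draws)   # population mean: 0.5*23 + 0.5*83 = 53
```

With these parameters the density is bimodal: the value at the first mode (23 events/week) is well above the value in the trough between the two streams.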
Ingber, Lester; Nunez, Paul L
2011-02-01
The dynamic behavior of scalp potentials (EEG) is apparently due to some combination of global and local processes with important top-down and bottom-up interactions across spatial scales. In treating global mechanisms, we stress the importance of myelinated axon propagation delays and periodic boundary conditions in the cortical-white matter system, which is topologically close to a spherical shell. By contrast, the proposed local mechanisms are multiscale interactions between cortical columns via short-ranged non-myelinated fibers. A mechanical model consisting of a stretched string with attached nonlinear springs demonstrates the general idea. The string produces standing waves analogous to large-scale coherent EEG observed in some brain states. The attached springs are analogous to the smaller (mesoscopic) scale columnar dynamics. Generally, we expect string displacement and EEG at all scales to result from both global and local phenomena. A statistical mechanics of neocortical interactions (SMNI) calculates oscillatory behavior consistent with typical EEG, within columns, between neighboring columns via short-ranged non-myelinated fibers, across cortical regions via myelinated fibers, and also derives a string equation consistent with the global EEG model. Copyright © 2010 Elsevier Inc. All rights reserved.
Blessing of dimensionality: mathematical foundations of the statistical physics of data.
Gorban, A N; Tyukin, I Y
2018-04-28
The concentrations of measure phenomena were discovered as the mathematical background to statistical mechanics at the end of the nineteenth/beginning of the twentieth century and have been explored in mathematics ever since. At the beginning of the twenty-first century, it became clear that the proper utilization of these phenomena in machine learning might transform the curse of dimensionality into the blessing of dimensionality. This paper summarizes recently discovered phenomena of measure concentration which drastically simplify some machine learning problems in high dimension, and allow us to correct legacy artificial intelligence systems. The classical concentration of measure theorems state that i.i.d. random points are concentrated in a thin layer near a surface (a sphere or equators of a sphere, an average or median-level set of energy or another Lipschitz function, etc.). The new stochastic separation theorems describe the thin structure of these thin layers: the random points are not only concentrated in a thin layer but are all linearly separable from the rest of the set, even for exponentially large random sets. The linear functionals for separation of points can be selected in the form of the linear Fisher's discriminant. All artificial intelligence systems make errors. Non-destructive correction requires separation of the situations (samples) with errors from the samples corresponding to correct behaviour by a simple and robust classifier. The stochastic separation theorems provide us with such classifiers and determine a non-iterative (one-shot) procedure for their construction. This article is part of the theme issue 'Hilbert's sixth problem'. © 2018 The Author(s).
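The separability claim is easy to demonstrate numerically. The sketch below is a simplified illustration, not the authors' construction: it uses i.i.d. standard Gaussian points and an identity-covariance linear functional in place of the full Fisher discriminant, and checks that each of m random points in dimension d is linearly separated from all the others:

```python
import random

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def separable_from_rest(x, others, alpha=0.5):
    """True if the linear functional z -> <z, x> separates x from every
    point in `others` at the threshold alpha * ||x||^2. This is a
    simplified (identity-covariance) stand-in for Fisher's discriminant."""
    thr = alpha * dot(x, x)
    return all(dot(y, x) < thr for y in others)

rng = random.Random(1)
d, m = 200, 100                     # dimension and size of the random set
points = [[rng.gauss(0.0, 1.0) for _ in range(d)] for _ in range(m)]

# Concentration of measure: in high dimension, essentially every random
# point is linearly separable from all the others by such a functional.
n_separable = sum(separable_from_rest(p, points[:i] + points[i + 1:])
                  for i, p in enumerate(points))
```

In low dimension (try `d = 2`) many points fail this test; in high dimension the count equals m, which is the "blessing of dimensionality" the paper describes.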
Vega, H.J. de; Sanchez, N.
2002-01-01
We complete our study of the self-gravitating gas by computing the fluctuations around the saddle point solution for the three statistical ensembles (grand canonical, canonical and microcanonical). Although the saddle point is the same for the three ensembles, the fluctuations change from one ensemble to the other. The zeroes of the small-fluctuations determinant determine the position of the critical points for each ensemble. This yields the domains of validity of the mean field approach. Only the S-wave determinant exhibits critical points. Closed formulae for the S- and P-wave determinants of fluctuations are derived. The local properties of the self-gravitating gas in thermodynamic equilibrium are studied in detail. The pressure, energy density, particle density and speed of sound are computed and analyzed as functions of the position. The equation of state turns out to be locally p(r⃗) = T ρ_V(r⃗), as for the ideal gas. Starting from the partition function of the self-gravitating gas, we prove in this microscopic calculation that the hydrostatic description yielding locally the ideal gas equation of state is exact in the N = ∞ limit. The dilute nature of the thermodynamic limit (N ∼ L → ∞ with N/L fixed) together with the long-range nature of the gravitational forces play a crucial role in obtaining such an ideal gas equation. The self-gravitating gas being inhomogeneous, we have PV/[NT] = f(η) ≤ 1 for any finite volume V. The inhomogeneous particle distribution in the ground state suggests a fractal distribution with Hausdorff dimension D, with D slowly decreasing with increasing density, 1 < D < 3. The average distance between particles is computed in Monte Carlo simulations and analytically in the mean field approach. A dramatic drop at the phase transition is exhibited, clearly illustrating the properties of the collapse
Statistical homogeneity tests applied to large data sets from high energy physics experiments
Trusina, J.; Franc, J.; Kůs, V.
2017-12-01
Homogeneity tests are used in high energy physics to verify simulated Monte Carlo samples, i.e., to check whether they have the same distribution as the data measured by the particle detector. The Kolmogorov-Smirnov, χ², and Anderson-Darling tests are the techniques most often used to assess the samples' homogeneity. Since MC generators produce entries from different models, each entry has to be re-weighted to obtain the same sample size as the measured data. One approach to homogeneity testing is binning; if we do not want to lose any information, however, we can apply generalized tests based on weighted empirical distribution functions. In this paper, we propose such generalized weighted homogeneity tests and introduce some of their asymptotic properties. We present the results of a numerical analysis focusing on estimates of the type-I error and the power of the tests. Finally, we present an application of our homogeneity tests to data from the DØ experiment at Fermilab.
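The weighted empirical-distribution-function idea above can be sketched for the Kolmogorov-Smirnov case: the statistic is the maximum distance between two weighted empirical CDFs, and unit weights recover the classical unweighted two-sample statistic. This is an illustrative O(n²) implementation, not the authors' code:

```python
def ks_statistic(sample1, sample2, w1=None, w2=None):
    """Two-sample Kolmogorov-Smirnov statistic computed from (optionally
    weighted) empirical distribution functions. Unit weights reproduce
    the classical unweighted statistic."""
    w1 = w1 if w1 is not None else [1.0] * len(sample1)
    w2 = w2 if w2 is not None else [1.0] * len(sample2)
    t1, t2 = sum(w1), sum(w2)
    d = 0.0
    for x in sorted(set(sample1) | set(sample2)):
        # Weighted empirical CDFs evaluated at x.
        f1 = sum(w for v, w in zip(sample1, w1) if v <= x) / t1
        f2 = sum(w for v, w in zip(sample2, w2) if v <= x) / t2
        d = max(d, abs(f1 - f2))
    return d

# Identical samples give statistic 0; fully separated samples give 1.
same = ks_statistic([1, 2, 3], [1, 2, 3])
far = ks_statistic([1, 2, 3], [10, 20, 30])
```

For the large weighted samples typical of MC production one would use a sorted-merge O(n log n) formulation, but the statistic itself is unchanged.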
Teaching at the edge of knowledge: Non-equilibrium statistical physics
Schmittmann, Beate
2007-03-01
As physicists become increasingly interested in biological problems, we frequently find ourselves confronted with complex open systems, involving many interacting constituents and characterized by non-vanishing fluxes of mass or energy. Faced with the task of predicting macroscopic behaviors from microscopic information for these non-equilibrium systems, the familiar Gibbs-Boltzmann framework fails. The development of a comprehensive theoretical characterization of non-equilibrium behavior is one of the key challenges of modern condensed matter physics. In its absence, several approaches have been developed, from master equations to thermostatted molecular dynamics, which provide key insights into the rich and often surprising phenomenology of systems far from equilibrium. In my talk, I will address some of these methods, selecting those that are most relevant for a broad range of interdisciplinary problems from biology to traffic, finance, and sociology. The ``portability'' of these methods makes them valuable for graduate students from a variety of disciplines. To illustrate how different methods can complement each other when probing a problem from, e.g., the life sciences, I will discuss some recent attempts at modeling translation, i.e., the process by which the genetic information encoded on an mRNA is translated into the corresponding protein.
Liu, C.-M.; Liou, M.-L.; Yeh, S.-C.; Shang, N.-C.
2009-01-01
In recent years, many national and local governments have committed to specific GHG (greenhouse gas) reduction goals targeted many years ahead. In 2005, the Taipei City government announced that Taipei's total GHG emissions in 2015 would return to the 2005 level and then fall to 75% of that level by 2030. However, based on the estimated energy consumption and GHG emissions and the emission reduction plans proposed by the local government, it is clear that these goals will not be accomplished. In Taipei, the residential and commercial sector contributes more than 78% of total GHG emissions. Thus, in a business-as-usual scenario, total GHG emissions in 2030 would be 79% higher than in 2005, far above the proclaimed target. As many key factors are beyond the local government's control, this study proposes a target-aimed strategy design process that examines changes in Taipei and identifies major targets. It is demonstrated that such a universally applicable approach will give the public more confidence in working toward the expected GHG reduction goal
Ho, Wai-Chung; Law, Wing-Wah
2006-01-01
In the past, the music curricula of Hong Kong (HK), Mainland China and Taiwan have focused on Western music, but with the advent of music technology and the new tripartite paradigm of globalisation, localisation and Sinophilia this has begun to change. Hong Kong, Shanghai and Taipei share a common historical culture and their populations are…
Yung-Feng Yen
2015-05-01
Conclusion: Poor HRQOL was associated with a number of factors among IDUs at methadone clinics in Taipei, Taiwan. To improve HRQOL in this population, future programs should focus on IDUs with a history of drug overdose. In addition, methadone programs and social support should be integrated to improve HRQOL among this socially marginalized population.
Chen, Tzu-Ling; Tai, Chen-Jei; Chu, Yu-Roo; Han, Kuo-Chiang; Lin, Kuan-Chia; Chien, Li-Yin
2011-02-01
The objectives of this study were to identify cultural factors (including acculturation and breastfeeding cultures in subjects' native countries and those in mainstream Taiwanese society) and social support related to breastfeeding among immigrant mothers in Taiwan. This study was a cross-sectional survey performed from October 2007 through January 2008. The study participants were 210 immigrant mothers living in Taipei City. The prevalence of exclusive and partial breastfeeding at 3 months postpartum was 59.0% and 14.3%, respectively. Logistic regression analysis revealed that breastfeeding experience among mothers-in-law and the perceived level of acceptance of breastfeeding in Taiwan were positively associated with breastfeeding at 3 months postpartum. Immigrant women with a higher level of household activity support were less likely to breastfeed. Immigrant mothers in Taiwan usually come from cultures with a higher acceptance level for breastfeeding; however, their breastfeeding practices are more likely to be influenced by the mainstream culture in Taiwan.
Chia-Li Chen
2017-03-01
Although many museums nowadays provide multilingual services, translation in museums has not received enough attention from researchers. The issue of how ideology embedded in museum texts is translated is particularly under-researched. Since museums are often important sites for tourists to learn about a nation, translation plays a pivotal role in mediating how international visitors construct the host nation's identity. The translation of national identity is even more important when sensitive topics are dealt with, such as exhibitions of the past in memorial museums. This paper takes the Taipei 228 Memorial Museum as a case study to examine how Taiwanese identity is framed in the Chinese text and reframed in the English translation. The study found inconsistent historical perspectives embedded in both texts, particularly in the English translation. We argue that, without awareness of the ideological assumptions embedded in translations, museums run the risk of sending unintended messages to international visitors.
A Comparative Study of the International Perspectives of Six-Graders in Taipei and Shanghai
Yueh-Chun Huang
2015-01-01
With the emergence of globalization, it has become increasingly important for all citizens to possess an international perspective, and the trend of internationalizing educational systems has emerged in various countries. The degree to which students possess international perspectives is thus an important topic worth studying. The purpose of this study was to develop a questionnaire to investigate the current status of, and the differences between, the international perspectives of sixth graders in Taipei and Shanghai. A total of 1,300 sixth graders were selected by stratified random sampling from the two cities, with 1,111 valid questionnaires returned for analysis. A significant difference was found in the sixth graders' international perspectives between the two cities, and differences in their demographic characteristics were also identified. More similarities than differences were identified in their backgrounds and experiences, while both similarities and differences were found in their parents' levels of education and occupation.
Vladimir Sokolov
2009-01-01
The response of the Taipei basin to earthquake excitation was studied using records of recent earthquakes. The strong-motion database includes records obtained at 32 stations of the Taipei TSMIP network from 83 deep and 142 shallow earthquakes (M > 4.0) that occurred in 1992 - 2004. The characteristics of frequency-dependent site response were obtained as spectral ratios between the actual earthquake records (horizontal components) and those modelled for a hypothetical Very Hard Rock (VHR) condition. Models for the VHR spectra of Taiwan earthquakes were recently proposed by Sokolov et al. (2005b, 2006). Analysis of the site response characteristics and comparison with simple 1D models of the soil column led to the following conclusions: (1) the spectral ratios throughout the basin obtained from deep earthquakes (depth > 35 km) agree well with theoretical ratios calculated from 1D models constructed using available geological and geotechnical data; (2) the spectral ratios obtained from shallow earthquakes show the influence of (a) surface waves generated when travelling from distant sources to the basin and (b) relatively low-frequency (< 1 - 2 Hz) waves generated within the basin; (3) some shallow earthquakes produce extremely high amplification at frequencies of 0.3 - 1 Hz within the basin, which may be dangerous for high-rise buildings and highway bridges; (4) the results may be used in probabilistic seismic microzonation of the basin when many possible earthquakes located at various distances are considered. 2D and 3D simulation is necessary to model the seismic influence of particularly large earthquakes.
Association between childhood sexual abuse and adverse psychological outcomes among youth in Taipei.
Li, Nan; Ahmed, Saifuddin; Zabin, Laurie S
2012-03-01
The objective of this study was to examine the relationship between a history of childhood sexual abuse (CSA) and negative psychological consequences in adulthood, controlling for family environments and Confucian values. The data used in this study were collected from Taipei. The final analysis sample comprised 4,084 participants aged 15-24 years. Three sets of logistic regression models were fitted to verify the association between CSA and negative psychological outcomes. Sociodemographic variables, household instability, and parenting variables, as well as Confucian value variables were controlled in models step by step. The overall prevalence of CSA in our analysis sample was 5.2%. The overall prevalence of depression, anxiety, and suicidal ideation among Taipei respondents was 11.8%, 16.4%, and 16.7%, respectively, but young people who experienced CSA had significantly higher rates of all three than young adults who had not experienced CSA. After controlling for other covariates, the odds ratios of depression, anxiety, and suicidal ideation associated with a history of CSA were 1.78 (95% confidence intervals [CI]: 1.25-2.54), 1.77 (95% CI: 1.28-2.44), and 2.56 (95% CI: 1.56-4.29), respectively. Our findings suggested that CSA was an independent predictor of negative psychological consequences in adulthood. In our analysis, we controlled for household, parenting, and Confucian culture factors, which provides a better understanding of how they work together to affect adult psychological status. Copyright © 2012 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
The maternal and child healthcare needs of new immigrants in Taipei.
Chen, Mei-Ju; Tang, Chao-Hsiun; Jeng, Huey-Mei; Chiu, Allen Wen-Hsiang
2008-12-01
The primary aim of this study was to evaluate the maternal and child healthcare needs of new immigrants in Taiwan. The results will be used to reflect on the services the government currently provides, and to determine whether further investigation is required to establish whether the healthcare currently provided by public health nurses meets the needs of new immigrants. Face-to-face interviews were undertaken by public health nurses with 1,068 women from Mainland China and a further 1,068 women from other Southeast Asian countries, all randomly selected from the 12 administrative districts of Taipei. Information on the healthcare information needs of mothers and children (10 items), psychological distress variables, health status, and socio-demographic variables of both the new immigrants and their Taiwanese spouses was collected via a structured questionnaire, of which a total of 1,829 completed copies were returned. Chi-square tests were performed to examine differences in both healthcare needs and psychological distress levels amongst the different new immigrant ethnic groups. Logistic regressions were subsequently performed, with adjusted odds ratios (ORs) calculated to examine the differential healthcare needs of the different ethnic groups. The needs of the Vietnamese immigrants were found to differ significantly from those of the Mainland Chinese immigrants in all items, with the former particularly needing Chinese communication assistance when receiving medical treatment. Cultural competence in public health nursing education should not be deemphasized in Taiwan. Within the public sector, there is a clear need to create and implement partnerships between the public and private sectors on the overall issue of new immigrants within the community. The results strongly suggest that public health nurses should be aware of how to meet the
Wang, Yunn-Jinn; Chen, Chi-Feng; Lin, Jen-Yang
2013-01-01
Pollutants deposited on road surfaces and distributed in the environment are a source of nonpoint pollution. Field data are traditionally hard to collect from roads because of constant traffic. In this study, in cooperation with the traffic administration, dry deposition on, and road runoff from, urban roads were measured in Taipei City and New Taipei City, Taiwan. The results showed that the dry deposition is 2.01–5.14 g/m2·day and that 78–87% of these solids are in the 75–300 µm size range. The heavy metals in the dry-deposited particles are mainly Fe, Zn, and Na, with average concentrations of 34,978, 1,519 and 1,502 ppm, respectively. Elevated express roads show the highest heavy metal concentrations. Not only the number of vehicles but also the speed of the traffic should be considered a factor that influences road pollution, as high speeds may accelerate vehicle wear and deposit more heavy metals on road surfaces. In addition to dry deposition, the runoff and water quality were analyzed every five minutes during the first two hours of storm events to capture the properties of the first-flush road runoff. The sample mean concentrations (SMC) from three roads demonstrated that the first-flush runoff had a high pollution content, notably for suspended solids (SS), chemical oxygen demand (COD), oil and grease, Pb, and Zn. Regular sweeping and onsite water treatment facilities are suggested to minimize the pollution from urban roads. PMID:24135820
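The event-level concentration summary behind numbers like the SMC above is typically a flow-weighted mean over the sampled intervals. A small sketch with hypothetical five-minute samples (the exact averaging convention of the study is assumed, not quoted):

```python
# Flow-weighted mean concentration over equal-length sampling intervals:
# Sum(Ci * Qi) / Sum(Qi). Values below are invented to mimic a first-flush
# pattern (high suspended-solids concentration early in the storm).

def flow_weighted_mean(concentrations, flows):
    """Mean concentration weighted by flow; inputs are parallel lists."""
    assert len(concentrations) == len(flows)
    total_load = sum(c * q for c, q in zip(concentrations, flows))
    total_flow = sum(flows)
    return total_load / total_flow

ss_mg_per_l = [450.0, 300.0, 150.0, 80.0, 50.0]  # suspended solids
flow_l_per_s = [5.0, 10.0, 20.0, 20.0, 10.0]
print(round(flow_weighted_mean(ss_mg_per_l, flow_l_per_s), 1))  # → 159.2
```

Comparing this flow-weighted value against the simple arithmetic mean of the samples (206.0 mg/L here) shows how weighting by discharge shifts the summary toward the high-flow portion of the event.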
Cuesta, J.A.; Sanchez, A.
1998-01-01
This book contains the Proceedings of ''Fisica Estadistica '97'' (FisEs'97, VIII Spanish Meeting on Statistical Physics), held at the Getafe Campus (Madrid, Spain) of the Universidad Carlos III de Madrid on September 25-27, 1997. Although this is the first time the Proceedings of a Meeting in this series are published, ''Fisica Estadistica'' dates back to 1986, when about fifty Spanish scientists attended the first edition in Barcelona. That first Meeting was organized by a group of young and not-so-young physicists who wanted to set up a national conference of international level and with a broader, more interdisciplinary scope than others held at that time. Their idea quickly got off the ground and, following the first edition, sequels took place every year and a half: Palma de Mallorca (1988), Badajoz (1990), Cabueñas, Asturias (1991), El Escorial, Madrid (1993), Sevilla (1994), and Zaragoza (1996)
Lyons, L.
2016-01-01
Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.
Rebhan, E.
2005-01-01
The present second volume treats quantum mechanics, relativistic quantum mechanics, the foundations of quantum field theory and elementary particle theory, as well as thermodynamics and statistics. Together the two volumes cover all fields usually offered in a course on theoretical physics. In every field treated, a careful introduction to the basic natural laws forms the starting point, with a thorough analysis of which of them rests on empirical evidence, which can be deduced logically, and what role basic definitions play. Going beyond the usual scope of such courses, an introduction to elementary particles is developed starting from relativistic quantum theory. All problems are studied so thoroughly and extensively that each step can be reproduced individually. Much care is given to motivation and readability. The mixing of mathematical difficulties with problems of a physical nature, which often obstructs learning, is avoided by presenting the important mathematical methods in separate chapters (for instance, Hilbert spaces and Lie groups). The material is deepened and exercised by means of many examples and problems, a large part of them with solutions. Developments that are important but dispensable on a first reading are pursued in excursus. The book grew out of courses which the author has held at the Heinrich Heine University in Duesseldorf, and over many iterations was adapted to the requirements of the students. It is conceived in such a way that it remains suitable after one's studies as a reference work or for refreshing one's knowledge.
Statistical Physics of Adaptation
2016-08-23
Massachusetts Institute of Technology, Floor 6, 400 Tech Square, Cambridge, Massachusetts 02139, USA (Received 23 December 2014; revised manuscript …) … a population composed of two types of exponentially growing self-replicators … we illustrate a simple relationship between outcome-likelihood and … flux that does apply in a very broad class of driven systems, thus illustrating, more generally, what role dissipative history plays in determining the
Introduction to statistical physics
Huang, Kerson
2009-01-01
A Macroscopic View of Matter; Viewing the World at Different Scales; Thermodynamics; The Thermodynamic Limit; Thermodynamic Transformations; Classic Ideal Gas; First Law of Thermodynamics; Magnetic Systems; Heat and Entropy; The Heat Equations; Applications to Ideal Gas; Carnot Cycle; Second Law of Thermodynamics; Absolute Temperature; Temperature as Integrating Factor; Entropy; Entropy of Ideal Gas; The Limits of Thermodynamics; Using Thermodynamics; The Energy Equation; Some Measurable Coefficients; Entropy and Loss; TS Diagram; Condition for Equilibrium; Helmholtz Free Energy; Gibbs Potential; Maxwell Relations; Chemical Potential
Segou, Margarita
2016-01-01
I perform a retrospective forecast experiment in the most rapidly extending continental rift worldwide, the western Corinth Gulf (wCG, Greece), aiming to predict shallow seismicity (depth < …) using statistical (ETAS) models, four physics-based (CRS) models combining static stress change estimations and the rate-and-state laboratory law, and one hybrid model. For the latter models, I incorporate the stress changes imparted by 31 earthquakes with magnitude M ≥ 4.5 in the extended area of wCG. Special attention is given to the 3-D representation of active faults, acting as potential receiver planes for the estimation of static stress changes. I use reference seismicity between 1990 and 1995, corresponding to the learning phase of the physics-based models, and I evaluate the forecasts for six months following the 1995 M = 6.4 Aigio earthquake using log-likelihood performance metrics. For the ETAS realizations, I use seismic events with magnitude M ≥ 2.5 within daily update intervals to enhance their predictive power. For assessing the role of background seismicity, I implement a stochastic reconstruction (aka declustering) aiming to answer whether M > 4.5 earthquakes correspond to spontaneous events and to identify, if possible, different triggering characteristics between aftershock sequences and swarm-type seismicity periods. I find that: (1) ETAS models outperform CRS models in most time intervals, achieving a very low rejection ratio RN = 6 per cent when I test their efficiency to forecast the total number of events inside the study area, (2) the best rejection ratio for CRS models reaches RN = 17 per cent when I use varying target depths and receiver plane geometry, (3) 75 per cent of the 1995 Aigio aftershocks that occurred within the first month can be explained by static stress changes, (4) highly variable performance on behalf of both statistical and physical models is suggested by large confidence intervals of information gain per earthquake and (5) generic ETAS models can adequately
Webb, S. J.; Ashwal, L. D.; Cooper, G. R.
2007-12-01
Susceptibility (n=~110,000) and density (n=~2,500) measurements on core samples have been collected in a stratigraphic context from the Bellevue (BV-1) 2950 m deep borehole in the Northern Lobe of the Bushveld Complex. This drill core starts in the granitoid roof rocks, extends through the entire Upper Zone, and ends approximately in the middle of the Main Zone. These physical property measurements now provide an extensive database useful for geophysical modeling and stratigraphic studies. In an effort to quantify the periodicity of the layering we have applied various statistical and wavelet methods to analyze the susceptibility and density data. The density data have revealed a strong periodic layering with a scale of ~80 m that extends through the Main and Upper Zones. In the Main Zone the layering is unusual in that the density values increase upwards by as much as 10%. This is due to systematic variation in the modal abundance of mafic silicates and appears to be related to separate pulses during emplacement. The magnetic susceptibility data in the Upper Zone also show a strong cyclicity of similar scale. The discrete wavelet transform, using the real Haar wavelet, has been applied to help discretise the susceptibility data and to clarify the geological boundaries without blurring them, which is a common problem with multipoint moving averages. As expected, the histogram of the entire data set is non-Gaussian, with a long tail for high values. We can roughly fit a power law to the log histogram plot, indicating a probable fractal distribution of susceptibilities. However, if we window the data in the range 750-1000 m the histogram is very different. This region shows a strong peak and no power law relationship. This dramatic change in statistical properties prompted us to investigate these properties more thoroughly. To complement the wavelet analysis we have calculated various statistical measures (mean, standard deviation, skew, and
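The edge-preserving behavior of the Haar transform mentioned above can be sketched in a few lines: one level of the transform produces pairwise averages (the discretised log) and pairwise differences (detail coefficients that are zero inside homogeneous units and spike only at boundaries). The toy "susceptibility log" below is invented for illustration.

```python
import math

def haar_step(signal):
    """One level of the orthonormal Haar transform: pairwise averages
    (approximation) and pairwise differences (detail). Even length only."""
    assert len(signal) % 2 == 0
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

# Hypothetical log with a sharp unit boundary between samples 3 and 4:
log = [1.0, 1.0, 1.0, 9.0, 9.0, 9.0, 9.0, 9.0]
approx, detail = haar_step(log)
# Detail coefficients vanish inside homogeneous units and are large only
# at the boundary, so the step is localised rather than smeared out the
# way a multipoint moving average would smear it.
print(detail)
```

Repeating `haar_step` on the approximation gives the coarser scales used in cyclicity analysis, down to the ~80 m layering scale reported in the abstract.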
Efstathiou, Angeliki; Tzanis, Andreas; Vallianatos, Filippos
2014-05-01
The context of Non Extensive Statistical Physics (NESP) has recently been suggested to comprise an appropriate tool for the analysis of complex dynamic systems with scale invariance, long-range interactions, long-range memory and systems that evolve in a fractal-like space-time. This is because the active tectonic grain is thought to comprise a (self-organizing) complex system; therefore, its expression (seismicity) should be manifested in the temporal and spatial statistics of energy release rates. In addition to energy release rates expressed by the magnitude M, measures of the temporal and spatial interactions are the time (Δt) and hypocentral distance (Δd) between consecutive events. Recent work indicated that if the distributions of M, Δt and Δd are independent, so that the joint probability p(M,Δt,Δd) factorizes into the probabilities of M, Δt and Δd, i.e. p(M,Δt,Δd) = p(M)p(Δt)p(Δd), then the frequency of earthquake occurrence is multiply related, not only to magnitude as the celebrated Gutenberg-Richter law predicts, but also to interevent time and distance by means of well-defined power-laws consistent with NESP. The present work applies these concepts to investigate the self-organization and temporal/spatial dynamics of seismicity in Greece and western Turkey, for the period 1964-2011. The analysis was based on the ISC earthquake catalogue, which is homogeneous by construction with consistently determined hypocenters and magnitudes. The presentation focuses on the analysis of bivariate Frequency-Magnitude-Time distributions, while using the interevent distances as spatial constraints (or spatial filters) for studying the spatial dependence of the energy and time dynamics of the seismicity. It is demonstrated that the frequency of earthquake occurrence is multiply related to the magnitude and the interevent time by means of well-defined multi-dimensional power-laws consistent with NESP and has attributes of universality, as it holds for a broad
Kholifatus Saadah
2017-12-01
Since the emergence of application-based taxis, protest and resistance have appeared in metropolitan cities around the world. One of the main issues highlighted is digital collaborative consumption, which emerges as a consequence of globalization. As an interpretive case study, this paper aims to analyze the use of Uber as an alternative to public transportation in Taipei and Surabaya. The authors discuss the issue by comparing the reactions toward the arrival of Uber in Taipei and Surabaya. The authors apply the theory of Hegre, Gissinger, & Gleditsch (2002) about globalization and social conflict to explain social issues arising as consequences of digital collaborative consumption as the new consumption model. According to the theory, globalization creates a deprivation which makes the struggle for access to sources of capital more intense. Poverty is the main generator of radical action and violence. Analyzing the phenomenon of Uber usage and the resistance from traditional taxi businessmen in Taipei and Surabaya, the authors argue that globalization, reflected in digital collaborative consumption, could lead to social unrest for parties who cannot adapt to the changes in economic practice. As shown by many cases of rejection of app-based taxis in public places such as airports, train stations and bus stations, the traditionalists show resistance towards globalization and the economic shift of the public transportation business model.
Gao, Bin; Li, Xiaoqing; Woo, Wai Lok; Tian, Gui Yun
2018-05-01
Thermographic inspection has been widely applied to non-destructive testing and evaluation, with the capabilities of rapid, contactless, large-surface-area detection. Image segmentation is considered essential for identifying and sizing defects. To attain high-level performance, specific physics-based models that describe defect generation and enable the precise extraction of the target region are of crucial importance. In this paper, an effective genetic first-order statistical image segmentation algorithm is proposed for quantitative crack detection. The proposed method automatically extracts valuable spatial-temporal patterns via an unsupervised feature extraction algorithm and avoids a range of issues associated with human intervention in the laborious manual selection of specific thermal video frames for processing. An internal genetic functionality is built into the proposed algorithm to automatically control the segmentation threshold and render enhanced accuracy in sizing the cracks. Eddy current pulsed thermography is used as the platform to demonstrate surface crack detection. Experimental tests and comparisons have been conducted to verify the efficacy of the proposed method. In addition, a global quantitative assessment index, the F-score, has been adopted to objectively evaluate the performance of different segmentation algorithms.
The application of seismic risk-benefit analysis to land use planning in Taipei City.
Hung, Hung-Chih; Chen, Liang-Chun
2007-09-01
In the developing countries of Asia, local authorities rarely use risk analysis instruments as a decision-making support mechanism during planning and development procedures. The main purpose of this paper is to provide a methodology to enable planners to undertake such analyses. We illustrate a case study of seismic risk-benefit analysis for the city of Taipei, Taiwan, using available land use maps and surveys as well as a new tool developed by the National Science Council in Taiwan: the HAZ-Taiwan earthquake loss estimation system. We use three hypothetical earthquakes to estimate casualties and total and annualised direct economic losses, and to show their spatial distribution. We also characterise the distribution of vulnerability over the study area using cluster analysis. A risk-benefit ratio is calculated to express the levels of seismic risk attached to alternative land use plans. This paper suggests ways to perform earthquake risk evaluations and the authors intend to assist city planners to evaluate the appropriateness of their planning decisions.
Ji, Dar-Der; Huang, I-Hsiu; Lai, Chao-Chih; Wu, Fang-Tzy; Jiang, Donald Dah-Shyong; Hsu, Bing-Mu; Lin, Wei-Chen
2017-02-01
Enterotoxigenic Bacteroides fragilis (ETBF) and toxin-encoding Clostridium difficile (TXCD) are associated with gastroenteritis. Routine anaerobic blood culture for recovery of these anaerobic pathogens is not used for the detection of their toxins, especially for toxin-variant TXCD. The aim of this study was to investigate the prevalence and risk factors of the genotypes of these anaerobes in patients with acute diarrheal illnesses. The data and samples of 513 patients with gastroenteritis were collected in a Taipei emergency department from March 1, 2006 to December 31, 2009. Nonenterotoxigenic B. fragilis (NTBF) and ETBF and the toxin genotypes of TXCD were detected by molecular methods. The prevalence rates of NTBF, ETBF, and TXCD infections were 33.14%, 1.56%, and 2.34%, respectively. ETBF infections often occurred in the elderly (average age = 67.13 years) and during the cold, dry winters. TXCD infections were widely distributed in age and often occurred in the warm, wet springs and summers. The symptoms of ETBF-infected patients were significantly more severe than those of NTBF-infected patients. This study identified and analyzed the prevalence, risk factors, and clinical presentations of these anaerobic infections. Future epidemiologic and clinical studies are needed to understand the role of ETBF and TXCD in human gastroenteritis.
Seismic Observations in the Taipei Metropolitan Area Using the Downhole Network
Win-Gee Huang
2010-01-01
Underlain by soft soils, the Taipei Metropolitan Area (TMA) experienced major damage due to ground-motion amplification during the Hualien earthquake of 1986, the Chi-Chi earthquake of 1999, the Hualien earthquake of 2002, and the Taitung earthquake of 2003. To study how a local site can substantially change the characteristics of seismic waves as they pass through the soft deposits below the free surface, two complementary downhole seismic arrays have been operated in the TMA since 1991 and 2008, respectively. The accelerometer downhole array is composed of eight boreholes at depths in excess of 300 meters. The downhole velocity-sensor array, collocated with accelerometers, is composed of four boreholes at depths of up to 90 meters. The integrated seismic network monitors potential earthquakes originating from faults in and around the TMA and provides wide-dynamic-range measurements ranging in amplitude from seismic background noise to damaging levels of shaking. The data sets can be used to address the response of soft-soil deposits to ground motions. One of the major considerations is the nonlinear response of soft-soil deposits at different levels of excitation. The collocated accelerometer and velocity sensors in the boreholes provide the data necessary for studies of non-linearity. Such measurements, in anticipation of future large, damaging earthquakes, will be of special importance for the mitigation of earthquake losses.
Jou, Ming-Huey; Chen, Ping-Ling; Lee, Sheuan; Yin, Teresa J C
2003-03-01
The purpose of this study was to investigate the performance of sexuality education by elementary school nurses in Taipei and its associated factors. A structured questionnaire was utilized to collect data from a convenience sample of 145 elementary school nurses. The Kuder-Richardson reliability for the sex knowledge scale was .73, and Cronbach's α for the sex attitude scale was .93. The findings of the study were as follows: (1) Sex knowledge was high among the study sample. The average scores for sex knowledge regarding "masturbation" and "sexual harassment and sexual abuse" were among the highest; those regarding "secondary sexual characteristics", "ovulation", "menstruation health care", and "sexually transmitted diseases" were among the lowest. (2) Sex attitudes were positive. Eighty percent of the study subjects agreed that school nurses were responsible for the promotion of sexual health in schools. More than 90% of the study subjects were willing to participate actively in sexuality education programs in school, providing health consultation and guidance. (3) Twenty percent of the study subjects were not involved in sex education because they were not invited or because of busy working schedules. (4) Marital status, highest level of education, job title, job seniority, and continuing education or training experience were factors associated with the implementation of sexuality education among school nurses.
Running injuries and associated factors in participants of ING Taipei Marathon.
Chang, Wei-Ling; Shih, Yi-Fen; Chen, Wen-Yin
2012-08-01
To investigate the distribution of lower extremity running injuries and their associated factors. Descriptive and exploratory study. 1004 participants of the 2005 ING Taipei International Marathon. We used a self-developed questionnaire to collect data on previous running injuries and applied multivariate logistic regression modeling to examine relationships between these injuries and associated factors. Of the 893 valid questionnaires, 396 respondents (44.4%) reported previous lower extremity pain related to running. Knee joint pain was the most common problem (32.5%). Hip pain was associated with the racing group, training duration, and medial arch support. Use of knee orthotics (P = 0.002) and ankle braces (P = 0.007) was related to a higher rate of knee and ankle pain. Participants in the full marathon group who practiced on a synthetic track had a higher incidence of ankle pain. A training duration of >60 min was linked to an increased rate of foot pain (P = 0.003). Our data indicated that running injuries were associated with training duration and use of orthotics. Clinicians can use this information in treating or preventing running-associated injuries and pain.
Impacts of Typhoon Soudelor (2015) on the water quality of Taipei, Taiwan.
Fakour, Hoda; Lo, Shang-Lien; Lin, Tsair-Fuh
2016-04-29
Typhoon Soudelor was one of the strongest storms in the world in 2015. The category 5 hurricane made landfall in Taiwan on August 8, causing extensive damage and severe impacts on the environment. This paper describes the changes of trihalomethane (THM) concentrations in tap and drinking fountain water in selected typhoon-affected areas in Taipei before and after the typhoon. Samples were taken from water transmission mains at various distances from the local water treatment plant. The results showed that organic matter increased between pre- and post-typhoon periods with a greater proportion of aromatic compounds. Although drinking fountains showed moderately less total trihalomethane (TTHM) levels than that of tap water, the intake of high turbidity water considerably diminished the efficiency of their purification systems after the typhoon. The percentage distribution of THM species increased throughout the distribution network, probably due to a longer contact time between chlorine and the organic matter in the pipelines. After 2 to 5 min of boiling, THM reduction was considerable in all cases with the greater extent in post-typhoon samples. It is evident that extreme weather conditions may have a severe impact on water quality, and thus more cautious strategies should be adopted in such cases.
Bil, Łukasz; Grech, Dariusz; Zienowicz, Magdalena
2017-01-01
We study how the approach grounded in non-extensive statistical physics can be applied to describe and distinguish different stages of stock and money market development. Particular attention is given to the asymmetric behavior of the fat-tailed distributions of positive and negative returns. A new method to measure this asymmetry is proposed. It is based on the value of the non-extensive Tsallis parameter q. The new quantifier of the relative asymmetry level between tails, in terms of the Tsallis parameters q±, is provided to analyze the effect of memory in data caused by nonlinear autocorrelations. The presented analysis takes into account data of separate stocks from the main developing stock market in Europe, i.e., the Warsaw Stock Exchange (WSE) in Poland and, for comparison, data from the most mature money market (Forex). It is argued that the proposed new quantifier is able to describe the stage of market development and its robustness to speculation. The main emphasis is placed on the description and interpretation of the asymmetry between statistical properties of positive and negative returns for various stocks and for diversified time-lags Δt of data counting. Particular caution in this context is addressed to the difference between intraday and interday returns. Our study is extended to memory effects and their dependence on the quotation frequency for similar large companies, owners of food-industrial retail supermarkets acting on both Polish and European markets (Eurocash, Jeronimo-Martins, Carrefour, Tesco), but traded on various European stock markets of diversified economic maturity (respectively in Warsaw, Lisbon, Paris and London). The latter analysis seems to indicate quantitatively that stocks from the same economic sector traded on different markets within the European Union (EU) may be a target of diversified levels of speculation involved in trading, independently of the true economic situation of the company. Our work thus gives indications
MacKenzie, Dana
2004-01-01
The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue expanding knowledge about the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques used in medical research. However, physicists need to identify the brand of statistics that will be suitable for them, and make a choice between the Bayesian and frequentist approaches. (Edited abstract).
Cartwright-Taylor, Alexis; Vallianatos, Filippos; Sammonds, Peter
2014-05-01
We have conducted room-temperature, triaxial compression experiments on samples of Carrara marble, recording concurrently acoustic and electric current signals emitted during the deformation process as well as mechanical loading information and ultrasonic wave velocities. Our results reveal that in a dry non-piezoelectric rock under simulated crustal pressure conditions, a measurable electric current (nA) is generated within the stressed sample. The current is detected only in the region beyond (quasi-)linear elastic deformation; i.e. in the region of permanent deformation beyond the yield point of the material and in the presence of microcracking. Our results extend to shallow crustal conditions previous observations of electric current signals in quartz-free rocks undergoing uniaxial deformation and support the idea of a universal electrification mechanism related to deformation. Confining pressure conditions of our slow strain rate (10^-6 s^-1) experiments range from the purely brittle regime (10 MPa) to the semi-brittle transition (30-100 MPa) where cataclastic flow is the dominant deformation mechanism. Electric current is generated under all confining pressures, implying the existence of a current-producing mechanism during both microfracture and frictional sliding. Some differences are seen in the current evolution between these two regimes, possibly related to crack localisation. In all cases, the measured electric current exhibits episodes of strong fluctuations over short timescales; calm periods punctuated by bursts of strong activity. For the analysis, we adopt an entropy-based statistical physics approach (Tsallis, 1988), particularly suited to the study of fracture-related phenomena. We find that the probability distribution of normalised electric current fluctuations over short time intervals (0.5 s) can be well described by a q-Gaussian distribution of a form similar to that which describes turbulent flows. This approach yields different entropic
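The q-Gaussian form invoked here is the Tsallis generalisation of the Gaussian: the q-exponential of a negative quadratic, which reduces to the ordinary Gaussian as q → 1 and acquires power-law tails for q > 1. A minimal sketch of the (unnormalised) functional form, with illustrative parameter values rather than any fitted from the experiments:

```python
import math

def q_exponential(x, q):
    """Tsallis q-exponential, exp_q(x) = [1 + (1-q)x]^(1/(1-q));
    reduces to exp(x) in the limit q -> 1, and is cut off at zero where
    the bracket goes negative."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def q_gaussian(x, q, beta=1.0):
    """Unnormalised q-Gaussian, exp_q(-beta * x^2): fat-tailed for q > 1."""
    return q_exponential(-beta * x * x, q)

# For q = 2 the tail at x = 3 is ~800 times heavier than the ordinary
# Gaussian value obtained in the q -> 1 limit.
print(q_gaussian(3.0, 2.0), q_gaussian(3.0, 1.0))
```

Fitting q (and beta) to the histogram of normalised current fluctuations is what yields the entropic indices the abstract goes on to discuss; q > 1 signals the bursty, heavy-tailed behavior described above.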
Sheet, Debdoot; Karamalis, Athanasios; Kraft, Silvan; Noël, Peter B.; Vag, Tibor; Sadhu, Anup; Katouzian, Amin; Navab, Nassir; Chatterjee, Jyotirmoy; Ray, Ajoy K.
2013-03-01
Breast cancer is the most common form of cancer in women. Early diagnosis can significantly improve life expectancy and allow different treatment options. Clinicians favor 2D ultrasonography for breast tissue abnormality screening due to its high sensitivity and specificity compared to competing technologies. However, inter- and intra-observer variability in the visual assessment and reporting of lesions often handicaps its performance. Existing Computer Assisted Diagnosis (CAD) systems, though able to detect solid lesions, are often restricted in performance. These restrictions include the inability to (1) detect lesions of multiple sizes and shapes, and (2) differentiate hypo-echoic lesions from their posterior acoustic shadowing. In this work we present a completely automatic system for the detection and segmentation of breast lesions in 2D ultrasound images. We employ random forests to learn a tissue-specific primal to discriminate breast lesions from surrounding normal tissues. This enables the system to detect lesions of multiple shapes and sizes, as well as to discriminate hypo-echoic lesions from their associated posterior acoustic shadowing. The primal comprises (i) multiscale estimated ultrasonic statistical physics and (ii) scale-space characteristics. The random forest learns the lesion vs. background primal from a database of 2D ultrasound images with labeled lesions. For segmentation, the posterior probabilities of lesion pixels estimated by the learnt random forest are hard thresholded to provide a random walks segmentation stage with starting seeds. Our method achieves detection with 99.19% accuracy and segmentation with mean contour-to-contour error < 3 pixels on a set of 40 images with 49 lesions.
A method for statistical comparison of data sets and its uses in analysis of nuclear physics data
Bityukov, S.I.; Smirnova, V.V.; Krasnikov, N.V.; Maksimushkina, A.V.; Nikitenko, A.N.
2014-01-01
The authors propose a method for the statistical comparison of two data sets, based on the method of statistical comparison of histograms. As an estimator of the quality of the decision made, they propose to use the probability that the decision (that the data sets are different) is correct.
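As context for the kind of comparison described, Pearson's classical two-sample chi-square statistic for binned data is a common baseline; the sketch below illustrates that baseline, not the estimator the authors propose:

```python
import numpy as np

def hist_chi2(n1, n2):
    """Pearson two-sample chi-square statistic for two histograms with
    counts n1, n2 (possibly different totals). Returns (statistic, dof);
    bins empty in both histograms are dropped."""
    n1 = np.asarray(n1, float)
    n2 = np.asarray(n2, float)
    keep = (n1 + n2) > 0
    n1, n2 = n1[keep], n2[keep]
    N1, N2 = n1.sum(), n2.sum()
    # Scale each sample by the other's total so unequal totals are handled.
    stat = np.sum((np.sqrt(N2 / N1) * n1 - np.sqrt(N1 / N2) * n2) ** 2 / (n1 + n2))
    dof = int(keep.sum()) - 1
    return stat, dof
```

Identical histograms give a statistic of 0; the statistic is then compared against the chi-square critical value for the given degrees of freedom (e.g. about 5.99 for dof = 2 at the 5% level).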
Chiang, C-Y; Lee, J-J; Yu, M-C; Enarson, D A; Lin, T-P; Luh, K-T
2009-01-01
All individuals reported as being treated for pulmonary tuberculosis (PTB) among citizens of Taipei City, Taiwan, in 2003 were studied to investigate risk factors associated with treatment interruption for at least 2 consecutive months and with death. The outcome of PTB cases was determined by consulting medical charts. Of 1127 PTB patients registered, 824 (73.1%) were successfully treated, 189 (16.8%) died, 65 (5.8%) interrupted treatment, 17 (1.5%) were still on treatment 15 months after commencing treatment and 32 (2.8%) failed. The only significant factor associated with treatment interruption was visits to other health facilities after commencing tuberculosis (TB) treatment. TB patients had a standardised mortality ratio of 8.7 (95%CI 7.5-10.0). Factors significantly associated with death were age (adjusted hazard ratio [adjHR] 1.06, 95%CI 1.05-1.08), sputum culture not performed/unknown (adjHR 2.07, 95%CI 1.47-2.92), and comorbidity with respiratory disease (adjHR 1.68, 95%CI 1.24-2.27), infectious disease (adjHR 2.80, 95%CI 2.07-3.78), renal disease (adjHR 2.58, 95%CI 1.82-3.66) or cancer (adjHR 3.31, 95%CI 2.35-4.65), compared with other patients. Visits to other health facilities were associated with interruption of treatment for at least 2 months. A high proportion of deaths was due to old age and comorbidity.
Observation of a "holiday effect": a case of Chinese New Year in Taipei
Tan, P.-H.; Chou, C.; Chen, P.-Y.; Liang, J.-Y.
2009-04-01
Our study was an attempt to conduct a comprehensive and systematic examination of the holiday effect, defined as the difference in air pollutant concentrations between holiday and non-holiday periods. This holiday effect can be applied to other countries with similar national or cultural holidays. Hourly and daily surface measurements of six major air pollutants from thirteen air quality monitoring stations of the Taiwan Environmental Protection Administration during the Chinese New Year (CNY) and non-Chinese New Year (NCNY) periods were used. We documented evidence of a "holiday effect", where air pollutant concentrations were significantly different between holidays (CNY) and non-holidays (NCNY), in the Taipei metropolitan area over the past thirteen years (1994-2006). The concentrations of NOx, CO, NMHC, SO2 and PM10 were lower in the CNY than in the NCNY period, while the variation of O3 was reversed, mainly due to the NO titration effect. Similar differences in these six air pollutants between the CNY and NCNY periods were also found in the diurnal cycle and in the interannual variation. For the diurnal cycle, a common traffic-related double-peak variation was observed in the NCNY period, but not in the CNY period. Impacts of dust storms were also observed, especially on SO2 and PM10 in the CNY period. In the 13-year period of 1994-2006, decreasing trends of NOx and CO in the NCNY period implied a possible reduction of local emissions. Increasing trends of SO2 and PM10 in the CNY period, on the other hand, indicated a possible enhancement of long-range transport. These two mechanisms weakened the holiday effect.
Analysis of motorcycle exhaust regular testing data--a case study of Taipei City.
Chen, Yi-Chi; Chen, Lu-Yen; Jeng, Fu-Tien
2009-06-01
In Taiwan, a continuous increase in the number of motorcycles has made exhaust pollution one of the major emission sources of air pollutants. The regular testing program carried out by the Republic of China Environmental Protection Agency was designed to reduce air pollutant emissions by enhancing maintenance and repair. During the execution period, abundant testing results were accumulated, allowing pollutant emissions from motorcycles to be examined. Exhaust testing data for motorcycles in Taipei City from 1996 to 2005 were chosen as the basic data to survey changes in motorcycle exhaust. The effects of motorcycle age and mileage on exhaust pollution were studied. The introduction of advanced emission standards enhances the elimination of high-emitting motorcycles. The testing data indicate that the testing rate rose from approximately 50 to 70% and the failure rate fell from approximately 15 to 10%. The operation cycles of two-stroke motorcycles make them high-emitting vehicles. Concentrations of carbon monoxide and hydrocarbons are higher in two-stroke motorcycle exhaust than in four-stroke motorcycle exhaust. In contrast, the concentration of carbon dioxide, produced by complete oxidation, is lower in exhaust from two-stroke motorcycles. Accordingly, failure rates of two-stroke motorcycles are higher than those of four-stroke motorcycles, and their emission control systems were also observed to deactivate more easily. On the basis of the analytical results of the testing data, we found that failure rates show a gradually increasing trend for motorcycles older than 3 yr or used for mileages greater than 10,000 km, and failure rates are highly correlated with the age/mileage of motorcycles. We reason that the accumulation of age or mileage means accumulating usage time of engines and emission control systems. Concentrations of pollutant emissions would increase because of engine wear and emission control system deactivation. After discussing changes of failure rates and pollutant emissions, some suggestions are
Diurnal Variations of Airborne Pollen and Spores in Taipei City, Taiwan
Yueh-Lin Yang
2003-09-01
The diurnal variation of airborne pollen and spores in Taipei City, Taiwan, was investigated during a two-year survey from 1993 to 1994. The pollen and spores were sampled using a Burkard seven-day volumetric pollen trap. The diurnal trends of the total amount of pollen and spores in 1993 and in 1994 were similar to each other, and peaked between 3 and 10 o’clock. The diurnal patterns of airborne pollen and spores of Broussonetia, Fraxinus, Cyathea and Gramineae in 1993 were similar to those in 1994. High concentrations of Broussonetia and Fraxinus were obtained from midnight to the next morning. Cyathea spores peaked from morning till noon, and Gramineae peaked in the afternoon. The diurnal patterns of airborne pollen of Bischofia, Juniperus, Mallotus, Morus, Trema and Urticaceae in 1993 were different from those in 1994. Regular diurnal patterns were also associated with taxa that produce large pollen or spores, such as Gramineae and Cyathea. In contrast, Bischofia, Juniperus, Mallotus, Morus, Trema and Urticaceae produce relatively small pollen, and the diurnal patterns of their airborne pollen were irregular. The source plants Broussonetia and Fraxinus were close to the collection site, so the diurnal patterns of their airborne pollen were regular, suggesting that the diurnal fluctuations of pollen or spores in the air might be affected by the location of the source plants and the size of the pollen or spores. The transport of smaller pollen or spores in the air is probably more easily affected by the instability of air currents; they are therefore more likely to exhibit irregular diurnal patterns.
Estimation of Rainfall Erosivity via 1-Minute to Hourly Rainfall Data from Taipei, Taiwan
Huang, Ting-Yin; Yang, Ssu-Yao; Jan, Chyan-Deng
2017-04-01
Soil erosion is a natural process on hillslopes that threatens people's lives and property, with considerable environmental and economic implications for soil degradation, agricultural activity and water quality. The rainfall erosivity factor (R-factor) in the Universal Soil Loss Equation (USLE), composed of the total kinetic energy (E) and the maximum 30-min rainfall intensity (I30), is widely used as an indicator to measure the potential risk of soil loss caused by rainfall at a regional scale. The R-factor can represent the detachment and entrainment involved in climate conditions on hillslopes, but a lack of 30-min rainfall intensity data often makes this factor difficult to apply in many regions. In recent years, fixed-interval, hourly rainfall data have become readily available and widely used due to the development of automatic weather stations. Here we assess the estimation of R, E, and I30 based on 1-, 5-, 10-, 15-, 30-, and 60-minute rainfall data, and hourly rainfall data, obtained from the Taipei weather station from 2004 to 2010. Results show that there is a strong correlation among R-factors estimated from different-interval rainfall data. Moreover, shorter time-interval rainfall data (e.g., 1-min) yield larger values of the R-factor. The conversion factors of rainfall erosivity (the ratio of values estimated from rainfall data with resolution finer than 30 min to those estimated from 60-min and hourly rainfall data, respectively) range from 1.85 to 1.40 (resp. from 1.89 to 1.02) for 60-min (resp. hourly) rainfall data as the time resolution increases from 30-min to 1-min. This paper provides useful information for estimating the R-factor when only hourly rainfall data are available.
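A minimal sketch of how E, I30 and the storm erosivity EI30 are obtained from fixed-interval rainfall depths is given below; the unit kinetic energy equation is the common Brown and Foster (1987) form, whose coefficients may differ from those used in this study:

```python
import numpy as np

def storm_erosivity(rain_mm, minutes_per_step):
    """EI30 for one storm from fixed-interval rainfall depths (mm per step).

    Uses the Brown & Foster (1987) unit kinetic energy equation; coefficients
    vary between USLE/RUSLE versions, so treat this as an illustrative sketch.
    Returns (E, I30, EI30) with E in MJ/ha and I30 in mm/h.
    """
    rain = np.asarray(rain_mm, dtype=float)
    intensity = rain * (60.0 / minutes_per_step)              # mm/h in each step
    e_unit = 0.29 * (1.0 - 0.72 * np.exp(-0.05 * intensity))  # MJ/(ha*mm)
    E = float(np.sum(e_unit * rain))                          # total kinetic energy

    # Max 30-min intensity: best moving sum of depth over a 30-minute window.
    steps = max(1, int(round(30.0 / minutes_per_step)))
    window = np.convolve(rain, np.ones(steps), mode="valid")
    I30 = float(window.max() * 2.0)                           # mm per 30 min -> mm/h
    return E, I30, E * I30
```

Coarser input smooths intensity peaks, so I30 (and hence R) computed from hourly data is systematically smaller than from 1-min data, consistent with the conversion factors reported in the abstract.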
Schwabl, Franz
2006-01-01
The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...
Assessment of PM10 enhancement by yellow sand on the air quality of Taipei, Taiwan in 2001.
Chang, Shuenn-Chin; Lee, Chung-Te
2007-09-01
The impact of long-range transport of yellow sand from the Asian continent to the Taipei Metropolitan Area (Taipei) not only deteriorates air quality but also poses health risks, especially to children and the elderly. As such, it is important to assess the enhancement of PM(10) during yellow sand periods. In order to estimate PM(10) enhancement, we adopted factor analysis to distinguish yellow-sand (YS) periods from non-yellow-sand (NYS) periods based on air quality monitoring records. Eight YS events were identified from January 2001 to May 2001 using factor analysis coupled with an independent validation procedure that checked background site values, examined meteorological conditions, and modeled air mass trajectories. The duration of each event varied from 11 to 132 h, identified from the time when the PM(10) level was high and the CO and NOx levels were low. Subsequently, we used an artificial neural network (ANN) to simulate local PM(10) levels from related parameters, including local gas pollutants and meteorological factors, during the NYS periods. The PM(10) enhancement during the YS periods is then calculated by subtracting the simulated PM(10) from the observed PM(10) levels. Based on our calculations, the PM(10) enhancement in the maximum hour of each event ranged from 51 to 82%. Moreover, in the eight events identified in 2001, it was estimated that a total of 7,210 tons of PM(10) was transported by yellow sand to Taipei. Thus, in this study, we demonstrate that integrating factor analysis with an ANN model provides a very useful method for identifying YS periods and determining the PM(10) enhancement caused by yellow sand.
Hsiao-Lan Liu
2014-12-01
In order to achieve a sustainable urban environment, increasing green space is commonly used as a planning tool and adaptation strategy to combat environmental impacts resulting from global climate change and urbanization. It is therefore important to understand how green space areas change and what impacts derive from that change. This research first applied spatial analysis and landscape ecology metrics to analyze the structural change of the pattern of green space within the Taipei Metropolitan Area. Then, partial least squares were used to identify the consequences for microclimate and air pollution patterns caused by the changing pattern of green space areas within the districts of the Taipei Metropolitan Area. According to the analytical results, the green space area within the Taipei Metropolitan Area decreased by 1.19% from 1995 to 2007, but 93.19% of the green space areas were kept for their original purposes. Next, the landscape ecology metrics analysis shows that in suburban areas the linkages, pattern parameters, and space aggregation are all improving, and fragmentation is decreasing, although shapes are becoming more complex. However, due to intensive land development in the city core, the pattern there has become severely fragmented and decentralized, causing the measures of linkages and pattern parameters to decrease. The results from structural equation modeling indicate that the changing pattern of green space areas has a great influence on air pollution and microclimate patterns. For instance, less air pollution, smaller rainfall patterns and cooler temperatures are associated with improvements in space aggregation and increases in the size of large green space patches.
A survey of serum specific-lgE to common allergens in primary school children of Taipei City.
Wan, Kong-Sang; Yang, Winnie; Wu, Wei-Fong
2010-03-01
Environmental factors and eating habits have had a significant impact on the increased sensitization to allergens in children. This study investigated changes in common allergen sensitivities among children in Taipei City, Taiwan. A total of 142 primary schools in Taipei City, which included 25,094 students aged 7-8 years, were surveyed using an ISAAC questionnaire to screen for allergies. For positive responders, serum allergen-specific IgE was confirmed using the Pharmacia CAP system. A total of 1,500 students (5.98%) had confirmed sensitivities to allergens. Dust mite sensitivity among these children was nearly 90%. The prevalences of sensitivities to Dermatophagoides pteronyssinus, D. farinae and Blomia tropicalis were 90.79%, 88.24%, and 84.63%, respectively. Dog dander (29.95%) was the second most common aeroallergen to induce sensitivity. Allergies to cat dander (8.69%) and to cockroach (15.48%) had decreased dramatically compared with previous analyses. Among the food allergens studied, the most common allergens that induced sensitization were (in order of prevalence) crab, milk, egg white, and shrimp (88.08%, 22.45%, 24.23%, and 21.44%, respectively). Mold and pollen sensitization was identified in fewer than 2% of the schoolchildren. Dust mites remain the most common allergen to induce allergic sensitization among children in Taipei City, while cockroach and mold sensitivities have dramatically declined. Food allergens should also be considered as a trigger of respiratory allergy. Except for dust mites, American cockroach and crab, allergens commonly reported to induce sensitization in other Asian countries are not common in Taiwan.
Understanding Statistics - Cancer Statistics
Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.
García Parra, P; Anaya Rojas, M; Jiménez Bravo, B; González Oria, M O; Lisbona Muñoz, M; Gil Álvarez, J J; Cano Luis, P
2016-01-01
Only a few clinical exploratory manoeuvres are truly discriminatory and useful in shoulder disease. The aim of this study is to correlate the physical examination results of the shoulder with the true diagnosis found by arthroscopy. A retrospective case series of 150 patients with the most common surgical conditions of the shoulder was studied. Data were collected on the suspicion of each pathology, the physical examination of the patient, and the actual finding of the disease during arthroscopic surgery. The examination manoeuvres for the Bankart lesion show the best results, with a 92.1% positive predictive value (PPV) and a 99.1% negative predictive value (NPV), followed by impingement syndrome, with a PPV of 94.4%, and complete cuff rupture, with a PPV of 92.3%. Exploration of the superior labrum anterior to posterior (SLAP) lesion had an NPV of 99.1%. Physical examination is sufficient to diagnose or rule out a Bankart lesion. A positive physical examination suggests a complete rupture of the rotator cuff and requires further studies. Patients suspected of subacromial syndrome only need an NMR if the physical tests are negative. The conclusions drawn from this work can have a significant impact both on cost savings (by reducing further tests) and on saving time in certain cases in which, after appropriate physical examination, surgery may be indicated without losing time in intermediate steps.
Edward M Grant
We examined associations among longitudinal, multilevel variables and girls' physical activity to determine the important predictors of physical activity change at different adolescent ages. The Trial of Activity for Adolescent Girls 2 study (Maryland) contributed participants from 8th (2009) to 11th grade (2011) (n=561). Questionnaires were used to obtain demographic and psychosocial information (individual- and social-level variables); height, weight, and triceps skinfold to assess body composition; interviews and surveys for school-level data; and self-report for neighborhood-level variables. Moderate to vigorous physical activity minutes were assessed from accelerometers. A doubly regularized linear mixed effects model was used for the longitudinal multilevel data to identify the most important covariates for physical activity. Three fixed effects at the individual level and one random effect at the school level were chosen from an initial total of 66 variables, consisting of 47 fixed-effect and 19 random-effect variables, in addition to the time effect. Self-management strategies, perceived barriers, and social support from friends were the three selected fixed effects, and whether intramural or interscholastic programs were offered in middle school was the selected random effect. Psychosocial factors and friend support, plus a school's physical activity environment, affect adolescent girls' moderate to vigorous physical activity longitudinally.
Mortality associated with particulate concentration and Asian dust storms in Metropolitan Taipei
Wang, Yu-Chun; Lin, Yu-Kai
2015-09-01
This study evaluates mortality risks from all causes, circulatory diseases, and respiratory diseases associated with particulate matter (PM10 and PM2.5) concentrations and Asian dust storms (ADS) from 2000 to 2008 in Metropolitan Taipei. This study uses a distributed lag non-linear model with Poisson distribution to estimate the cumulative 5-day (lags 0-4) relative risks (RRs) and confidence intervals (CIs) of cause-specific mortality associated with daily PM10 and PM2.5 concentrations, as well as ADS, for the total (all ages) and elderly (≥65 years) populations, based on study periods (ADS frequently inflicted period: 2000-2004; less inflicted period: 2005-2008). Risks associated with ADS characteristics, including the inflicted season (winter and spring) and strength (the ratio of stations with a Pollutant Standard Index >100), were also evaluated. An increase in PM10 from 10 μg/m3 to 50 μg/m3 was associated with increased all-cause mortality risk, with a cumulative 5-day RR of 1.10 (95% CI: 1.04, 1.17) for the total population and 1.10 (95% CI: 1.02, 1.18) for elders. Mortality from circulatory diseases for the elderly was related to increased PM2.5 from 5 μg/m3 to 30 μg/m3, with a cumulative 5-day RR of 1.21 (95% CI: 1.02, 1.44) from 2005 to 2008. Compared with normal days, mortality from all causes and circulatory diseases for the elderly population was associated with winter ADS, with RRs of 1.05 (95% CI: 1.01, 1.08) and 1.08 (95% CI: 1.01, 1.15), respectively. Moreover, all-cause mortality was associated with shorter and less area-affected ADS, with an RR of 1.04 for the total and elderly populations from 2000 to 2004. Population health risk differed not only with PM concentration but also with ADS characteristics.
Harn, H J; Shen, K L; Ho, L I; Yu, K W; Liu, G C; Yueh, K C; Lee, J H
1997-01-01
AIMS: To determine, by strain identification of Mycobacterium tuberculosis, whether transmission has occurred between individuals or whether new strains are present. METHODS: A rapid protocol for random amplified polymorphic DNA (RAPD) analysis was developed. This protocol was applied to 64 strains of M tuberculosis that had been confirmed by culture and microbiological methods. RESULTS: There are five groups of M tuberculosis prevalent in Taipei City, Taiwan. The major types are groups I and III. Groups I and II had been prevalent until the end of the last year of the study when, according to our group analysis, they had been eradicated. However, group III was continuously present from the middle of 1995 to the middle of 1996, and group IV was present at the end of both years, which indicated that both groups were transmitted continuously. These clustered strains had demographic characteristics consistent with a finding of tuberculosis transmission. Also, 13 of the 64 strains had unique RAPD fingerprints, inferred to be due primarily to reactivation of infection. In the drug resistance analysis, the major types represented included group III and part of group IV. CONCLUSIONS: Our preliminary data imply not only that the prevalence of M tuberculosis in Taipei City is due to transmission rather than reactivation, but also that drug resistance may play a role in tuberculosis transmission. PMID:9378819
Jana, Madhusudan
2015-01-01
Statistical mechanics is self-sufficient and written in a lucid manner, keeping in mind the examination system of the universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. Non-interacting ideal Bose and Fermi gases are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, transport phenomena - thermal and electrical conductivity, the Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with a detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...
Rodríguez, Nancy
2015-03-01
The use of mathematical tools has long proved useful in gaining understanding of complex systems in physics [1]. Recently, many researchers have realized that there is an analogy between emerging phenomena in complex social systems and complex physical or biological systems [4,5,12]. This realization has particularly benefited the modeling and understanding of crime, a ubiquitous phenomenon that is far from being understood. In fact, when one is interested in the bulk behavior of patterns that emerge from small and seemingly unrelated interactions, as well as decisions that occur at the individual level, the mathematical tools that have been developed in statistical physics, game theory, network theory, dynamical systems, and partial differential equations can be useful in shedding light on the dynamics of these patterns [2-4,6,12].
Sargolzaeipor, S.; Hassanabadi, H.; Chung, W. S.
2018-04-01
In this paper, we study the T-fluctuated form of superstatistics. In this form, thermodynamic quantities such as the Helmholtz energy, the entropy and the internal energy are expressed in terms of the T-fluctuated form for a canonical ensemble. In addition, the partition functions in this formalism for 2-level and 3-level distributions are derived. We then apply T-fluctuated superstatistics to the quantum harmonic oscillator problem and calculate the thermal properties of the system for Bose-Einstein, Maxwell-Boltzmann and Fermi-Dirac statistics. The effect of the deformation parameter on these properties is examined. All the results recover the well-known ones when the deformation parameter is removed.
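As a reference point, the undeformed canonical partition function of the harmonic oscillator, which the T-fluctuated results should recover when the deformation parameter is removed, can be computed directly; this is a textbook sketch, not the paper's formalism:

```python
import numpy as np

def partition_function(energies, beta):
    """Canonical partition function Z = sum_n exp(-beta * E_n) over the
    supplied energy levels (a plain Boltzmann sum, no deformation)."""
    return float(np.sum(np.exp(-beta * np.asarray(energies, float))))

# Harmonic oscillator levels E_n = (n + 1/2), in units where hbar * omega = 1.
E = np.arange(1000) + 0.5
Z = partition_function(E, beta=1.0)
# Closed form for comparison: Z = 1 / (2 * sinh(beta / 2)).
```

Truncating the sum at 1000 levels is harmless here because the Boltzmann weights decay exponentially; the numerical sum matches the closed form to machine precision.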
Detomi, Anine Cristina; Filho, Sergio Luiz Moni Ribeiro; Panzera, Túlio H C; Schiavon, Marco Antonio; Silva, Vania R V; Scarpa, Fabrizio
2016-01-01
This work investigates the mechanical behavior of cementitious composites (mortar) when quartz inclusions are totally or partially replaced with polyethylene terephthalate (PET) particles. A full factorial design is performed to identify the effect of the water/cement ratio and the range of quartz particles size used in the replacement on the different mechanical and physical parameters (bulk density, apparent porosity, water absorption, oxygen permeability, compressive strength, and modulus ...
National transportation statistics 2011
2011-04-01
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety reco...
National Transportation Statistics 2008
2009-01-08
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record...
National Transportation Statistics 2009
2010-01-21
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...
National transportation statistics 2010
2010-01-01
National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...
Andrade, João Rodrigo; Martins, Ramon Silva; Thompson, Roney Leon; Mompean, Gilmar; da Silveira Neto, Aristeu
2018-04-01
The present paper provides an analysis of the statistical uncertainties associated with direct numerical simulation (DNS) results and experimental data for turbulent channel and pipe flows, showing a new physically based quantification of these errors, to improve the determination of the statistical deviations between DNSs and experiments. The analysis is carried out using a recently proposed criterion by Thompson et al. ["A methodology to evaluate statistical errors in DNS data of plane channel flows," Comput. Fluids 130, 1-7 (2016)] for fully turbulent plane channel flows, where the mean velocity error is estimated by considering the Reynolds stress tensor, and using the balance of the mean force equation. It also presents how the residual error evolves in time for a DNS of a plane channel flow, and the influence of the Reynolds number on its convergence rate. The root mean square of the residual error is shown in order to capture a single quantitative value of the error associated with the dimensionless averaging time. The evolution in time of the error norm is compared with the final error provided by DNS data of similar Reynolds numbers available in the literature. A direct consequence of this approach is that it was possible to compare different numerical results and experimental data, providing an improved understanding of the convergence of the statistical quantities in turbulent wall-bounded flows.
Auluck, F.C.
1982-01-01
There is a deep connection between problems in statistical mechanics and problems in the partition theory of numbers. The number-theoretic methods are elegant and also more powerful. The theory of pressure ionization leads to a mass-radius relation for white dwarf stars and planets. There is a shift in the energy levels of an atom surrounded by an electromagnetic radiation, which is dependent on the intensity of the radiation field. The Thomas-Fermi theory with exchange and correlation can be used to explain the observed properties of atoms, their periodic features and their behaviour under high pressures. A gravitating fluid sphere becomes spheroidal under the influence of magnetic field and rotation and exhibits various modes of oscillation. The theory of random fragmentation can be used to explain the mass function for stars and galaxies. (author)
Batchelor, Murray T.; Wille, Luc T.
The Table of Contents for the book is as follows: * Preface * Modelling the Immune System - An Example of the Simulation of Complex Biological Systems * Brief Overview of Quantum Computation * Quantal Information in Statistical Physics * Modeling Economic Randomness: Statistical Mechanics of Market Phenomena * Essentially Singular Solutions of Feigenbaum-Type Functional Equations * Spatiotemporal Chaotic Dynamics in Coupled Map Lattices * Approach to Equilibrium of Chaotic Systems * From Level to Level in Brain and Behavior * Linear and Entropic Transformations of the Hydrophobic Free Energy Sequence Help Characterize a Novel Brain Polyprotein: CART's Protein * Dynamical Systems Response to Pulsed High-Frequency Fields * Bose-Einstein Condensates in the Light of Nonlinear Physics * Markov Superposition Expansion for the Entropy and Correlation Functions in Two and Three Dimensions * Calculation of Wave Center Deflection and Multifractal Analysis of Directed Waves Through the Study of su(1,1) Ferromagnets * Spectral Properties and Phases in Hierarchical Master Equations * Universality of the Distribution Functions of Random Matrix Theory * The Universal Chiral Partition Function for Exclusion Statistics * Continuous Space-Time Symmetries in a Lattice Field Theory * Some Limiting Cases of the One-Dimensional N-Body Problem * Integrable Models of Correlated Electrons * On the Riemann Surface of the Three-State Chiral Potts Model * Two Exactly Soluble Lattice Models in Three Dimensions * Competition of Ferromagnetic and Antiferromagnetic Order in the Spin-1/2 XXZ Chain at Finite Temperature * Extended Vertex Operator Algebras and Monomial Bases * Parity and Charge Conjugation Symmetries and S Matrix of the XXZ Chain * An Exactly Solvable Constrained XXZ Chain * Integrable Mixed Vertex Models From the Braid-Monoid Algebra * From Yang-Baxter Equations to Dynamical Zeta Functions for Birational Transformations * Hexagonal Lattice Directed Site Animals * Direction in
Yen, Hung-Ju [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Chemistry Division]
2016-11-14
These slides cover Hung-Ju Yen's recent work in the synthesis and structural design of functional materials, which were further used for optoelectronic and energy applications, such as lithium ion battery, solar cell, LED, electrochromic, and fuel cells. This was for a job interview at National Taipei University of Technology. The following topics are detailed: current challenges for lithium-ion batteries; graphene, graphene oxide and nanographene; nanographenes with various functional groups; fine tune d-spacing through organic synthesis: varying functional group; schematic view of LIBs; nanographenes as LIB anode; rate performance (charging-discharging); electrochromic technology; electrochromic materials; advantages of triphenylamine; requirement of electrochromic materials for practical applications; low driving voltage and long cycle life; increasing the electroactive sites by multi-step synthetic procedures; synthetic route to starburst triarylamine-based polyamide; electrochromism ranging from visible to NIR region; transmissive to black electrochromism; RGB and CMY electrochromism.
S. Ye
2012-11-01
Full Text Available The flow duration curve (FDC is a classical method used to graphically represent the relationship between the frequency and magnitude of streamflow. In this sense it represents a compact signature of temporal runoff variability that can also be used to diagnose catchment rainfall-runoff responses, including similarity and differences between catchments. This paper is aimed at extracting regional patterns of the FDCs from observed daily flow data and elucidating the physical controls underlying these patterns, as a way to aid towards their regionalization and predictions in ungauged basins. The FDCs of total runoff (TFDC using multi-decadal streamflow records for 197 catchments across the continental United States are separated into the FDCs of two runoff components, i.e., fast flow (FFDC and slow flow (SFDC. In order to compactly display these regional patterns, the 3-parameter mixed gamma distribution is employed to characterize the shapes of the normalized FDCs (i.e., TFDC, FFDC and SFDC over the entire data record. This is repeated to also characterize the between-year variability of "annual" FDCs for 8 representative catchments chosen across a climate gradient. Results show that the mixed gamma distribution can adequately capture the shapes of the FDCs and their variation between catchments and also between years. Comparison between the between-catchment and between-year variability of the FDCs revealed significant space-time symmetry. Possible relationships between the parameters of the fitted mixed gamma distribution and catchment climatic and physiographic characteristics are explored in order to decipher and point to the underlying physical controls. The baseflow index (a surrogate for the collective impact of geology, soils, topography and vegetation, as well as climate is found to be the dominant control on the shapes of the normalized TFDC and SFDC, whereas the product of maximum daily precipitation and the fraction of non-rainy days
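The construction underlying the analysis above, a flow duration curve built from a daily flow record, can be sketched in a few lines of Python. This is a generic construction on hypothetical data using the common Weibull plotting position; the paper's mixed-gamma fitting step is not reproduced here.

```python
import random

def flow_duration_curve(flows):
    """Return (exceedance_probability, flow) lists for a daily flow record.

    Flows are ranked in descending order; the exceedance probability of the
    m-th largest flow uses the Weibull plotting position m / (n + 1).
    """
    q = sorted(flows, reverse=True)
    n = len(q)
    p = [m / (n + 1.0) for m in range(1, n + 1)]
    return p, q

# Hypothetical daily flows (m^3/s) drawn from a gamma distribution,
# echoing the gamma-shaped FDCs used in the study.
random.seed(42)
flows = [random.gammavariate(2.0, 5.0) for _ in range(365)]
p, q = flow_duration_curve(flows)
```

Plotting `q` against `p` gives the FDC: rarely exceeded high flows on the left, frequently exceeded low flows on the right.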
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I RIchard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research
Gershberg, R.E.
1985-01-01
Taking into account the observed power-law character of the energy spectrum of flares of UV Cet-type stars, several statistical characteristics of these stars are considered. It is shown that the mean amplitude of flares is mainly determined by the amplitude of the faintest flare that can be registered at the star under consideration, and therefore, contrary to tradition, the mean flare amplitude cannot be used as a measure of the flare activity of the star. The mean frequency of flares registered at a flare star depends statistically on the absolute magnitude of the star; contrary to widespread belief, true mean frequencies are higher for brighter stars. On the basis of the Catalogue of flare stars in the Pleiades by Haro, Chavira and Gonzalez, a luminosity function of these stars is constructed. Using this function and the revealed dependence of mean flare frequencies on stellar absolute magnitudes, a distribution of flare stars in the Pleiades over mean flare frequencies is constructed. It shows that the cluster contains flare stars with mean frequencies of photographically registered flares from 10⁻⁴ to 10⁻² hour⁻¹, or within an even narrower interval of frequencies, and that the total number of such stars in the cluster exceeds 1100
Kontosic, I; Vukelić, M; Pancić, M; Kunisek, J
1994-12-01
Physical work load was estimated in a female conveyor-belt worker in a bottling plant. Estimation was based on continuous measurement and on calculation of average heart rate values in three-minute and one-hour periods and during the total measuring period. The thermal component of the heart rate was calculated by means of the corrected effective temperature, for the one-hour periods. The average heart rate at rest was also determined. The work component of the heart rate was calculated by subtraction of the resting heart rate and the heart rate measured at 50 W, using a regression equation. The average estimated gross energy expenditure during the work was 9.6 +/- 1.3 kJ/min corresponding to the category of light industrial work. The average estimated oxygen uptake was 0.42 +/- 0.06 L/min. The average performed mechanical work was 12.2 +/- 4.2 W, i.e. the energy expenditure was 8.3 +/- 1.5%.
Ben Khemis, Ismahene; Mechi, Nesrine; Ben Lamine, Abdelmottaleb
2018-02-10
In the biosensor system, olfactory receptor sites can be activated by odorant molecules; the biological interactions are then converted into electrical signals by a signal transduction cascade that leads to the opening of ion channels, generating a current that flows into the cilia and depolarizes the membrane. The aim of this paper is to present a new investigation that allows determining the olfactory band using a monolayer adsorption model with identical sites, which may also describe the static and dynamic sensitivities through the expression of the olfactory response. Moreover, knowing the size of the receptor sites in olfactory sensory neurons provides valuable information about the relationship between molecular structure and biological activity. The determination of microreceptors and mesoreceptors is mostly carried out via physical adsorption, and the radius is calculated using the Kelvin equation. The mean values of the radius obtained from the maxima of the receptor size distribution peaks are 4 nm for ℓ-muscone and 6 nm for d-muscone. Copyright © 2018. Published by Elsevier Ltd.
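As a rough sketch of the Kelvin-equation step mentioned above, the radius at which capillary condensation occurs can be computed from the relative pressure. The surface tension and molar volume below are illustrative water-like defaults, not values taken from the paper.

```python
import math

# Kelvin equation: r = -2 * gamma * V_m / (R * T * ln(P/P0)).
# Illustrative constants (liquid water near room temperature):
GAMMA = 0.0728   # surface tension, N/m  (assumed)
V_M = 1.8e-5     # molar volume, m^3/mol (assumed)
R = 8.314        # gas constant, J/(mol K)

def kelvin_radius(p_rel, T=298.15):
    """Kelvin radius in metres for relative pressure 0 < p_rel < 1."""
    return -2.0 * GAMMA * V_M / (R * T * math.log(p_rel))

r = kelvin_radius(0.8)  # radius in metres at P/P0 = 0.8
```

Since ln(P/P0) is negative below saturation, the radius is positive and grows without bound as P/P0 approaches 1.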
Calvet, D.
2000-03-01
Systems for on-line event selection in future high energy physics experiments will use advanced distributed computing techniques and will need high speed networks. After a brief description of projects at the Large Hadron Collider, the architectures initially proposed for the Trigger and Data AcQuisition (T/DAQ) systems of the ATLAS and CMS experiments are presented and analyzed. A new architecture for the ATLAS T/DAQ is introduced. Candidate network technologies for this system are described. This thesis focuses on ATM. A variety of network structures and topologies suited to partial and full event building are investigated. The need for efficient networking is shown. Optimization techniques for high speed messaging and their implementation on ATM components are described. Small scale demonstrator systems consisting of up to 48 computers (∼1:20 of the final level 2 trigger) connected via ATM are described. Performance results are presented. Extrapolation of measurements and evaluation of needs leads to a proposal of an implementation for the main network of the ATLAS T/DAQ system. (author)
Victor Hugo Pereira Moutinho
2017-01-01
Full Text Available The study aimed to analyze the physical and mechanical properties of charcoal from eucalypt clones by principal component analysis and to demonstrate the relationships between these properties, in order to assess which charcoal property should be targeted in the process to obtain a higher-quality product. Eight clones of Eucalyptus and two of Corymbia were cut, collecting three trees per clone and five disks at different heights. The disks were transformed into test samples, totaling an average of 75 samples per clone, which were carbonized under specific conditions for analysis of apparent density, compressive strength parallel to the grain, and linear and volumetric degradation due to high temperature. Notably, the data were weighted per disk and per tree, to obtain an average closer to reality. For the correlations, multivariate principal component analysis was used. The apparent density of the charcoal acts as the focal point of the other properties studied: the higher the density, the higher the compressive strength parallel to the grain, the elastic modulus and the gravimetric yield.
Lopes, T.J.; Fossum, K.D.; Phillips, J.V.; Monical, J.E.
1995-01-01
Stormwater and streamflow in the Phoenix, Arizona, area were monitored to determine the physical, chemical, and microbial characteristics of stormwater from areas having different land uses; to describe the characteristics of streamflow in a river that receives urban stormwater; and to estimate constituent loads in stormwater from unmonitored areas in Maricopa County, Arizona. Land use affects urban stormwater chemistry mostly because the percentage of impervious area controls the suspended-solids concentrations and varies with the type of land use. Urban activities also seem to concentrate cadmium, lead, and zinc in sediments. Urban stormwater had larger concentrations of chemical oxygen demand and biological oxygen demand, oil and grease, and higher counts of fecal bacteria than streamflow and could degrade the quality of the Salt River. Most regression equations for estimating constituent loads require three explanatory variables (total rainfall, drainage area, and percentage of impervious area) and had standard errors that were from 65 to 266 percent. Localized areas that appear to contribute a large proportion of the constituent loads typically have 40 percent or more impervious area and are associated with industrial, commercial, and high-density residential land uses. The use of the mean value of the event-mean constituent concentrations measured in stormwater may be the best way of estimating constituent concentrations.
Wjihi, Sarra [Unité de Recherche de Physique Quantique, 11 ES 54, Faculté des Science de Monastir (Tunisia); Dhaou, Houcine [Laboratoire des Etudes des Systèmes Thermiques et Energétiques (LESTE), ENIM, Route de Kairouan, 5019 Monastir (Tunisia); Yahia, Manel Ben; Knani, Salah [Unité de Recherche de Physique Quantique, 11 ES 54, Faculté des Science de Monastir (Tunisia); Jemni, Abdelmajid [Laboratoire des Etudes des Systèmes Thermiques et Energétiques (LESTE), ENIM, Route de Kairouan, 5019 Monastir (Tunisia); Lamine, Abdelmottaleb Ben, E-mail: abdelmottaleb.benlamine@gmail.com [Unité de Recherche de Physique Quantique, 11 ES 54, Faculté des Science de Monastir (Tunisia)
2015-12-15
Statistical physics treatment is used to study the desorption of hydrogen on LaNi{sub 4.75}Fe{sub 0.25}, in order to obtain new physicochemical interpretations at the molecular level. Experimental desorption isotherms of hydrogen on LaNi{sub 4.75}Fe{sub 0.25} are fitted at three temperatures (293 K, 303 K and 313 K), using a monolayer desorption model. Six parameters of the model are fitted, namely the number of molecules per site n{sub α} and n{sub β}, the receptor site densities N{sub αM} and N{sub βM}, and the energetic parameters P{sub α} and P{sub β}. The behaviors of these parameters are discussed in relationship with desorption process. A dynamic study of the α and β phases in the desorption process was then carried out. Finally, the different thermodynamical potential functions are derived by statistical physics calculations from our adopted model.
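The two-phase (α/β) monolayer model referred to above is, in papers from this group, a two-term Hill-type expression in which each phase saturates at n·N_M molecules with its own half-saturation pressure. The exact functional form and all parameter values below are assumptions for illustration, not the fitted values from the study.

```python
def monolayer_two_phase(P, n_a, N_am, P_a, n_b, N_bm, P_b):
    """Sketch of a two-site (alpha/beta) Hill-type monolayer isotherm.

    Each term saturates at n * N_M; P_a and P_b are the pressures at which
    the corresponding phase is half-saturated. All inputs are hypothetical.
    """
    term_a = n_a * N_am / (1.0 + (P_a / P) ** n_a)
    term_b = n_b * N_bm / (1.0 + (P_b / P) ** n_b)
    return term_a + term_b

# Invented parameters for illustration only (not the fitted values).
params = dict(n_a=1.2, N_am=0.5, P_a=0.3, n_b=2.0, N_bm=0.8, P_b=1.5)
q_low = monolayer_two_phase(0.1, **params)    # little hydrogen retained
q_high = monolayer_two_phase(10.0, **params)  # near saturation
```

Fitting such a form to the three experimental isotherms (293 K, 303 K, 313 K) is what yields the six parameters the abstract discusses.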
P. Sphicas
There have been three physics meetings since the last CMS week: “physics days” on March 27-29, the Physics/ Trigger week on April 23-27 and the most recent physics days on May 22-24. The main purpose of the March physics days was to finalize the list of “2007 analyses”, i.e. the few topics that the physics groups will concentrate on for the rest of this calendar year. The idea is to carry out a full physics exercise, with CMSSW, for select physics channels which test key features of the physics objects, or represent potential “day 1” physics topics that need to be addressed in advance. The list of these analyses was indeed completed and presented in the plenary meetings. As always, a significant amount of time was also spent in reviewing the status of the physics objects (reconstruction) as well as their usage in the High-Level Trigger (HLT). The major event of the past three months was the first “Physics/Trigger week” in Apri...
Jean-Claude Gadmer
2012-01-01
16 February 2012 - Chinese Taipei Ambassador to Switzerland F. Hsieh in the ATLAS visitor centre, ATLAS experimental area and LHC tunnel at Point 1 with Collaboration Deputy Spokesperson A. Lankford, accompanied throughout by International Relations Adviser R. Voss.
Maximilien Brice
2012-01-01
8 October 2012 - Taipei Cultural and Economic Delegation, Geneva Office Ambassador A. Tah-Ray Yui visiting the LHC superconducting magnet test hall with International Relations Office Adviser R. Voss.
2003-01-01
The overall aim of BIOCLIM is to assess the possible long term impacts due to climate change on the safety of radioactive waste repositories in deep formations. The main aim of this deliverable is to provide time series of climatic variables at the high resolution needed by performance assessment (PA) of radioactive waste repositories, on the basis of coarse output from the CLIMBER-GREMLINS climate model. The climatological variables studied here are long-term (monthly) mean temperature and precipitation, as these are the main variables of interest for performance assessment. CLIMBER-GREMLINS is an earth-system model of intermediate complexity (EMIC), designed for long climate simulations (glacial cycles). Thus, this model has a coarse resolution (about 50 degrees in longitude) and other limitations which are sketched in this report. For the purpose of performance assessment, the climatological variables are required at scales pertinent for the knowledge of the conditions at the repository site. In this work, the final resolution is that of the best available global gridded present-day climatology, which is 1/6 degree in both longitude and latitude. To obtain climate-change information at this high resolution on the basis of the climate model outputs, a 2-step down-scaling method is designed. First, physical considerations are used to define variables which are expected to have links with climatological values; secondly a statistical model is used to find the links between these variables and the high-resolution climatology of temperature and precipitation. Thus the method is termed 'physical/statistical': it involves physically based assumptions to compute predictors from model variables and then relies on statistics to find empirical links between these predictors and the climatology. The simple connection of coarse model results to regional values cannot be done in a purely empirical way because the model does not provide enough information - it is both
Equilibrium statistical mechanics
Mayer, J E
1968-01-01
The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanics is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t
Dolores Corella
Full Text Available BACKGROUND: Fat mass and obesity (FTO) and melanocortin-4 receptor (MC4R) are relevant genes associated with obesity. This could be through food intake, but results are contradictory. Modulation by diet or other lifestyle factors is also not well understood. OBJECTIVE: To investigate whether MC4R and FTO associations with body-weight are modulated by diet and physical activity (PA), and to study their association with alcohol and food intake. METHODS: Adherence to Mediterranean diet (AdMedDiet) and physical activity (PA) were assessed by validated questionnaires in 7,052 high cardiovascular risk subjects. MC4R rs17782313 and FTO rs9939609 were determined. Independent and joint associations (aggregate genetic score) as well as statistical and biological gene-lifestyle interactions were analyzed. RESULTS: FTO rs9939609 was associated with higher body mass index (BMI), waist circumference (WC) and obesity (P<0.05 for all). A similar, but not significant, trend was found for MC4R rs17782313. Their additive effects (aggregate score) were significant and we observed a 7% per-allele increase in the odds of being obese (OR=1.07; 95%CI 1.01-1.13). We found relevant statistical interactions (P<0.05) with PA. So, in active individuals, the associations with higher BMI, WC or obesity were not detected. A biological (non-statistical) interaction between AdMedDiet and rs9939609 and the aggregate score was found. Greater AdMedDiet in individuals carrying 4 or 3 risk alleles counterbalanced their genetic predisposition, exhibiting similar BMI (P=0.502) to individuals with no risk alleles and lower AdMedDiet. They also had lower BMI (P=0.021) than their counterparts with low AdMedDiet. We did not find any consistent association with energy or macronutrients, but found a novel association between these polymorphisms and lower alcohol consumption in variant-allele carriers (B+/-SE: -0.57+/-0.16 g/d per-score-allele; P=0.001). CONCLUSION: Statistical and biological interactions with PA
D. Acosta
2010-01-01
A remarkable amount of progress has been made in Physics since the last CMS Week in June given the exponential growth in the delivered LHC luminosity. The first major milestone was the delivery of a variety of results to the ICHEP international conference held in Paris this July. For this conference, CMS prepared 15 Physics Analysis Summaries on physics objects and 22 Summaries on new and interesting physics measurements that exploited the luminosity recorded by the CMS detector. The challenge was incorporating the largest batch of luminosity that was delivered only days before the conference (300 nb-1 total). The physics covered from this initial running period spanned hadron production measurements, jet production and properties, electroweak vector boson production, and even glimpses of the top quark. Since then, the accumulated integrated luminosity has increased by a factor of more than 100, and all groups have been working tremendously hard on analysing this dataset. The September Physics Week was held ...
Liao, C.-H.; Tseng, P.-H.; Cullinane, Kevin; Lu, C.-S.
2010-01-01
This study analyzes the changes in carbon dioxide (CO₂) emissions resulting from the movement of containers from established ports through the emerging port of Taipei in Northern Taiwan. An activity-based emissions model is used to estimate the CO₂ emissions of container transport under four scenarios in which market share switches from existing ports to the emerging port. The results show that there are greater reductions in CO₂ when transhipment routes are changed from the ports of Kaohsiung, Taichung and Keelung to the emerging port of Taipei. The paper concludes that the analytical approach adopted in the paper can help decision-makers understand potential CO₂ emissions reduction strategies in the route selection of inland container transportation, and such consideration should provide a broader and more meaningful basis for the socio-economic evaluation of port investment projects.
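A minimal activity-based estimate of the kind described, for the road leg of an inland container move, multiplies distance by a fuel consumption rate and a fuel emission factor. The coefficients below are illustrative assumptions, not the study's calibrated values.

```python
# Illustrative activity-based CO2 sketch for road haulage of containers.
FUEL_RATE_L_PER_KM = 0.35   # assumed diesel use of a container truck, L/km
EF_KG_CO2_PER_L = 2.68      # commonly quoted diesel CO2 factor, kg/L

def co2_kg(distance_km, teu=1):
    """CO2 (kg) for moving `teu` containers over `distance_km` by road."""
    return distance_km * FUEL_RATE_L_PER_KM * EF_KG_CO2_PER_L * teu

# Re-routing a container to a nearer port saves the emissions difference
# between the two inland legs (hypothetical distances):
saving = co2_kg(180) - co2_kg(40)
```

Scenario analysis of the kind in the paper then sums such leg-level estimates over the traffic volumes shifted between ports.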
Lin, Cheng-Horng
2016-12-23
There are more than 7 million people living near the Tatun volcano group in northern Taiwan. For the safety of the Taipei metropolis in particular, it has been debated for decades whether or not these volcanoes are active. Here I show evidence of a deep magma reservoir beneath the Taipei metropolis from both S-wave shadows and P-wave delays. The reservoir is probably composed of either a thin overlying magma layer or many molten sills within thick partially molten rocks. Assuming that 40% of the reservoir is partially molten, its total volume could be approximately 350 km³. The exact location and geometry of the magma reservoir will be obtained after dense seismic arrays are deployed in 2017-2020.
Lim, Gyeong Hui
2008-03-01
This book consists of 15 chapters, which are basic conception and meaning of statistical thermodynamics, Maxwell-Boltzmann's statistics, ensemble, thermodynamics function and fluctuation, statistical dynamics with independent particle system, ideal molecular system, chemical equilibrium and chemical reaction rate in ideal gas mixture, classical statistical thermodynamics, ideal lattice model, lattice statistics and nonideal lattice model, imperfect gas theory on liquid, theory on solution, statistical thermodynamics of interface, statistical thermodynamics of a high molecule system and quantum statistics
J. Incandela
There have been numerous developments in the physics area since the September CMS week. The biggest single event was the Physics/Trigger week at the end of October, whereas in terms of ongoing activities the “2007 analyses” went into high gear. This was in parallel with participation in CSA07 by the physics groups. On the organizational side, the new conveners of the physics groups have been selected, and a new database for managing physics analyses has been deployed. Physics/Trigger week The second Physics-Trigger week of 2007 took place during the week of October 22-26. The first half of the week was dedicated to working group meetings. The plenary Joint Physics-Trigger meeting took place on Wednesday afternoon and focused on the activities of the new Trigger Studies Group (TSG) and trigger monitoring. Both the Physics and Trigger organizations are now focused on readiness for early data-taking. Thus, early trigger tables and preparations for calibr...
P. Sphicas
The CPT project came to an end in December 2006 and its original scope is now shared among three new areas, namely Computing, Offline and Physics. In the physics area the basic change with respect to the previous system (where the PRS groups were charged with detector and physics object reconstruction and physics analysis) was the split of the detector PRS groups (the old ECAL-egamma, HCAL-jetMET, Tracker-btau and Muons) into two groups each: a Detector Performance Group (DPG) and a Physics Object Group. The DPGs are now led by the Commissioning and Run Coordinator deputy (Darin Acosta) and will appear in the corresponding column in CMS bulletins. On the physics side, the physics object groups are charged with the reconstruction of physics objects, the tuning of the simulation (in collaboration with the DPGs) to reproduce the data, the provision of code for the High-Level Trigger, the optimization of the algorithms involved for the different physics analyses (in collaboration with the analysis gr...
Yu, Hwa-Lung; Chien, Lung-Chang; Yang, Chiang-Hsing
2012-01-01
Concerns have been raised about the adverse impact of Asian dust storms (ADS) on human health; however, few studies have examined the effect of these events on children's health. Using databases from the Taiwan National Health Insurance and Taiwan Environmental Protection Agency, this study investigates the documented daily visits of children to respiratory clinics during and after ADS that occurred from 1997 to 2007 among 12 districts across Taipei City by applying a Bayesian structural addi...
Rastaetter, L.; Kuznetsova, M.; Hesse, M.; Pulkkinen, A.; Glocer, A.; Yu, Y.; Meng, X.; Raeder, J.; Wiltberger, M.; Welling, D.;
2011-01-01
In this paper the metrics-based results of the Dst part of the 2008-2009 GEM Metrics Challenge are reported. The Metrics Challenge asked modelers to submit results for 4 geomagnetic storm events and 5 different types of observations that can be modeled by statistical, climatological or physics-based (e.g. MHD) models of the magnetosphere-ionosphere system. We present the results of over 25 model settings that were run at the Community Coordinated Modeling Center (CCMC) and at the institutions of various modelers for these events. To measure the performance of each of the models against the observations we use comparisons of one-hour averaged model data with the Dst index issued by the World Data Center for Geomagnetism, Kyoto, Japan, and direct comparison of one-minute model data with the one-minute Dst index calculated by the United States Geological Survey (USGS).
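Model-observation comparisons like the Dst comparison above are commonly scored with a root-mean-square error and a prediction-efficiency skill score; the sketch below uses hypothetical hourly Dst values, and the challenge's exact metric definitions may differ.

```python
import math

def rmse(obs, mod):
    """Root-mean-square error between observed and modeled series."""
    return math.sqrt(sum((o - m) ** 2 for o, m in zip(obs, mod)) / len(obs))

def prediction_efficiency(obs, mod):
    """Skill score 1 - MSE/Var(obs): 1 is perfect, 0 is no better than
    predicting the observed mean, negative is worse than the mean."""
    mean_o = sum(obs) / len(obs)
    mse = sum((o - m) ** 2 for o, m in zip(obs, mod)) / len(obs)
    var = sum((o - mean_o) ** 2 for o in obs) / len(obs)
    return 1.0 - mse / var

# Hypothetical hourly Dst values (nT) during a small storm:
obs = [-10, -40, -80, -60, -30, -15]
mod = [-12, -35, -70, -65, -35, -10]
```

Ranking many model settings then reduces to comparing these scalar scores per event and per observation type.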
Growth Normal Faulting at the Western Edge of the Metropolitan Taipei Basin since the Last Glacial Maximum, Northern Taiwan
Chih-Tung Chen
2010-01-01
Full Text Available Growth strata analysis is a useful tool for understanding the kinematics and evolution of active faults as well as the close relationship between sedimentation and tectonics. Here we present a case study of the Shanchiao Fault, an active normal fault responsible for the formation of the 700-m-thick late Quaternary deposits of the Taipei Basin at the northern tip of the Taiwan mountain belt. We compiled a sedimentary record, particularly the depositional facies and their dated ages, at three boreholes (SCF-1, SCF-2 and WK-1, from west to east) along the Wuku Profile that traverses the Shanchiao Fault at its central segment. By incorporating the global sea level change curve, we find that changes in sediment thickness and depositional environment in the Wuku area are in good agreement with the rapid sea level rise since the Last Glacial Maximum (LGM) of about 23 ka. Combining the depositional facies changes and their ages with their thickness, we are able to introduce a simple back-stripping method to reconstruct the evolution of the growth strata across the Shanchiao Fault since the LGM. We then estimate the vertical tectonic slip rate since 23 ka, which is 2.2 mm yr-1 between SCF-2 and WK-1 and 1.1 mm yr-1 between SCF-1 and SCF-2. We also obtain Holocene tectonic subsidence rates since 8.4 ka of 2.3 mm yr-1 at WK-1 and 0.9 mm yr-1 at SCF-2. We thus conclude that the fault zone consists of a high-angle main fault to the east, between SCF-2 and WK-1, and a western lower-angle branch fault between SCF-1 and SCF-2, resembling a tulip structure developed under sinistral transtensional tectonism. We find that a short period of 600 yr at 9-8.4 ka shows important tectonic subsidence of 7.4 and 3.3 m for the main and branch faults, respectively, consistent with possible earthquake events proposed by previous studies during that time. A correlation between geomorphology and subsurface geology in the Shanchiao Fault zone shows
Yi-Ling Huang
2010-01-01
Full Text Available According to reconstructed ground motion snapshots of the northern Taiwan area during the MW 7.0 eastern Taiwan offshore earthquake of 31 March 2002, composite effects indicated complicated wave propagation behavior in the ground motion of the Taipei basin. A major low frequency pulse with a duration of about 20 seconds arose after the S-wave; it was observed in northern Taiwan and dominated the radial direction. The observed waveforms of this low frequency pulse show amplification as the seismic wave crossed the Taipei basin from its eastern edge to its western portion. This effect has been considered to be generated by an unusual source radiation, a deep Moho reflection or the basin bottom surface. In this study, recorded ground motions from a dense seismic network were analyzed using frequency-wavenumber spectrum analysis for seismic wave propagation properties. We investigated temporal and spatial variations in strong shaking in different frequency bands. Results show that a simple incident pulse strongly interacts with the soft sediments inside the basin and the surrounding topography of the Taipei basin, which in turn extends its shaking duration. Evidence showed that seismic waves are reflected back from the western boundary of the basin with a dominant frequency near one Hz. The findings of this study have rarely been reported and may provide useful information to further constrain a three-dimensional numerical simulation of the basin response and velocity structure, and to predict ground motions in future large earthquakes.
Submitted by
Physics Week: plenary meeting on physics groups plans for startup (14–15 May 2008) The Physics Objects (POG) and Physics Analysis (PAG) Groups presented their latest developments at the plenary meeting during the Physics Week. In the presentations particular attention was given to startup plans and readiness for data-taking. Many results based on the recent cosmic run were shown. A special Workshop on SUSY, described in a separate section, took place the day before the plenary. At the meeting, we had also two special DPG presentations on “Tracker and Muon alignment with CRAFT” (Ernesto Migliore) and “Calorimeter studies with CRAFT” (Chiara Rovelli). We had also a report from Offline (Andrea Rizzi) and Computing (Markus Klute) on the San Diego Workshop, described elsewhere in this bulletin. Tracking group (Boris Mangano). The level of sophistication of the tracking software increased significantly over the last few months: V0 (K0 and Λ) reconstr...
Marzouk, Youssef [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)
2016-08-31
Predictive simulation of complex physical systems increasingly rests on the interplay of experimental observations with computational models. Key inputs, parameters, or structural aspects of models may be incomplete or unknown, and must be developed from indirect and limited observations. At the same time, quantified uncertainties are needed to qualify computational predictions in support of design and decision-making. In this context, Bayesian statistics provides a foundation for inference from noisy and limited data, but at prohibitive computational expense. This project intends to make rigorous predictive modeling *feasible* in complex physical systems, via accelerated and scalable tools for uncertainty quantification, Bayesian inference, and experimental design. Specific objectives are as follows: 1. Develop adaptive posterior approximations and dimensionality reduction approaches for Bayesian inference in high-dimensional nonlinear systems. 2. Extend accelerated Bayesian methodologies to large-scale *sequential* data assimilation, fully treating nonlinear models and non-Gaussian state and parameter distributions. 3. Devise efficient surrogate-based methods for Bayesian model selection and the learning of model structure. 4. Develop scalable simulation/optimization approaches to nonlinear Bayesian experimental design, for both parameter inference and model selection. 5. Demonstrate these inferential tools on chemical kinetic models in reacting flow, constructing and refining thermochemical and electrochemical models from limited data; demonstrate Bayesian filtering on canonical stochastic PDEs and in the dynamic estimation of inhomogeneous subsurface properties and flow fields.
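As a toy illustration of the Bayesian inference machinery such projects accelerate (this is not the project's actual method, which targets high-dimensional systems with surrogates), a random-walk Metropolis sampler for a one-parameter posterior can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy problem: infer the mean of Gaussian data (sigma = 0.5 known),
# standing in for the expensive forward models targeted by the project.
true_theta = 2.0
data = true_theta + 0.5 * rng.standard_normal(50)

def log_posterior(theta):
    # Gaussian likelihood plus a broad Gaussian prior (sd = 10).
    return (-0.5 * np.sum((data - theta) ** 2) / 0.5**2
            - 0.5 * theta**2 / 10.0**2)

def metropolis(n_steps=5000, step=0.2, theta0=0.0):
    """Random-walk Metropolis: accept a proposal with prob min(1, pi'/pi)."""
    samples = np.empty(n_steps)
    theta, lp = theta0, log_posterior(theta0)
    for i in range(n_steps):
        prop = theta + step * rng.standard_normal()
        lp_prop = log_posterior(prop)
        if np.log(rng.random()) < lp_prop - lp:   # accept/reject step
            theta, lp = prop, lp_prop
        samples[i] = theta
    return samples

samples = metropolis()
post_mean = samples[1000:].mean()   # discard burn-in
```

Each posterior evaluation here is cheap; when it requires a PDE solve, the surrogate and dimension-reduction methods listed in the objectives become essential.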
Statistical physics of language dynamics
Loreto, Vittorio; Baronchelli, Andrea; Mukherjee, Animesh; Puglisi, Andrea; Tria, Francesca
2011-04-01
Language dynamics is a rapidly growing field that focuses on all processes related to the emergence, evolution, change and extinction of languages. Recently, the study of self-organization and evolution of language and meaning has led to the idea that a community of language users can be seen as a complex dynamical system, which collectively solves the problem of developing a shared communication framework through the back-and-forth signaling between individuals. We shall review some of the progress made in the past few years and highlight potential future directions of research in this area. In particular, the emergence of a common lexicon and of a shared set of linguistic categories will be discussed, as examples corresponding to the early stages of a language. The extent to which synthetic modeling is nowadays contributing to the ongoing debate in cognitive science will be pointed out. In addition, the burst of growth of the web is providing new experimental frameworks. It makes available a huge amount of resources, both as novel tools and data to be analyzed, allowing quantitative and large-scale analysis of the processes underlying the emergence of a collective information and language dynamics.
The statistical stability phenomenon
Gorban, Igor I
2017-01-01
This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...
Lai, Li-Wei
2012-01-01
This study focuses on the relationship between photochemical smog (PS) (hourly ozone conc. >100 ppb), PS-related diseases, and the synoptic weather patterns during 2005-2009 in metropolitan Taipei. The results show that compared to respiratory ailments (ICD code 460-519) and asthma (ICD code 493), cardiovascular ailments (ICD code 390-459) were more clearly influenced by PS events. On the PS event day, the number of admissions of babies, children, and adults increased by 0.04 [95% CI 0.01-0.06], 0.03 [95% CI 0.00-0.07], and 1.12 [95% CI 0.36-1.89] (P < 0.05), respectively. The investigation showed that more than 70% of the PS events were associated primarily with the peripheral circulation of typhoons, Pacific pressure, and discrete Pacific pressure. PS events are a threat to public health. To avoid the ill effects of air pollution, residents should be informed about the daily weather patterns and air quality.
Hsieh, Chun-Ming; Aramaki, Toshiya; Hanaki, Keisuke [The University of Tokyo, Bunkyo-ku, Tokyo (Japan). Department of Urban Engineering
2007-09-15
This research focuses on the analysis and mitigation of the anthropogenic heat discharged from buildings, one of the main causes of the urban heat island effect. Residential and commercial buildings, divided into 10 categories, with HVAC systems were analyzed with the building energy program EnergyPlus. With the help of GIS, the heat rejection of all residential and commercial buildings in DaAn Ward of Taipei City was evaluated, with the spatial data and diurnal variation of the heat rejection described in 3-h time periods. Furthermore, the effect of mitigation strategies was discussed. The first strategy was to change the wall/roof material of the building envelope. The second and third strategies, from the viewpoint of energy saving, were to change the temperature setting of air conditioners and to turn off lighting and equipment when not in use. The fourth strategy was to use more efficient cooling systems. Finally, the evaluation of installing a water-cooled cooling system, which discharges heat in the form of sensible and latent heat, was also included. (author)
Hwa-Lung Yu
Full Text Available Concerns have been raised about the adverse impact of Asian dust storms (ADS) on human health; however, few studies have examined the effect of these events on children's health. Using databases from the Taiwan National Health Insurance and Taiwan Environmental Protection Agency, this study investigates the documented daily visits of children to respiratory clinics during and after ADS that occurred from 1997 to 2007 among 12 districts across Taipei City, applying a Bayesian structural additive regressive model controlled for spatial and temporal patterns. This study finds that children's respiratory clinic visits were significantly elevated after ADS. Five of the seven lagged days showed increased relative rates, which were consecutively elevated from a 2-day to a 5-day lag by 0.63%–2.19% for preschool children (0–6 years of age) and 0.72%–3.17% for school children (7–14 years of age). The spatial pattern of clinic visits indicated that geographical heterogeneity was possibly associated with clinic location and accessibility. Moreover, day-of-week effects were elevated on Monday, Friday, and Saturday. We conclude that ADS may significantly increase the risk of respiratory diseases throughout the week after exposure, especially in school children.
Wei-I Lee
2016-12-01
Full Text Available The New Taipei City Government developed a Code-checking System (CCS) using Building Information Modeling (BIM) technology to facilitate architectural design review in 2014. This system was intended to solve problems caused by cognitive gaps between designer and reviewer in the design review process. Besides the information technology involved, the most important issue in the system's development has been the logicalization of literal building codes. Therefore, to enhance the reliability and performance of the CCS, this study uses the Fuzzy Delphi Method (FDM), on the basis of design thinking and communication theory, to investigate the semantic differences and cognitive gaps among participants in the design review process and to propose a direction for system development. Our empirical results lead us to recommend grouped multi-stage screening and weighted, assisted logicalization of non-quantitative building codes to improve the operability of the CCS. Furthermore, the CCS should integrate an Expert Evaluation System (EES) to evaluate the design value under qualitative building codes.
D. Futyan
A lot has transpired on the “Physics” front since the last CMS Bulletin. The summer was filled with preparations of new Monte Carlo samples based on CMSSW_3, the finalization of all the 10 TeV physics analyses [in total 50 analyses were approved] and the preparations for the Physics Week in Bologna. A couple weeks later, the “October Exercise” commenced and ran through an intense two-week period. The Physics Days in October were packed with a number of topics that are relevant to data taking, in a number of “mini-workshops”: the luminosity measurement, the determination of the beam spot and the measurement of the missing transverse energy (MET) were the three main topics. Physics Week in Bologna The second physics week in 2009 took place in Bologna, Italy, on the week of Sep 7-11. The aim of the week was to review and establish how ready we are to do physics with the early collisions at the LHC. The agenda of the week was thus pac...
J. Incandela
The all-plenary format of the CMS week in Cyprus gave the conveners of the physics groups the opportunity to present the plans of each physics analysis group for tackling early physics analyses. The presentations were complete, so all are encouraged to browse through them on the Web: there is a wealth of information on what is going on, by whom, and on what basis and priority. The CMS week was followed by two CMS “physics events”: the ICHEP08 days and the physics days in July. These were two weeks dedicated either to the approval of all the results that would be presented at ICHEP08, or to the review of all the other Monte-Carlo-based analyses carried out in the context of our preparations for analysis with the early LHC data (the so-called “2008 analyses”). All this was planned in the context of the beginning of a ramp-down of these Monte Carlo efforts, in anticipation of data. The ICHEP days are described below (agenda and talks at: http://indic...
Joe Incandela
There have been two plenary physics meetings since the December CMS week. The year started with two workshops, one on the measurements of the Standard Model necessary for “discovery physics” as well as one on the Physics Analysis Toolkit (PAT). Meanwhile the tail of the “2007 analyses” is going through the last steps of approval. It is expected that by the end of January all analyses will have converted to using the data from CSA07 – which include the effects of miscalibration and misalignment. January Physics Days The first Physics Days of 2008 took place on January 22-24. The first two days were devoted to comprehensive reports from the Detector Performance Groups (DPG) and Physics Objects Groups (POG) on their planning and readiness for early data-taking followed by approvals of several recent studies. Highlights of POG presentations are included below while the activities of the DPGs are covered elsewhere in this bulletin. January 24th was devo...
Cullen, Katherine
2005-01-01
Defined as the scientific study of matter and energy, physics explains how all matter behaves. Separated into modern and classical physics, the study attracts both experimental and theoretical physicists. From the discovery of the process of nuclear fission to an explanation of the nature of light, from the theory of special relativity to advancements made in particle physics, this volume profiles 10 pioneers who overcame tremendous odds to make significant breakthroughs in this heavily studied branch of science. Each chapter contains relevant information on the scientist's childhood, research, discoveries, and lasting contributions to the field and concludes with a chronology and a list of print and Internet references specific to that individual.
Sjögren, M; Li, H; Banner, C; Rafter, J; Westerholm, R; Rannug, U
1996-01-01
The emission of diesel exhaust particulates is associated with potentially severe biological effects, e.g., cancer. The aim of the present study was to apply multivariate statistical methods to identify factors that affect the biological potency of these exhausts. Ten diesel fuels were analyzed for physical and chemical characteristics. Particulate exhaust emissions were sampled after combustion of these fuels on two makes of heavy-duty diesel engines. Particle extracts were chemically analyzed and tested for mutagenicity in the Ames test. The potency of the extracts to competitively inhibit the binding of 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) to the Ah receptor was also assessed. Relationships between fuel characteristics and biological effects of the extracts were studied using partial least squares (PLS) regression. The most influential chemical fuel parameters included the contents of sulfur, certain polycyclic aromatic compounds (PAC), and naphthenes. Density and flash point were positively correlated with genotoxic potency. Cetane number and upper distillation curve points were negatively correlated with both mutagenicity and Ah receptor affinity. Between 61% and 70% of the biological response data could be explained by the measured chemical and physical factors of the fuels. By PLS modeling of extract data versus the biological response data, 66% of the genotoxicity could be explained by 41% of the chemical variation. The most important variables, associated with both mutagenicity and Ah receptor affinity, included 1-nitropyrene, particle-bound nitrate, indeno[1,2,3-cd]pyrene, and emitted mass of particles. S9-requiring mutagenicity was highly correlated with certain PAC, whereas S9-independent mutagenicity was better correlated with nitrates and 1-nitropyrene. The emission of sulfates also showed a correlation both with the emission of particles and with the biological effects. The results indicate that fuels with biologically less hazardous
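PLS regression, as used above, extracts latent components that maximize the covariance between the predictor block (fuel descriptors) and the response (biological potency). A bare-bones NIPALS PLS1 sketch on synthetic stand-in data (the data and dimensions here are invented, not the study's measurements):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in: 30 fuel samples x 8 chemical/physical descriptors,
# and one biological response driven by the first two descriptors plus noise.
X = rng.standard_normal((30, 8))
y = 2.0 * X[:, 0] - X[:, 1] + 0.1 * rng.standard_normal(30)

def pls1(X, y, n_comp=2):
    """NIPALS PLS1: components maximizing predictor-response covariance."""
    xm, ym = X.mean(axis=0), y.mean()
    Xr, yr = X - xm, y - ym
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xr.T @ yr                 # weight vector from X-y covariance
        w /= np.linalg.norm(w)
        t = Xr @ w                    # scores
        tt = t @ t
        p = Xr.T @ t / tt             # X loadings
        qa = (yr @ t) / tt            # y loading
        Xr = Xr - np.outer(t, p)      # deflate predictors
        yr = yr - qa * t              # deflate response
        W.append(w); P.append(p); q.append(qa)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)   # coefficients in original X space
    return B, xm, ym

B, xm, ym = pls1(X, y)
y_hat = (X - xm) @ B + ym
r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
```

With two components, the fit explains most of the response variance, mirroring how the study summarized "X% of the biological response explained by Y% of the chemical variation".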
Guenther Dissertori
The time period between the last CMS week and this June was one of intense activity, with numerous get-togethers targeted at addressing specific issues on the road to data-taking. The two series of workshops, namely the “En route to discoveries” series and the “Vertical Integration” meetings, continued. The first meeting of the “En route to discoveries” sequence (end 2007) had covered the measurements of the Standard Model signals as a necessary prerequisite to any claim of signals beyond the Standard Model. The second meeting took place during the February CMS week and concentrated on the commissioning of the Physics Objects, whereas the third occurred during the April Physics Week – and this time the theme was the strategy for key new physics signatures. Both of these workshops are summarized below. The vertical integration meetings also continued, with two DPG-physics get-togethers on jets and missing ET and on electrons and photons. ...
Chris Hill
2012-01-01
The months that have passed since the last CMS Bulletin have been a very busy and exciting time for CMS physics. We have gone from observing the very first 8 TeV collisions produced by the LHC to collecting a dataset of collisions that already exceeds that recorded in all of 2011. All in just a few months! Meanwhile, the analysis of the 2011 dataset and publication of the subsequent results has continued. These results come from all the PAGs in CMS, including: searches for the Higgs boson and other new phenomena, which have set the most stringent limits on an ever-increasing number of models of physics beyond the Standard Model, including dark matter, Supersymmetry, and TeV-scale gravity scenarios; top-quark physics, where CMS has overtaken the Tevatron in the precision of some measurements; and bottom-quark physics, where CMS made its first discovery of a new particle, the Ξ*0b baryon (candidate event pictured below). Image 2: A Ξ*0b candidate event. At the same time POGs and PAGs...
D. Acosta
2011-01-01
Since the last CMS Week, all physics groups have been extremely active on analyses based on the full 2010 dataset, with most aiming for a preliminary measurement in time for the winter conferences. Nearly 50 analyses were approved in a “marathon” of approval meetings during the first two weeks of March, and the total number of approved analyses reached 90. The diversity of topics is very broad, including precision QCD, Top, and electroweak measurements, the first observation of single Top production at the LHC, the first limits on Higgs production at the LHC including the di-tau final state, and comprehensive searches for new physics in a wide range of topologies (so far all with null results unfortunately). Most of the results are based on the full 2010 pp data sample, which corresponds to 36 pb-1 at √s = 7 TeV. This report can only give a few of the highlights of a very rich physics program, which is listed below by physics group...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
van Lier-Walqui, M.; Morrison, H.; Kumjian, M. R.; Prat, O. P.
2016-12-01
Microphysical parameterization schemes have reached an impressive level of sophistication: numerous prognostic hydrometeor categories, and either size-resolved (bin) particle size distributions or multiple prognostic moments of the size distribution. Yet uncertainty in model representation of microphysical processes, and in the effects of microphysics on numerical simulation of weather, has not shown an improvement commensurate with the advanced sophistication of these schemes. We posit that this may be caused by unconstrained assumptions of these schemes, such as ad hoc parameter value choices and structural uncertainties (e.g. the choice of a particular form for the size distribution). We present work on the development and observational constraint of a novel microphysical parameterization approach, the Bayesian Observationally-constrained Statistical-physical Scheme (BOSS), which seeks to address these sources of uncertainty. Our framework avoids unnecessary a priori assumptions and instead relies on observations to provide probabilistic constraint of the scheme structure and of sensitivities to environmental and microphysical conditions. We harness the rich microphysical information content of polarimetric radar observations to develop and constrain BOSS within a Bayesian inference framework using a Markov chain Monte Carlo sampler (see Kumjian et al., this meeting, for details on the development of an associated polarimetric forward operator). Our work shows how knowledge of microphysical processes is provided by polarimetric radar observations of diverse weather conditions, and which processes remain highly uncertain even after considering observations.
Lotfi Sellaoui
Full Text Available Two equilibrium models based on statistical physics, i.e., a monolayer model with single energy and a multilayer model with saturation, were developed and employed to assess the steric and energetic aspects of the adsorption of reactive violet 5 dye (RV-5) on cocoa shell activated carbon (AC) and commercial activated carbon (CAC) at different temperatures (from 298 to 323 K). The results showed that the multilayer model with saturation was able to represent the adsorption system. This model assumes that the adsorption occurs by the formation of a certain number of layers. The n values ranged from 1.10 to 2.98, indicating that the adsorbate molecules interacted in an inclined position on the adsorbent surface and aggregated in solution. The study of the total number of formed layers (1 + L2) showed that steric hindrance is the dominant factor. The description of the adsorbate-adsorbent interactions by calculation of the adsorption energy indicated that the process was physisorption in nature, since the values were lower than 40 kJ/mol. Keywords: RV-5 dye, Activated carbon, Modeling, Aggregation
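Statistical-physics isotherm models like these are fit to measured equilibrium data to extract parameters such as n. A minimal sketch using a common Hill/Langmuir-type monolayer form (the paper's exact formulation may differ; all parameter values here are invented):

```python
import numpy as np

# Hypothetical isotherm data generated from a monolayer-type model:
#   Q(c) = Q_sat * (c/c_half)**n / (1 + (c/c_half)**n)
n_true, c_half_true, Q_sat = 2.0, 50.0, 120.0     # made-up parameters
c = np.linspace(5, 400, 25)                       # equilibrium concentrations (mg/L)
Q = Q_sat * (c / c_half_true) ** n_true / (1 + (c / c_half_true) ** n_true)

# With Q_sat known, the model log-linearizes:
#   log(Q / (Q_sat - Q)) = n*log(c) - n*log(c_half)
# so n and c_half follow from an ordinary linear least-squares fit.
lhs = np.log(Q / (Q_sat - Q))
A = np.vstack([np.log(c), np.ones_like(c)]).T
(slope, intercept), *_ = np.linalg.lstsq(A, lhs, rcond=None)
n_est = slope
c_half_est = np.exp(-intercept / slope)
```

On real data, Q_sat is unknown and all three parameters would be fit together by nonlinear least squares; the log-linear form is just the transparent special case.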
Darin Acosta
2010-01-01
The collisions last year at 900 GeV and 2.36 TeV provided the long-anticipated collider data to the CMS physics groups. Quite a lot has been accomplished in a very short time. Although the delivered luminosity was small, CMS was able to publish its first physics paper (with several more in preparation) and commence the commissioning of physics objects for future analyses. Many new performance results have been approved in advance of this CMS Week. One remarkable outcome has been the amazing agreement between out-of-the-box data and simulation at these low energies, so early in the commissioning of the experiment. All of this is testament to the hard work and preparation conducted beforehand by many people in CMS. These analyses could not have happened without the dedicated work of the full collaboration on building and commissioning the detector, computing, and software systems, combined with the tireless work of many to collect, calibrate and understand the data and our detector. To facilitate the efficien...
D. Acosta
2010-01-01
The Physics Groups are actively engaged on analyses of the first data from the LHC at 7 TeV, targeting many results for the ICHEP conference taking place in Paris this summer. The first large batch of physics approvals is scheduled for this CMS Week, to be followed by four more weeks of approvals and analysis updates leading to the start of the conference in July. Several high priority analysis areas were organized into task forces to ensure sufficient coverage from the relevant detector, object, and analysis groups in the preparation of these analyses. Already some results on charged particle correlations and multiplicities in 7 TeV minimum bias collisions have been approved. Only one small detail remains before ICHEP: further integrated luminosity delivered by the LHC! Beyond the Standard Model measurements that can be done with these data, the focus changes to the search for new physics at the TeV scale and for the Higgs boson in the period after ICHEP. Particle Flow The PFT group is focusing on the ...
the PAG conveners
2011-01-01
The delivered LHC integrated luminosity of more than 1 inverse femtobarn by summer and more than 5 by the end of 2011 has been a gold mine for the physics groups. With 2011 data, we have submitted or published 14 papers; 7 others are in collaboration-wide review, and 75 Physics Analysis Summaries have already been approved. They add to the 73 papers already published based on the 2010 and 2009 datasets. Highlights from each physics analysis group are described below. Heavy ions: Many important results have been obtained from the first lead-ion collision run in 2010. The published measurements include the first-ever indications of Υ excited-state suppression (PRL synopsis), long-range correlations in PbPb, and track multiplicity over a wide η range. Preliminary results include the first-ever measurement of isolated photons (showing no modification), J/ψ suppression including the separation of the non-prompt component, further study of jet fragmentation, nuclear modification factor...
L. Demortier
Physics-wise, the CMS week in December was dominated by discussions of the analyses that will be carried out in the “next six months”, i.e. while waiting for the first LHC collisions. As presented in December, analysis approvals based on Monte Carlo simulation were re-opened, with the caveat that for this work to be helpful to the goals of CMS, it should be carried out using the new software (CMSSW_2_X) and associated samples. By the end of the week, the goal for the physics groups was set to be the porting of our physics commissioning methods and plans, as well as the early analyses (based an integrated luminosity in the range 10-100pb-1) into this new software. Since December, the large data samples from CMSSW_2_1 were completed. A big effort by the production group gave a significant number of events over the end-of-year break – but also gave out the first samples with the fast simulation. Meanwhile, as mentioned in December, the arrival of 2_2 meant that ...
C. Hill
2012-01-01
2012 has started off as a very busy year for the CMS Physics Groups. Planning for the upcoming higher luminosity/higher energy (8 TeV) operation of the LHC and relatively early Rencontres de Moriond are the high-priority activities for the group at the moment. To be ready for the coming 8-TeV data, CMS has made a concerted effort to perform and publish analyses on the 5 fb−1 dataset recorded in 2011. This has resulted in the submission of 16 papers already, including nine on the search for the Higgs boson. In addition, a number of preliminary results on the 2011 dataset have been released to the public. The Exotica and SUSY groups approved several searches for new physics in January, such as searches for W′ and exotic highly ionising particles. These were highlighted at a CERN seminar given on 24th January. Many more analyses, from all the PAGs, including the newly formed SMP (Standard Model Physics) and FSQ (Forward and Small-x QCD), were approved in February. The ...
C. Hill
2012-01-01
The period since the last CMS Bulletin has been historic for CMS Physics. The pinnacle of our physics programme was an observation of a new particle – a strong candidate for a Higgs boson – which has captured worldwide interest and made a profound impact on the very field of particle physics. At the time of the discovery announcement on 4 July, 2012, prominent signals were observed in the high-resolution H→γγ and H→ZZ(4l) modes. Corroborating excess was observed in the H→W+W– mode as well. The fermionic channel analyses (H→bb, H→ττ), however, yielded less than the Standard Model (SM) expectation. Collectively, the five channels established the signal with a significance of five standard deviations. With the exception of the diphoton channel, these analyses have all been updated in the last months and several new channels have been added. With improved analyses and more than twice the i...
Lectures on statistical mechanics
Bowler, M G
1982-01-01
Anyone dissatisfied with the almost ritual dullness of many 'standard' texts in statistical mechanics will be grateful for the lucid explanation and generally reassuring tone. Aimed at securing firm foundations for equilibrium statistical mechanics, topics of great subtlety are presented transparently and enthusiastically. Very little mathematical preparation is required beyond elementary calculus and prerequisites in physics are limited to some elementary classical thermodynamics. Suitable as a basis for a first course in statistical mechanics, the book is an ideal supplement to more convent
Yen, Yung-Feng; Yen, Muh-Yong; Lin, Yi-Ping; Shih, Hsiu-Chen; Li, Lan-Huei; Chou, Pesus; Deng, Chung-Yeh
2013-01-01
To determine the effect of directly observed therapy (DOT) on tuberculosis (TB)-specific and non-TB-specific mortality, and to identify prognostic factors associated with mortality among adults with culture-positive pulmonary TB (PTB). All adult Taiwanese with PTB in Taipei, Taiwan were included in a retrospective cohort study in 2006-2010. Backward stepwise multinomial logistic regression was used to identify risk factors associated with each mortality outcome. Mean age of the 3,487 patients was 64.2 years and 70.4% were male. Among 2,471 patients on DOT, 4.2% (105) died of TB-specific causes and 15.4% (381) died of non-TB-specific causes. Among 1,016 patients on self-administered treatment (SAT), 4.4% (45) died of TB-specific causes and 11.8% (120) died of non-TB-specific causes. After adjustment for potential confounders, the odds ratio for TB-specific mortality was 0.45 (95% CI: 0.30-0.69) among patients treated with DOT as compared with those on SAT. Independent predictors of TB-specific and non-TB-specific mortality included older age (i.e., 65-79 and ≥80 years vs. 18-49 years), being unemployed, a positive sputum smear for acid-fast bacilli, and TB notification from a general ward or intensive care unit (reference: outpatient services). Male sex, end-stage renal disease requiring dialysis, malignancy, and pleural effusion on chest radiography were associated with increased risk of non-TB-specific mortality, while the presence of lung cavities on chest radiography was associated with lower risk. DOT reduced TB-specific mortality by 55% among patients with PTB, after controlling for confounders. DOT should be given to all TB patients to further reduce TB-specific mortality.
Wen-Yuan Ho
2018-06-01
Full Text Available Fine particulate matter (PM2.5) has a small particle size, which allows it to directly enter the respiratory mucosa and reach the alveoli and even the blood. Many countries are already aware of the adverse effects of PM2.5, and determining the sources of PM2.5 is a critical step in reducing its concentration to protect public health. This study monitored PM2.5 in the summer of 2017 (during the southwest monsoon season). Three online monitoring systems were used to continuously collect hourly concentrations of key chemical components of PM2.5, including anions, cations, carbon, heavy metals, and precursor gases, 24 h per day. The sum of the concentrations of the individual compounds obtained from the online monitoring systems is close to the actual PM2.5 concentration (98.75%), suggesting that the online monitoring system in this study covers a relatively complete set of chemical compounds. Positive matrix factorization (PMF) was adopted to explore and quantify the proportion of each source contributing to the total PM2.5 concentration. According to the source contribution analysis, 55% of PM2.5 can be attributed to local pollutant sources, and the remaining 45% to pollutants emitted outside Taipei City. During high-PM2.5-concentration (episode) periods, pollutant conversion rates were higher than usual due to vigorous photochemical reactions. Moreover, once pollutants are emitted by external stationary sources, they move with polluted air masses and undergo photochemical reactions, increasing the secondary pollutant concentrations in PM2.5. The vertical monitoring data indicate a significant increase in PM2.5 concentration at high altitudes; high-altitude PM2.5 descends to the ground and thereby affects the ground-level PM2.5 concentration.
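Source apportionment with PMF factorizes the samples-by-species concentration matrix into nonnegative source contributions and source profiles. A simplified sketch using plain Lee-Seung nonnegative matrix factorization (real PMF additionally weights each entry by its measurement uncertainty; the data here are synthetic):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical receptor-model data (not the study's): 200 hourly samples x
# 6 chemical species, mixed from 2 nonnegative source profiles plus noise.
true_G = rng.random((200, 2))            # source contributions over time
true_F = rng.random((2, 6))              # source chemical profiles
X = true_G @ true_F + 0.01 * rng.random((200, 6))

def nmf(X, k=2, n_iter=500):
    """Lee-Seung multiplicative updates minimizing ||X - G F||_F^2;
    updates preserve nonnegativity of both factors."""
    G = rng.random((X.shape[0], k)) + 0.1
    F = rng.random((k, X.shape[1])) + 0.1
    for _ in range(n_iter):
        F *= (G.T @ X) / (G.T @ G @ F + 1e-12)
        G *= (X @ F.T) / (G @ F @ F.T + 1e-12)
    return G, F

G, F = nmf(X)
rel_err = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
```

The recovered rows of F play the role of source fingerprints, and column sums of G give each source's share of total PM2.5, analogous to the 55%/45% local-versus-external split reported above.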
Liu, Wen-Te; Ma, Chih-Ming; Liu, I-Jung; Han, Bor-Cheng; Chuang, Hsiao-Chi; Chuang, Kai-Jen
2015-05-01
The association between traffic-related air pollution and adverse cardiovascular effects has been well documented; however, little is known about whether different commuting modes can modify the effects of air pollution on the cardiovascular system in human subjects in urban areas with heavy traffic. We recruited 120 young, healthy subjects in Taipei, Taiwan. Each participant was classified with different commuting modes according to his/her own commuting style. Three repeated measurements of heart rate variability (HRV) indices [the standard deviation of NN intervals (SDNN) and the square root of the mean of the sum of the squares of differences between adjacent NN intervals (r-MSSD)], particulate matter with an aerodynamic diameter ≤ 2.5 μm (PM2.5), temperature, humidity and noise level were conducted for each subject during 1-h morning commutes (0900-1000 h) in four different commuting modes: an electrically powered subway, a gas-powered bus, a gasoline-powered car, and walking. Linear mixed-effects models were used to investigate the association of PM2.5 with HRV indices. The results showed that decreases in the HRV indices were associated with increased levels of PM2.5. Personal exposure levels to PM2.5 were highest in the walking mode. The effects of PM2.5 on cardiovascular endpoints were lowest in the subway mode compared to the walking mode; participants in the car and bus modes also showed reduced effects on their cardiovascular endpoints compared to participants in the walking mode. We concluded that traffic-related PM2.5 is associated with autonomic alteration and that commuting modes can modify the effects of PM2.5 on HRV indices among young, healthy subjects. Copyright © 2015 Elsevier GmbH. All rights reserved.
Simulation of Land-Cover Change in Taipei Metropolitan Area under Climate Change Impact
Huang, Kuo-Ching; Huang, Thomas C C
2014-01-01
Climate change alters the environment, and its effects show up in land cover. By observing changes in land use, researchers can identify the trend and the potential mechanism of land-cover change. Effective adaptation policies can shape the pattern of land-cover change and may decrease the risks of climate change impacts. By simulating land-use dynamics under scenario settings, this paper explores the relationship between climate change and land-cover change through efficient adaptation policies. It involves a spatial statistical model to estimate the possibility of land-cover change, a cellular automata model to simulate land-cover dynamics, and scenario analysis in response to adaptation policies. The results show that, without any control, critical eco-areas, such as estuarine areas, will be destroyed, and people may move into vulnerable yet economically important development areas. On the other hand, under limited-development conditions for adaptation, migration to peri-urban and critical eco-areas may be deterred
J. D'Hondt
The Electroweak and Top Quark Workshop (16-17th of July) A Workshop on Electroweak and Top Quark Physics, dedicated to early measurements, took place on 16th-17th July. We had more than 40 presentations at the Workshop, which was an important milestone for 2007 physics analyses in the EWK and TOP areas. The Standard Model has been tested empirically by many previous experiments. Observables which are nowadays known with high precision will play a major role for data-based CMS calibrations. A typical example is the use of the Z to monitor electron and muon reconstruction in di-lepton inclusive samples. Another example is the use of the W mass as a constraint for di-jets in the kinematic fitting of top-quark events, providing information on the jet energy scale. The predictions of the Standard Model, for what concerns proton collisions at the LHC, are accurate to a level such that the production of W/Z and top-quark events can be used as a powerful tool to commission our experiment. On the other hand the measure...
Christopher Hill
2013-01-01
Since the last CMS Bulletin, the CMS Physics Analysis Groups have completed more than 70 new analyses, many of which are based on the complete Run 1 dataset. In parallel the Snowmass whitepaper on the projected discovery potential of CMS for HL-LHC has been completed, while the ECFA HL-LHC future physics studies have been summarised in a report and nine published benchmark analyses. Run 1 summary studies on b-tag and jet identification, quark-gluon discrimination and boosted topologies have been documented in BTV-13-001 and JME-13-002/005/006, respectively. The new tracking alignment and performance papers are being prepared for submission as well. The Higgs analysis group produced several new results including the search for ttH with H decaying to ZZ, WW, ττ+bb (HIG-13-019/020) where an excess of ~2.5σ is observed in the like-sign di-muon channel, and new searches for high-mass Higgs bosons (HIG-13-022). Searches for invisible Higgs decays have also been performed, both using the associ...
C. Hill
2013-01-01
In the period since the last CMS Bulletin, the LHC – and CMS – have entered LS1. During this time, CMS Physics Analysis Groups have performed more than 40 new analyses, many of which are based on the complete 8 TeV dataset delivered by the LHC in 2012 (and in some cases on the full Run 1 dataset). These results were shown at, and well received by, several high-profile conferences in the spring of 2013, including the inaugural meeting of the Large Hadron Collider Physics Conference (LHCP) in Barcelona, and the 26th International Symposium on Lepton Photon Interactions at High Energies (LP) in San Francisco. In parallel, there have been significant developments in preparations for Run 2 of the LHC and on “future physics” studies for both Phase 1 and Phase 2 upgrades of the CMS detector. The Higgs analysis group produced five new results for LHCP including a new H-to-bb search in VBF production (HIG-13-011), ttH with H to γγ...
C. Hill
2013-01-01
The period since the last CMS bulletin has seen the end of proton collisions at a centre-of-mass energy of 8 TeV, a successful proton-lead collision run at 5 TeV/nucleon, as well as a “reference” proton run at 2.76 TeV. With these final LHC Run 1 datasets in hand, CMS Physics Analysis Groups have been busy analysing these data in preparation for the winter conferences. Moreover, despite the fact that the pp run only concluded in mid-December (and there was consequently less time to complete data analyses), CMS again made a strong showing at the Rencontres de Moriond in La Thuile (EW and QCD) where nearly 40 new results were presented. The highlight of these preliminary results was the eagerly anticipated updated studies of the properties of the Higgs boson discovered in July of last year. Meanwhile, preparations for Run 2 and physics performance studies for Phase 1 and Phase 2 upgrade scenarios are ongoing. The Higgs analysis group produced updated analyses on the full Run 1 dataset (~25 f...
Pestman, Wiebe R
2009-01-01
This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.
Whole Frog Project and Virtual Frog Dissection Statistics
V.Ciulli
2011-01-01
The main programme of the Physics Week held between 16th and 20th May was a series of topology-oriented workshops on di-leptons, di-photons, inclusive W, and all-hadronic final states. The goal of these workshops was to reach a common understanding of the set of objects (ID, cleaning...), the handling of pile-up, calibration, efficiency and purity determination, as well as to revisit critical common issues such as the trigger. Di-lepton workshop Most analysis groups use a di-lepton trigger or a combination of single and di-lepton triggers in 2011. Some groups need to collect leptons with PT as low as possible, with strong isolation and identification requirements, as for Higgs into WW at low mass; others with intermediate PT values, as in Drell-Yan studies; or high PT, as in the Exotica group. Electron and muon reconstruction, identification and isolation were extensively described in the workshop. For electrons, VBTF selection cuts for low PT and HEEP cuts for high PT were discussed, as well as more complex d...
Wang, Yu-Chun; Lin, Yu-Kai
2014-01-01
This study evaluated risks of emergency room visits (ERV) for cerebrovascular diseases, heart diseases, ischemic heart disease, hypertensive diseases, chronic renal failure (CRF), diabetes mellitus (DM), asthma, chronic airway obstruction not elsewhere classified (CAO), and accidents associated with ambient temperature from 2000 to 2009 in metropolitan Taipei. The distributed lag non-linear model was used to estimate the cumulative relative risk (RR) and confidence interval (CI) of cause-specific ERV associated with daily temperature from lag 0 to lag 3, after controlling for potential confounders. The temperatures associated with the lowest risk of ERV were 26 °C for cerebrovascular diseases, 18 °C for CRF, DM, and accidents, and 30 °C for hypertensive diseases, asthma, and CAO. These temperatures were used as the reference temperatures to measure RR for the corresponding diseases. A low temperature (14 °C) increased the ERV risk for cerebrovascular diseases, hypertensive diseases, and asthma, with respective cumulative 4-day RRs of 1.56 (95% CI: 1.23, 1.97), 1.78 (95% CI: 1.37, 2.34), and 2.93 (95% CI: 1.26, 6.79). The effects were greater on, or after, lag one. At 32 °C, the cumulative 4-day RR for ERV was significant for CRF (RR = 2.36; 95% CI: 1.33, 4.19) and accidents (RR = 1.23; 95% CI: 1.14, 1.33), and the highest RR was seen on lag 0 for CRF (RR = 1.69; 95% CI: 1.01, 3.58), DM (RR = 1.69; 95% CI: 1.09, 2.61), and accidents (RR = 1.19; 95% CI: 1.11, 1.27). Higher temperatures are associated with increased ERV risks for CRF, DM, and accidents, and lower temperatures with increased ERV risks for cerebrovascular diseases, hypertensive diseases, and asthma, in this subtropical metropolis.
Huang, Wen-Chan; Huang, Li-Min; Kao, Chuan-Liang; Lu, Chun-Yi; Shao, Pei-Lan; Cheng, Ai-Ling; Fan, Tsui-Yien; Chi, Hsin; Chang, Luan-Yin
2012-04-01
Enterovirus 71 (EV71) infection may cause severe neurological and cardiopulmonary complications, especially in preschool children. This study aimed to investigate the seroprevalence and seroconversion of EV71, and the cross-protection of EV71 antibody against other enteroviruses, among kindergarteners. Overall, 228 children in a public kindergarten were enrolled during two academic years, 2006 and 2007, in Taipei, Taiwan, and we measured their EV71 neutralizing antibody. When the participants had herpangina; hand, foot and mouth disease (HFMD); febrile illness or respiratory symptoms, throat swabs were sampled and processed for viral culture and enterovirus real-time reverse transcriptase polymerase chain reaction (RT-PCR). Questionnaires, completed by the participants' guardians, surveyed the history of allergy and annual incidence of symptoms related to enterovirus infection. Seropositive rates of EV71 were 20% (32/163) in 2006 and 6% (4/65) in 2007. The rate of EV71 seropositivity increased with age (p Taipei City from September 2006 to June 2008. Presence of EV71 neutralizing antibody was not associated with lower incidence of enterovirus infection caused by non-71 serotypes. Copyright © 2011. Published by Elsevier B.V.
Chung-Yuan Huang
2014-01-01
Immediate treatment with an automated external defibrillator (AED) increases out-of-hospital cardiac arrest (OHCA) patient survival potential. While considerable attention has been given to determining optimal public AED locations, spatial and temporal factors such as time of day and distance from emergency medical services (EMS) are understudied. Here we describe a geocomputational genetic algorithm with a new stirring operator (GANSO) that considers spatial and temporal cardiac arrest occurrence factors when assessing the feasibility of using Taipei 7-Eleven stores as installation locations for AEDs. Our model is based on two AED conveyance modes, walking/running and driving, involving service distances of 100 and 300 meters, respectively. Our results suggest different AED allocation strategies involving convenience stores in urban settings. In commercial areas, such installations can compensate for temporal gaps in EMS locations when responding to nighttime OHCA incidents. In residential areas, store installations can compensate for long distances from fire stations, where AEDs are currently held in Taipei.
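The siting problem GANSO addresses is, at its core, a maximal-coverage location problem: choose k candidate stores so that as many historical OHCA incidents as possible fall within the service distance. As a baseline sketch (a simple greedy heuristic, not the paper's genetic algorithm, with hypothetical coordinates in meters):

```python
# Greedy maximal-coverage baseline for AED siting (illustrative only).
# stores/incidents are hypothetical (x, y) coordinates in meters.
def greedy_aed_sites(stores, incidents, k, radius):
    # Precompute which incident indices each store covers within `radius`.
    covers = {s: {i for i, p in enumerate(incidents)
                  if (p[0] - s[0])**2 + (p[1] - s[1])**2 <= radius**2}
              for s in stores}
    chosen, covered = [], set()
    for _ in range(k):
        # Pick the store adding the most not-yet-covered incidents.
        best = max(stores, key=lambda s: len(covers[s] - covered))
        chosen.append(best)
        covered |= covers[best]
    return chosen, covered

stores = [(0, 0), (200, 0), (400, 0)]
incidents = [(10, 0), (50, 0), (210, 0), (390, 0)]
sites, covered = greedy_aed_sites(stores, incidents, 2, 100)  # 100 m walking mode
```

A genetic algorithm such as GANSO explores the same objective but can additionally weight incidents by time of day and EMS distance, which this greedy sketch omits.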
Shang-Shyue Tsai
2014-05-01
This study was undertaken to determine whether there was an association between PM2.5 levels and daily mortality in Taipei, Taiwan, the largest metropolitan city with a subtropical climate. Daily mortality, air pollution, and weather data for Taipei were obtained for the period from 2006 to 2008. The relative risk of daily mortality was estimated using a time-stratified case-crossover approach, controlling for weather variables, day of the week, seasonality, and long-term time trends. For the single-pollutant model, PM2.5 showed an association with total mortality on both warm (>23 °C) and cool (<23 °C) days. There was no indication of an association between PM2.5 and risk of death due to respiratory diseases on either warm or cool days. PM2.5 affected the risk of death from cardiovascular diseases only on cool days. In the two-pollutant models, PM2.5 remained associated with the risk of mortality from all causes and from cardiovascular disease after the inclusion of SO2 and O3, on both warm and cool days. This study provides evidence that short-term exposure to PM2.5 increased the risk of death from all causes and from cardiovascular disease.
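In a time-stratified case-crossover design like the one above, each death day is compared with referent days from the same stratum: the other days in the same calendar month falling on the same weekday, which controls for seasonality, long-term trends and day of week by design. A minimal sketch of the referent-day construction (the design step only, not the conditional logistic regression that follows):

```python
from datetime import date, timedelta

def referent_days(case_day: date) -> list[date]:
    """All other days in case_day's calendar month with the same weekday."""
    d = case_day.replace(day=1)
    out = []
    while d.month == case_day.month:
        if d.weekday() == case_day.weekday() and d != case_day:
            out.append(d)
        d += timedelta(days=1)
    return out

# A case on Wednesday 2007-06-13 is compared with the other Wednesdays of June 2007
print(referent_days(date(2007, 6, 13)))
```

Exposure (here, PM2.5) on the case day is then contrasted with exposure on these three or four referent days within each stratum.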
Stavrianaki, K.; Vallianatos, F.; Sammonds, P. R.; Ross, G. J.
2014-12-01
Fracturing is the most prevalent deformation mechanism in rocks deformed in the laboratory under simulated upper crustal conditions. Fracturing produces acoustic emissions (AE) at the laboratory scale and earthquakes on a crustal scale. The AE technique provides a means to analyse microcracking activity inside the rock volume, and since experiments can be performed under confining pressure to simulate depth of burial, AE can be used as a proxy for natural processes such as earthquakes. Experimental rock deformation provides us with several ways to investigate time-dependent brittle deformation. Two main types of experiments can be distinguished: (1) "constant strain rate" experiments, in which stress varies as a result of deformation, and (2) "creep" experiments, in which deformation and deformation rate vary over time as a result of an imposed constant stress. We conducted constant strain rate experiments on air-dried Darley Dale sandstone samples at a variety of confining pressures (30 MPa, 50 MPa, 80 MPa) and on water-saturated samples with 20 MPa initial pore fluid pressure. The results from these experiments were used to determine the initial loading in the creep experiments. A non-extensive statistical physics approach was applied to the AE data in order to investigate the spatio-temporal pattern of cracks close to failure. A more detailed study was performed for the data from the creep experiments. When axial stress is plotted against time we obtain the trimodal creep curve. The Tsallis entropic index q is calculated for each stage of the curve and the results are compared with those from the constant strain rate experiments. The Epidemic Type Aftershock Sequence (ETAS) model is also applied to each stage of the creep curve and the ETAS parameters are calculated. We investigate whether these parameters are constant across all stages of the curve, or whether there are interesting patterns of variation. This research has been co-funded by the European Union
Goodman, Joseph W
2015-01-01
This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems, covering both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i
Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il
2017-05-15
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field. Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then
Petocz, Peter; Sowey, Eric
2012-01-01
The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
Glaz, Joseph
2009-01-01
Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.
Nick, Todd G
2007-01-01
Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
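The measures of location and spread discussed above can be computed directly with Python's standard library; the sample values here are purely illustrative:

```python
import statistics

# A small illustrative sample (hypothetical measurements)
data = [4.2, 5.1, 3.8, 6.0, 5.5, 4.9, 5.0]

# Measures of location
loc = {"mean": statistics.mean(data), "median": statistics.median(data)}

# Measures of spread (sample standard deviation uses the n-1 denominator)
spread = {"stdev": statistics.stdev(data), "range": max(data) - min(data)}

print(loc, spread)
```

Reporting a location measure together with a spread measure, as the chapter recommends, conveys far more than either alone.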
Scheck, Florian
2016-01-01
Scheck’s textbook starts with a concise introduction to classical thermodynamics, including geometrical aspects. Then a short introduction to probabilities and statistics lays the basis for the statistical interpretation of thermodynamics. Phase transitions, discrete models and the stability of matter are explained in great detail. Thermodynamics has a special role in theoretical physics. Due to the general approach of thermodynamics the field has a bridging function between several areas like the theory of condensed matter, elementary particle physics, astrophysics and cosmology. The classical thermodynamics describes predominantly averaged properties of matter, reaching from few particle systems and state of matter to stellar objects. Statistical Thermodynamics covers the same fields, but explores them in greater depth and unifies classical statistical mechanics with quantum theory of multiple particle systems. The content is presented as two tracks: the fast track for master students, providing the essen...
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Tonchev, N.; Shumovskij, A.S.
1986-01-01
The history of investigations conducted at the JINR in the field of statistical mechanics, beginning with N. N. Bogolyubov's fundamental works on the microscopic theory of superconductivity, is presented. The ideas and methods introduced in these works have largely determined the development of statistical mechanics at the JINR, and the Hartree-Fock-Bogolyubov variational principle has become an important method of modern nuclear theory. A brief review of the main achievements connected with the development of statistical mechanics methods and their application in different fields of physical science is given
Wallis, W Allen
2014-01-01
Focusing on everyday applications as well as those of scientific research, this classic of modern statistical methods requires little to no mathematical background. Readers develop basic skills for evaluating and using statistical data. Lively, relevant examples include applications to business, government, social and physical sciences, genetics, medicine, and public health. "W. Allen Wallis and Harry V. Roberts have made statistics fascinating." - The New York Times "The authors have set out, with considerable success, to write a text which would be of interest and value to the student who,
Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-08
In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...
U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...
Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Data about the usage of the WPRDC site and its various datasets, obtained by combining Google Analytics statistics with information from the WPRDC's data portal.
Serdobolskii, Vadim Ivanovich
2007-01-01
This monograph presents the mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to the solution of central problems of multivariate statistics which until now have not been solved. Traditional statistical methods based on the idea of an infinite sample often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...
New concept of statistical ensembles
Gorenstein, M.I.
2009-01-01
An extension of the standard concept of statistical ensembles is suggested. Namely, statistical ensembles with extensive quantities fluctuating according to an externally given distribution are introduced. Applications in statistical models of multiple hadron production in high energy physics are discussed.
Ting Chia Weng
School children may transmit pathogens, with cluster cases occurring on campuses and in families. In response to the 2009 influenza A (H1N1) pandemic, Taipei City Government officials developed a School-based Infectious Disease Syndromic Surveillance System (SID-SSS). Teachers and nurses from preschools to universities in all 12 districts within Taipei are required to report cases of symptomatic children or sick-leave requests daily through the SID-SSS. The pre-diagnosis at schools is first submitted as common pediatric disease syndrome groups and re-submitted after confirmation by physicians. We retrieved these data from January 2010 to August 2011 for spatio-temporal analysis and evaluated the temporal trends against cases obtained from both the Emergency Department-based Syndromic Surveillance System (ED-SSS) and the Longitudinal Health Insurance Database 2005 (LHID2005). Through the SID-SSS, enterovirus-like illness (EVI) and influenza-like illness (ILI) were the two most reported syndrome groups (77.6% and 15.8%, respectively) among a total of 19,334 cases. The pre-diagnosis judgments made by school teachers and nurses showed high consistency with physicians' clinical diagnoses for EVI (97.8%) and ILI (98.9%). Most importantly, the SID-SSS had better timeliness, with earlier peaks of EVI and ILI than those in the ED-SSS. Furthermore, both of the syndrome groups in these two surveillance systems had correlations reaching 0.98 and 0.95, respectively (p<0.01). Spatio-temporal analysis showed that the patterns of EVI and ILI both diffuse from the northern suburban districts to central Taipei, with ILI spreading faster. This novel system can identify early suspected cases of two important pediatric infections occurring at schools, and clusters from schools/families. It was also cost-effective (95.5% of the operation cost reduced and 59.7% of processing time saved). The timely surveillance of mild EVI and ILI cases integrated with spatial analysis may help
Preibus-Norquist, R. N. C.-Grover; Bush-Romney, G. W.-Willard-Mitt; Dimon, J. P.; Adelson-Koch, Sheldon-Charles-David-Sheldon; Krugman-Axelrod, Paul-David; Siegel, Edward Carl-Ludwig; D. N. C./O. F. P./''47''%/50% Collaboration; R. N. C./G. O. P./''53''%/49% Collaboration; Nyt/Wp/Cnn/Msnbc/Pbs/Npr/Ft Collaboration; Ftn/Fnc/Fox/Wsj/Fbn Collaboration; Lb/Jpmc/Bs/Boa/Ml/Wamu/S&P/Fitch/Moodys/Nmis Collaboration
2013-03-01
``Models''? CAVEAT EMPTOR!!!: ``Toy Models Too-Often Yield Toy-Results''!!!: Goldenfeld[``The Role of Models in Physics'', in Lects.on Phase-Transitions & R.-G.(92)-p.32-33!!!]: statistics(Silver{[NYTimes; Bensinger, ``Math-Geerks Clearly-Defeated Pundits'', LATimes, (11/9/12)])}, polls, politics, economics, elections!!!: GRAPH/network/net/...-PHYSICS Barabasi-Albert[RMP (02)] (r,t)-space VERSUS(???) [Where's the Inverse/ Dual/Integral-Transform???] (Benjamin)Franklin(1795)-Fourier(1795; 1897;1822)-Laplace(1850)-Mellin (1902) Brillouin(1922)-...(k,)-space, {Hubbard [The World According to Wavelets,Peters (96)-p.14!!!/p.246: refs.-F2!!!]},and then (2) Albert-Barabasi[]Bose-Einstein quantum-statistics(BEQS) Bose-Einstein CONDENSATION (BEC) versus Bianconi[pvt.-comm.; arXiv:cond-mat/0204506; ...] -Barabasi [???] Fermi-Dirac
CERN. Geneva
2005-01-01
The three lectures will present an introduction to statistical methods as used in High Energy Physics. As time will be very limited, the course will seek mainly to define the important issues and to introduce the most widely used tools. Topics will include the interpretation and use of probability, estimation of parameters, and testing of hypotheses.
Introduction to Statistics course
CERN. Geneva HR-RFA
2006-01-01
The four lectures will present an introduction to statistical methods as used in High Energy Physics. As time will be very limited, the course will seek mainly to define the important issues and to introduce the most widely used tools. Topics will include the interpretation and use of probability, estimation of parameters, and testing of hypotheses.
CERN. Geneva
2004-01-01
The three lectures will present an introduction to statistical methods as used in High Energy Physics. As time will be very limited, the course will seek mainly to define the important issues and to introduce the most widely used tools. Topics will include the interpretation and use of probability, estimation of parameters, and testing of hypotheses.
On quantum statistical inference
Barndorff-Nielsen, O.E.; Gill, R.D.; Jupp, P.E.
2001-01-01
Recent developments in the mathematical foundations of quantum mechanics have brought the theory closer to that of classical probability and statistics. On the other hand, the unique character of quantum physics sets many of the questions addressed apart from those met classically in stochastics.
Hu, S.-C.; Lee, J.-H.
2004-01-01
This investigation studies how platform screen doors (PSD) affect the energy consumption of the environmental control system of a mass rapid transit (MRT) system in Taipei. The environmental parameter simulation was conducted using the subway environmental simulation (SES) program, while the associated air conditioning (A/C) cooling load was predicted with the Carrier E20-II program. Results show that PSD can significantly decrease the average and peak cooling load, thus reducing the capacity/size of cooling equipment and allowing the chiller cooling load to be abridged. However, electricity consumption by ventilation equipment increases notably when PSD are used, particularly the electricity consumption by the under-platform exhaust (UPE) fan; thus, ultimately, little difference exists in the overall energy consumption with and without UPE.
Hu, S.-C. E-mail: f10870@ntut.edu.tw; Lee, J.-H
2004-03-01
This investigation studies how platform screen doors (PSD) affect the energy consumption of the environmental control system of a mass rapid transit (MRT) system in Taipei. The environmental parameter simulation was conducted using the subway environmental simulation (SES) program, while the associated air conditioning (A/C) cooling load was predicted with the Carrier E20-II program. Results show that PSD can significantly decrease the average and peak cooling load, thus reducing the capacity/size of cooling equipment and allowing the chiller cooling load to be abridged. However, electricity consumption by ventilation equipment increases notably when PSD are used, particularly the electricity consumption by the under-platform exhaust (UPE) fan; thus, ultimately, little difference exists in the overall energy consumption with and without UPE.
Goodman, J. W.
This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
The Concise Encyclopedia of Statistics
Dodge, Yadolah
2008-01-01
The Concise Encyclopedia of Statistics presents the essential information about statistical tests, concepts, and analytical methods in language that is accessible to practitioners and students of the vast community using statistics in medicine, engineering, physical science, life science, social science, and business/economics. The reference is alphabetically arranged to provide quick access to the fundamental tools of statistical methodology and biographies of famous statisticians. The more than 500 entries include definitions, history, mathematical details, limitations, examples, references,
Rohatgi, Vijay K
2003-01-01
Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth
Levine-Wissing, Robin
2012-01-01
All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep
Davidson, Norman
2003-01-01
Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody
Israel physical society 1993 annual meeting
1993-01-01
The publication includes abstracts from several fields of physics: particles and fields, medical physics, astrophysics, condensed matter, plasma physics, computational physics, statistical physics, nuclear physics, and lasers and optics.
Israel Physical Society annual meeting 1996
1996-01-01
The publication includes abstracts from several fields of physics: particles and fields, medical physics, astrophysics, condensed matter, plasma physics, computational physics, statistical physics, nuclear physics, and lasers and optics.
Schrödinger, Erwin
1952-01-01
Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics: classical, quantum, Bose-Einstein, Fermi-Dirac, and more. The work also includes discussions of the Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, the problem of radiation, and much more.
Pereira, Wagner de S.
2013-01-01
The Ore Treatment Unit (UTM) is part of a decommissioned uranium mine. Statistical cluster analysis was used to evaluate the behavior of stable chemical elements and physico-chemical variables in its effluents. Cluster analysis proved effective in this evaluation, making it possible to identify groups of chemical elements and physico-chemical variables and to analyze those groups. Based on the data, there is a strong link between Ca and Mg and between Al and TR2O3 (rare-earth oxides) in the UTM effluents. SO4 was also identified as strongly linked to total and dissolved solids, and these in turn to electrical conductivity. Other associations existed but were not as strongly linked. Additional sampling for seasonal evaluation is required so that these assessments can be confirmed. Additional statistical analyses (ordination techniques) should be used to help identify the origins of the groups identified in this analysis. (author)
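The grouping of effluent variables described above rests on pairwise similarity between measured series. A minimal pure-Python sketch (not the authors' code; the variable names and toy data are illustrative) of the correlation-distance matrix that a hierarchical cluster analysis would operate on:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def correlation_distance_matrix(samples):
    """Distance 1 - |r| for every pair of measured variables; clustering on
    this matrix groups strongly linked variables (e.g. a Ca/Mg pair)."""
    names = list(samples)
    return {(a, b): 1.0 - abs(pearson(samples[a], samples[b]))
            for i, a in enumerate(names) for b in names[i + 1:]}
```

With perfectly proportional toy series for Ca and Mg, their distance is essentially zero, so any linkage criterion merges them first.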
Di Florio, Adriano
2017-10-01
In order to test the computing capabilities of GPUs with respect to traditional CPU cores a high-statistics toy Monte Carlo technique has been implemented both in ROOT/RooFit and GooFit frameworks with the purpose to estimate the statistical significance of the structure observed by CMS close to the kinematical boundary of the J/ψϕ invariant mass in the three-body decay B + → J/ψϕK +. GooFit is a data analysis open tool under development that interfaces ROOT/RooFit to CUDA platform on nVidia GPU. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides striking speed-up performances with respect to the RooFit application parallelised on multiple CPUs by means of PROOF-Lite tool. The considerable resulting speed-up, evident when comparing concurrent GooFit processes allowed by CUDA Multi Process Service and a RooFit/PROOF-Lite process with multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood ratio test statistic in different situations in which the Wilks Theorem may or may not apply because its regularity conditions are not satisfied.
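Independent of the ROOT/RooFit or GooFit implementations, the core of the toy Monte Carlo technique is to generate pseudo-experiments under the background-only hypothesis and count how often the test statistic exceeds the observed value. A minimal sketch, assuming for illustration that each toy's likelihood-ratio statistic follows its asymptotic chi-squared(1) form (the Wilks regime; the paper's point is precisely that this need not hold, in which case each toy requires a full fit):

```python
import random
from statistics import NormalDist

def toy_mc_significance(q_obs, n_toys=10_000, seed=42):
    """Estimate a p-value by toy Monte Carlo: draw the test statistic for
    many background-only pseudo-experiments and take the fraction at least
    as extreme as the observed value q_obs.  Here each toy statistic is a
    chi2(1) draw, i.e. the square of a standard normal (Wilks regime)."""
    rng = random.Random(seed)
    exceed = sum(rng.gauss(0.0, 1.0) ** 2 >= q_obs for _ in range(n_toys))
    p = exceed / n_toys
    # Convert the one-sided p-value into a Gaussian significance Z.
    z = NormalDist().inv_cdf(1.0 - p) if 0.0 < p < 1.0 else float("inf")
    return p, z
```

For q_obs = 4.0 this reproduces the textbook chi2(1) tail probability of roughly 5%; in non-Wilks situations the toy distribution must instead be built by fitting each pseudo-dataset, which is where GPU parallelism pays off.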
Anon.
1994-01-01
For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources
Pivato, Marcus
2013-01-01
We show that, in a sufficiently large population satisfying certain statistical regularities, it is often possible to accurately estimate the utilitarian social welfare function, even if we only have very noisy data about individual utility functions and interpersonal utility comparisons. In particular, we show that it is often possible to identify an optimal or close-to-optimal utilitarian social choice using voting rules such as the Borda rule, approval voting, relative utilitarianism, or a...
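One of the voting rules mentioned, the Borda rule, is simple to state: with m alternatives, each voter awards m-1 points to their top choice, m-2 to the next, and so on, and the alternative with the highest total wins. A minimal sketch (alternative names are illustrative):

```python
from collections import defaultdict

def borda_winner(rankings):
    """Borda rule: each voter lists alternatives best-to-worst; with m
    alternatives, 0-based position i earns m - 1 - i points.  The
    alternative with the largest total score is the social choice."""
    scores = defaultdict(int)
    for ranking in rankings:
        m = len(ranking)
        for pos, alt in enumerate(ranking):
            scores[alt] += m - 1 - pos
    return max(scores, key=scores.get)
```

With three voters ranking ["a","b","c"], ["b","a","c"], ["a","c","b"], alternative "a" wins with 5 points against 3 for "b" and 1 for "c".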
Natrella, Mary Gibbons
1963-01-01
Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations
Petocz, Peter; Sowey, Eric
2008-01-01
When people speak of "the Law of Gravity" they are generally referring to what is more specifically known as "Newton's Law of Gravitation." This law states that the gravitational force (that is, the mutual attraction) between any two physical bodies is directly proportional to the product of their individual masses and inversely proportional to…
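For reference, the verbal statement above corresponds to the familiar formula, with G the gravitational constant:

```latex
F \;=\; G\,\frac{m_1 m_2}{r^2},
\qquad
G \approx 6.674 \times 10^{-11}\ \mathrm{N\,m^2\,kg^{-2}} .
```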
Klevens, Joanne; Ports, Katie A
2017-11-01
Gender inequity is proposed as a societal-level risk factor for child maltreatment. However, most cross-national research examining this association is limited to developing countries and has used limited measures of gender inequity and child homicides as a proxy for child maltreatment. To examine the relationship between gender inequity and child maltreatment, we used caregivers' reported use of severe physical punishment (proxy for physical abuse) and children under 5 left alone or under the care of another child younger than 10 years of age (supervisory neglect) and three indices of gender inequity (the Social and Institutional Gender Index, the Gender Inequality Index, and the Gender Gap Index) from 57 countries, over half of which were developing countries. We found all three gender inequity indices to be significantly associated with physical abuse and two of the three to be significantly associated with neglect, after controlling for country-level development. Based on these findings, efforts to prevent child abuse and neglect might benefit from reducing gender inequity.
KUDRYAVTSEV Pavel Gennadievich
2015-04-01
The paper deals with possibilities of using a quasi-homogeneous approximation to describe the properties of dispersed systems. The authors applied the statistical polymer method, based on consideration of the averaged structures of all possible macromolecules of the same weight. Equations were derived that allow many additive parameters of macromolecules, and of systems containing them, to be evaluated. The statistical polymer method makes it possible to model branched, cross-linked macromolecules and systems containing them, in equilibrium or non-equilibrium states. Fractal analysis of a statistical polymer allows modeling of different types of random fractals and other objects examined with the methods of fractal theory. The statistical polymer method can be applied not only to polymers but also to composites, gels, associates in polar liquids, and other packed systems. The paper also describes the states of colloidal solutions of silicon dioxide from the point of view of statistical physics. This approach is based on the idea that a colloidal solution of silicon dioxide (a silica sol) consists of an enormous number of interacting particles in constant motion. The paper studies an idealized system of colliding but non-interacting sol particles. The behavior of the silica sol was analyzed using the Maxwell-Boltzmann distribution, and the mean free path was calculated. From these data, the number of particles able to overcome the potential barrier in a collision was calculated. Different approaches to modeling the kinetics of the sol-gel transition were studied.
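The two quantities named in the abstract, the mean free path and the fraction of collisions energetic enough to cross a potential barrier, follow directly from kinetic theory. A minimal sketch (not the authors' calculation; the numerical inputs below are illustrative):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def mean_free_path(n, d):
    """Hard-sphere mean free path lambda = 1 / (sqrt(2) * pi * d^2 * n),
    for number density n (m^-3) and particle diameter d (m)."""
    return 1.0 / (math.sqrt(2.0) * math.pi * d ** 2 * n)

def barrier_fraction(e_barrier, temperature):
    """Boltzmann-factor estimate of the fraction of collisions energetic
    enough to overcome a potential barrier e_barrier (J) at temperature
    T (K): exp(-E / kT), as follows from the Maxwell-Boltzmann
    distribution."""
    return math.exp(-e_barrier / (K_B * temperature))
```

For gas-like parameters (n about 2.5e25 m^-3, d about 0.37 nm) the mean free path comes out at a few tens of nanometres, and the barrier fraction grows with temperature, as expected for a thermally activated process such as gelation.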
Anon.
1989-01-01
World data from the United Nations' latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption; world natural gas plant liquids production; world LP-gas production, imports, exports, and consumption; world residual fuel oil production, imports, exports, and consumption; world lignite production, imports, exports, and consumption; world peat production and consumption; world electricity production, imports, exports, and consumption (Table 80); and world nuclear electric power production.