WorldWideScience

Sample records for equilibrium statistical physics

  1. Non-equilibrium statistical physics with application to disordered systems

    CERN Document Server

    Cáceres, Manuel Osvaldo

    2017-01-01

This textbook is the result of the enhancement of several courses on non-equilibrium statistics, stochastic processes, stochastic differential equations, anomalous diffusion and disorder. The target audience includes students of physics, mathematics, biology, chemistry, and engineering at undergraduate and graduate level with a grasp of the basic elements of mathematics and physics at the level of the fourth year of a typical undergraduate course. Less familiar physical and mathematical concepts are described in dedicated sections and specific exercises throughout the text, as well as in appendices. Physical-mathematical motivation is the main driving force for the development of this text. It presents the academic topics of probability theory and stochastic processes as well as new educational aspects in the presentation of non-equilibrium statistical theory and stochastic differential equations. In particular it discusses the problem of irreversibility in that context and Fokker-Planck dynamics. An introduction on fluc...

  2. Equilibrium statistical mechanics

    CERN Document Server

    Mayer, J E

    1968-01-01

The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanics is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t

  3. Some recent developments in non-equilibrium statistical physics

    Indian Academy of Sciences (India)

... This canonical prescription is the starting point for studying a system in ... abilistic approach to non-equilibrium dynamics by treating the case of Markovian ... equation in this network between the incoming flux and the outgoing flux at each.

  4. Statistical physics

    CERN Document Server

    Sadovskii, Michael V

    2012-01-01

This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics, all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity, and the modern theory of critical phenomena. Beyond that, attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

  5. Equilibrium statistical mechanics

    CERN Document Server

    Jackson, E Atlee

    2000-01-01

    Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t

  6. Statistical physics

    CERN Document Server

    Guénault, Tony

    2007-01-01

    In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...

  7. Teaching at the edge of knowledge: Non-equilibrium statistical physics

    Science.gov (United States)

    Schmittmann, Beate

    2007-03-01

    As physicists become increasingly interested in biological problems, we frequently find ourselves confronted with complex open systems, involving many interacting constituents and characterized by non-vanishing fluxes of mass or energy. Faced with the task of predicting macroscopic behaviors from microscopic information for these non-equilibrium systems, the familiar Gibbs-Boltzmann framework fails. The development of a comprehensive theoretical characterization of non-equilibrium behavior is one of the key challenges of modern condensed matter physics. In its absence, several approaches have been developed, from master equations to thermostatted molecular dynamics, which provide key insights into the rich and often surprising phenomenology of systems far from equilibrium. In my talk, I will address some of these methods, selecting those that are most relevant for a broad range of interdisciplinary problems from biology to traffic, finance, and sociology. The ``portability'' of these methods makes them valuable for graduate students from a variety of disciplines. To illustrate how different methods can complement each other when probing a problem from, e.g., the life sciences, I will discuss some recent attempts at modeling translation, i.e., the process by which the genetic information encoded on an mRNA is translated into the corresponding protein.

  8. Renyi statistics in equilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Parvan, A.S.; Biro, T.S.

    2010-01-01

The Renyi statistics in the canonical and microcanonical ensembles is examined both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. By the exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e. the thermodynamic potential of the statistical ensemble is a homogeneous function of first degree of its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamic relations as those stemming from the Boltzmann-Gibbs statistics in this limit.
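For context (not part of the record), the Rényi entropy and the Boltzmann-Gibbs limit it reduces to as q → 1 can be written as:

```latex
S_q = \frac{1}{1-q}\,\ln \sum_i p_i^{\,q},
\qquad
\lim_{q \to 1} S_q = -\sum_i p_i \ln p_i = S_{\mathrm{BG}} .
```

The q → 1 limit follows from l'Hôpital's rule, consistent with the equivalence to Boltzmann-Gibbs statistics reported in the abstract.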

  9. Statistical Physics

    CERN Document Server

    Mandl, Franz

    1988-01-01

The Manchester Physics Series. General Editors: D. J. Sandiford, F. Mandl, A. C. Phillips, Department of Physics and Astronomy, University of Manchester. Properties of Matter B. H. Flowers and E. Mendoza; Optics Second Edition F. G. Smith and J. H. Thomson; Statistical Physics Second Edition F. Mandl; Electromagnetism Second Edition I. S. Grant and W. R. Phillips; Statistics R. J. Barlow; Solid State Physics Second Edition J. R. Hook and H. E. Hall; Quantum Mechanics F. Mandl; Particle Physics Second Edition B. R. Martin and G. Shaw; The Physics of Stars Second Edition A. C. Phillips; Computing for Scient

  10. A simple non-equilibrium, statistical-physics toy model of thin-film growth

    International Nuclear Information System (INIS)

    Ochab, Jeremi K; Nagel, Hannes; Janke, Wolfhard; Waclaw, Bartlomiej

    2015-01-01

    We present a simple non-equilibrium model of mass condensation with Lennard–Jones interactions between particles and the substrate. We show that when some number of particles is deposited onto the surface and the system is left to equilibrate, particles condense into an island if the density of particles becomes higher than some critical density. We illustrate this with numerically obtained phase diagrams for three-dimensional systems. We also solve a two-dimensional counterpart of this model analytically and show that not only the phase diagram but also the shape of the cross-sections of three-dimensional condensates qualitatively matches the two-dimensional predictions. Lastly, we show that when particles are being deposited with a constant rate, the system has two phases: a single condensate for low deposition rates, and multiple condensates for fast deposition. The behaviour of our model is thus similar to that of thin film growth processes, and in particular to Stranski–Krastanov growth. (paper)
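For reference, the Lennard-Jones interaction invoked in the abstract has the standard pair-potential form, with well depth ε and length scale σ:

```latex
V(r) = 4\varepsilon \left[ \left(\frac{\sigma}{r}\right)^{12} - \left(\frac{\sigma}{r}\right)^{6} \right].
```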

  11. Line radiative transfer and statistical equilibrium

    NARCIS (Netherlands)

    Kamp, Inga

Atomic and molecular line emission from protoplanetary disks contains key information about their detailed physical and chemical structures. To unravel those structures, we need to understand line radiative transfer in dusty media and the statistical equilibrium, especially of molecules. I describe

  12. Statistical Physics

    CERN Document Server

    Wannier, Gregory Hugh

    1966-01-01

Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics. Designed for

  13. Line radiative transfer and statistical equilibrium*

    Directory of Open Access Journals (Sweden)

    Kamp Inga

    2015-01-01

Full Text Available Atomic and molecular line emission from protoplanetary disks contains key information about their detailed physical and chemical structures. To unravel those structures, we need to understand line radiative transfer in dusty media and the statistical equilibrium, especially of molecules. I describe here the basic principles of statistical equilibrium and illustrate them through the two-level atom. In a second part, the fundamentals of line radiative transfer are introduced along with the various broadening mechanisms. I explain general solution methods with their drawbacks and also specific difficulties encountered in solving the line radiative transfer equation in disks (e.g. velocity gradients). I close with a few special cases of line emission from disks: radiative pumping, masers and resonance scattering.
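The statistical equilibrium of the two-level atom mentioned in the abstract balances upward and downward transition rates; with Einstein coefficients A and B, collisional rates C, and mean intensity J̄, the standard rate equation reads:

```latex
n_l \left( B_{lu}\,\bar{J} + C_{lu} \right)
= n_u \left( A_{ul} + B_{ul}\,\bar{J} + C_{ul} \right),
```

where n_l and n_u are the lower- and upper-level populations.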

  14. Gyrokinetic Statistical Absolute Equilibrium and Turbulence

    International Nuclear Information System (INIS)

    Zhu, Jian-Zhou; Hammett, Gregory W.

    2011-01-01

    A paradigm based on the absolute equilibrium of Galerkin-truncated inviscid systems to aid in understanding turbulence (T.-D. Lee, 'On some statistical properties of hydrodynamical and magnetohydrodynamical fields,' Q. Appl. Math. 10, 69 (1952)) is taken to study gyrokinetic plasma turbulence: A finite set of Fourier modes of the collisionless gyrokinetic equations are kept and the statistical equilibria are calculated; possible implications for plasma turbulence in various situations are discussed. For the case of two spatial and one velocity dimension, in the calculation with discretization also of velocity v with N grid points (where N + 1 quantities are conserved, corresponding to an energy invariant and N entropy-related invariants), the negative temperature states, corresponding to the condensation of the generalized energy into the lowest modes, are found. This indicates a generic feature of inverse energy cascade. Comparisons are made with some classical results, such as those of Charney-Hasegawa-Mima in the cold-ion limit. There is a universal shape for statistical equilibrium of gyrokinetics in three spatial and two velocity dimensions with just one conserved quantity. Possible physical relevance to turbulence, such as ITG zonal flows, and to a critical balance hypothesis are also discussed.
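As background, the classic fluid prototype of such Galerkin-truncated absolute equilibria (Lee's construction, which the gyrokinetic calculation generalizes) assigns a Gibbs measure over the retained Fourier modes; with two quadratic invariants I₁ = Σ_k |ψ_k|² and I₂ = Σ_k k²|ψ_k|² one obtains

```latex
\langle |\psi_{\mathbf{k}}|^2 \rangle = \frac{1}{\alpha + \beta k^2},
```

and negative-temperature states (α < 0 with α + βk² > 0 for all retained k) concentrate the spectrum in the lowest modes, the condensation the abstract describes.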

  15. Statistical fluctuations and correlations in hadronic equilibrium systems

    Energy Technology Data Exchange (ETDEWEB)

    Hauer, Michael

    2010-06-17

    This thesis is dedicated to the study of fluctuation and correlation observables of hadronic equilibrium systems. The statistical hadronization model of high energy physics, in its ideal, i.e. non-interacting, gas approximation is investigated in different ensemble formulations. The hypothesis of thermal and chemical equilibrium in high energy interaction is tested against qualitative and quantitative predictions. (orig.)

  16. Statistical fluctuations and correlations in hadronic equilibrium systems

    International Nuclear Information System (INIS)

    Hauer, Michael

    2010-01-01

    This thesis is dedicated to the study of fluctuation and correlation observables of hadronic equilibrium systems. The statistical hadronization model of high energy physics, in its ideal, i.e. non-interacting, gas approximation is investigated in different ensemble formulations. The hypothesis of thermal and chemical equilibrium in high energy interaction is tested against qualitative and quantitative predictions. (orig.)

  17. Gyrokinetic statistical absolute equilibrium and turbulence

    International Nuclear Information System (INIS)

    Zhu Jianzhou; Hammett, Gregory W.

    2010-01-01

    A paradigm based on the absolute equilibrium of Galerkin-truncated inviscid systems to aid in understanding turbulence [T.-D. Lee, Q. Appl. Math. 10, 69 (1952)] is taken to study gyrokinetic plasma turbulence: a finite set of Fourier modes of the collisionless gyrokinetic equations are kept and the statistical equilibria are calculated; possible implications for plasma turbulence in various situations are discussed. For the case of two spatial and one velocity dimension, in the calculation with discretization also of velocity v with N grid points (where N+1 quantities are conserved, corresponding to an energy invariant and N entropy-related invariants), the negative temperature states, corresponding to the condensation of the generalized energy into the lowest modes, are found. This indicates a generic feature of inverse energy cascade. Comparisons are made with some classical results, such as those of Charney-Hasegawa-Mima in the cold-ion limit. There is a universal shape for statistical equilibrium of gyrokinetics in three spatial and two velocity dimensions with just one conserved quantity. Possible physical relevance to turbulence, such as ITG zonal flows, and to a critical balance hypothesis are also discussed.

  18. Important contributions of M.C. Wang and C.S. Wang Chang to non-equilibrium statistical physics

    International Nuclear Information System (INIS)

    Liu Jixing

    2004-01-01

    In the middle of the 20th century two Chinese women physicists, Ming-Chen Wang and Cheng-Shu Wang Chang made great contributions to statistical physics. The famous review article 'On the theory of the Brownian motion II' by Ming-Chen Wang and G.E. Uhlenbeck published in Rev. of Mod. Phys. in 1945 provided a complete scientific classification of stochastic processes which is still adopted by the scientific community as the standard classification. The Wang-Chang-Uhlenbeck (WCU) equation proposed jointly by C.S. Wang-Chang and Uhlenbeck became the fundamental kinetic equation for the treatment of transport properties of multi-atomic gases with internal degrees of freedom in the physics literature. These important scientific contributions are analyzed and reviewed

  19. Stability and equilibrium in quantum statistical mechanics

    International Nuclear Information System (INIS)

    Kastler, Daniel.

    1975-01-01

A derivation of the Gibbs Ansatz, the basis of equilibrium statistical mechanics, is provided from a stability requirement, in technical connection with the harmonic analysis of non-commutative dynamical systems. By the same token a relation is established between stability and the positivity of the Hamiltonian in the zero-temperature case. [fr]

  20. A statistical mechanical model for equilibrium ionization

    International Nuclear Information System (INIS)

    Macris, N.; Martin, P.A.; Pule, J.

    1990-01-01

    A quantum electron interacts with a classical gas of hard spheres and is in thermal equilibrium with it. The interaction is attractive and the electron can form a bound state with the classical particles. It is rigorously shown that in a well defined low density and low temperature limit, the ionization probability for the electron tends to the value predicted by the Saha formula for thermal ionization. In this regime, the electron is found to be in a statistical mixture of a bound and a free state. (orig.)
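The Saha formula referenced in the abstract relates the ionized and bound populations in thermal equilibrium; in standard form, with ionization energy χ, electron mass m_e, and statistical weights g:

```latex
\frac{n_i\, n_e}{n_a}
= \frac{2 g_i}{g_a}
\left( \frac{2\pi m_e k_B T}{h^2} \right)^{3/2}
e^{-\chi / k_B T}.
```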

  1. Statistical approach to partial equilibrium analysis

    Science.gov (United States)

    Wang, Yougui; Stanley, H. E.

    2009-04-01

A statistical approach to market equilibrium and efficiency analysis is proposed in this paper. One factor that governs the exchange decisions of traders in a market, named the willingness price, is highlighted and underpins the whole theory. The supply and demand functions are formulated as the distributions of the corresponding willing exchange over the willingness price. The laws of supply and demand can be derived directly from these distributions. The characteristics of the excess demand function are analyzed and the necessary conditions for the existence and uniqueness of the equilibrium point of the market are specified. The rationing rates of buyers and sellers are introduced to describe the ratio of realized exchange to willing exchange, and their dependence on the market price is studied in the cases of shortage and surplus. The realized market surplus, which is the criterion of market efficiency, can be written as a function of the distributions of willing exchange and the rationing rates. With this approach we can strictly prove that a market is efficient in the state of equilibrium.
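A minimal numerical sketch of the construction described above (the sizes, uniform willingness-price distributions, and all names are illustrative assumptions, not from the paper): demand at price p counts buyers whose willingness price is at least p, supply counts sellers whose willingness price is at most p, and the equilibrium price is where excess demand crosses zero.

```python
import random

random.seed(0)

# Hypothetical willingness prices: uniform draws on [0, 1].
buyers = [random.uniform(0.0, 1.0) for _ in range(10_000)]   # max price each buyer accepts
sellers = [random.uniform(0.0, 1.0) for _ in range(10_000)]  # min price each seller accepts

def demand(p):
    """Willing purchases at price p: buyers with willingness price >= p."""
    return sum(1 for w in buyers if w >= p)

def supply(p):
    """Willing sales at price p: sellers with willingness price <= p."""
    return sum(1 for w in sellers if w <= p)

def excess_demand(p):
    return demand(p) - supply(p)

# Excess demand is monotone decreasing in p, so bisect for its zero.
lo, hi = 0.0, 1.0
for _ in range(50):
    mid = 0.5 * (lo + hi)
    if excess_demand(mid) > 0:
        lo = mid   # demand still exceeds supply: raise the price
    else:
        hi = mid
p_eq = 0.5 * (lo + hi)
print(p_eq)  # close to 0.5 for these symmetric uniform distributions
```

With both willingness-price distributions uniform on [0, 1], demand falls and supply rises roughly linearly, so the crossing sits near p = 0.5 up to sampling noise.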

  2. Statistical equilibrium equations for trace elements in stellar atmospheres

    OpenAIRE

    Kubat, Jiri

    2010-01-01

The conditions of thermodynamic equilibrium, local thermodynamic equilibrium, and statistical equilibrium are discussed in detail. The equations of statistical equilibrium and the supplementary equations are shown together with the expressions for radiative and collisional rates, with emphasis on the solution for trace elements.

  3. Equilibrium statistical mechanics of lattice models

    CERN Document Server

    Lavis, David A

    2015-01-01

Most interesting and difficult problems in equilibrium statistical mechanics concern models which exhibit phase transitions. For graduate students and more experienced researchers this book provides an invaluable reference source of approximate and exact solutions for a comprehensive range of such models. Part I contains background material on classical thermodynamics and statistical mechanics, together with a classification and survey of lattice models. The geometry of phase transitions is described and scaling theory is used to introduce critical exponents and scaling laws. An introduction is given to finite-size scaling, conformal invariance and Schramm-Loewner evolution. Part II contains accounts of classical mean-field methods. The parallels between Landau expansions and catastrophe theory are discussed and Ginzburg-Landau theory is introduced. The extension of mean-field theory to higher orders is explored using the Kikuchi-Hijmans-De Boer hierarchy of approximations. In Part III the use of alge...

  4. Open problems in non-equilibrium physics

    International Nuclear Information System (INIS)

    Kusnezov, D.

    1997-01-01

    The report contains viewgraphs on the following: approaches to non-equilibrium statistical mechanics; classical and quantum processes in chaotic environments; classical fields in non-equilibrium situations: real time dynamics at finite temperature; and phase transitions in non-equilibrium conditions

  5. Open problems in non-equilibrium physics

    Energy Technology Data Exchange (ETDEWEB)

    Kusnezov, D.

    1997-09-22

    The report contains viewgraphs on the following: approaches to non-equilibrium statistical mechanics; classical and quantum processes in chaotic environments; classical fields in non-equilibrium situations: real time dynamics at finite temperature; and phase transitions in non-equilibrium conditions.

  6. Theoretical physics 8 statistical physics

    CERN Document Server

    Nolting, Wolfgang

    2018-01-01

    This textbook offers a clear and comprehensive introduction to statistical physics, one of the core components of advanced undergraduate physics courses. It follows on naturally from the previous volumes in this series, using methods of probability theory and statistics to solve physical problems. The first part of the book gives a detailed overview on classical statistical physics and introduces all mathematical tools needed. The second part of the book covers topics related to quantized states, gives a thorough introduction to quantum statistics, followed by a concise treatment of quantum gases. Ideally suited to undergraduate students with some grounding in quantum mechanics, the book is enhanced throughout with learning features such as boxed inserts and chapter summaries, with key mathematical derivations highlighted to aid understanding. The text is supported by numerous worked examples and end of chapter problem sets. About the Theoretical Physics series Translated from the renowned and highly successf...

  7. Proton-rich nuclear statistical equilibrium

    International Nuclear Information System (INIS)

    Seitenzahl, I.R.; Timmes, F.X.; Marin-Lafleche, A.; Brown, E.; Magkotsios, G.; Truran, J.

    2008-01-01

    Proton-rich material in a state of nuclear statistical equilibrium (NSE) is one of the least studied regimes of nucleosynthesis. One reason for this is that after hydrogen burning, stellar evolution proceeds at conditions of an equal number of neutrons and protons or at a slight degree of neutron-richness. Proton-rich nucleosynthesis in stars tends to occur only when hydrogen-rich material that accretes onto a white dwarf or a neutron star explodes, or when neutrino interactions in the winds from a nascent proto-neutron star or collapsar disk drive the matter proton-rich prior to or during the nucleosynthesis. In this Letter we solve the NSE equations for a range of proton-rich thermodynamic conditions. We show that cold proton-rich NSE is qualitatively different from neutron-rich NSE. Instead of being dominated by the Fe-peak nuclei with the largest binding energy per nucleon that have a proton-to-nucleon ratio close to the prescribed electron fraction, NSE for proton-rich material near freezeout temperature is mainly composed of 56Ni and free protons. Previous results of nuclear reaction network calculations rely on this nonintuitive high-proton abundance, which this Letter explains. We show how the differences and especially the large fraction of free protons arises from the minimization of the free energy as a result of a delicate competition between the entropy and nuclear binding energy.
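The NSE equations solved in the Letter follow from chemical equilibrium under the strong and electromagnetic interactions: each nuclear chemical potential is fixed by the free nucleons, subject to mass and charge conservation,

```latex
\mu(Z,A) = Z\,\mu_p + (A - Z)\,\mu_n ,
\qquad
\sum_i A_i Y_i = 1 ,
\qquad
\sum_i Z_i Y_i = Y_e ,
```

where Y_i are the abundances and Y_e is the prescribed electron fraction.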

  8. Statistical thermodynamics of equilibrium polymers at interfaces

    NARCIS (Netherlands)

    Gucht, van der J.; Besseling, N.A.M.

    2002-01-01

    The behavior of a solution of equilibrium polymers (or living polymers) at an interface is studied, using a Bethe-Guggenheim lattice model for molecules with orientation dependent interactions. The density profile of polymers and the chain length distribution are calculated. For equilibrium polymers

  9. Statistical Physics An Introduction

    CERN Document Server

    Yoshioka, Daijiro

    2007-01-01

This book provides a comprehensive presentation of the basics of statistical physics. The first part explains the essence of statistical physics and how it provides a bridge between microscopic and macroscopic phenomena, allowing one to derive quantities such as entropy. Here the author avoids going into details such as Liouville’s theorem or the ergodic theorem, which are difficult for beginners and unnecessary for the actual application of statistical mechanics. In the second part, statistical mechanics is applied to various systems which, although they look different, share the same mathematical structure. In this way readers can deepen their understanding of statistical physics. The book also features applications to quantum dynamics, thermodynamics, the Ising model and the statistical dynamics of free spins.

  10. Statistical symmetries in physics

    International Nuclear Information System (INIS)

    Green, H.S.; Adelaide Univ., SA

    1994-01-01

Every law of physics is invariant under some group of transformations and is therefore the expression of some type of symmetry. Symmetries are classified as geometrical, dynamical or statistical. At the most fundamental level, statistical symmetries are expressed in the field theories of the elementary particles. This paper traces some of the developments from the discovery of Bose statistics, one of the two fundamental symmetries of physics. A series of generalizations of Bose statistics is described. A supersymmetric generalization accommodates fermions as well as bosons, and further generalizations, including parastatistics, modular statistics and graded statistics, accommodate particles with properties such as 'colour'. A factorization of elements of gl(n_b, n_f) can be used to define truncated boson operators. A general construction is given for q-deformed boson operators, and explicit constructions of the same type are given for various 'deformed' algebras. A summary is given of some of the applications and potential applications. 39 refs., 2 figs

  11. Methods of statistical physics

    CERN Document Server

    Akhiezer, Aleksandr I

    1981-01-01

    Methods of Statistical Physics is an exposition of the tools of statistical mechanics, which evaluates the kinetic equations of classical and quantized systems. The book also analyzes the equations of macroscopic physics, such as the equations of hydrodynamics for normal and superfluid liquids and macroscopic electrodynamics. The text gives particular attention to the study of quantum systems. This study begins with a discussion of problems of quantum statistics with a detailed description of the basics of quantum mechanics along with the theory of measurement. An analysis of the asymptotic be

  12. Basics of statistical physics

    CERN Document Server

    Müller-Kirsten, Harald J W

    2013-01-01

    Statistics links microscopic and macroscopic phenomena, and requires for this reason a large number of microscopic elements like atoms. The results are values of maximum probability or of averaging. This introduction to statistical physics concentrates on the basic principles, and attempts to explain these in simple terms supplemented by numerous examples. These basic principles include the difference between classical and quantum statistics, a priori probabilities as related to degeneracies, the vital aspect of indistinguishability as compared with distinguishability in classical physics, the differences between conserved and non-conserved elements, the different ways of counting arrangements in the three statistics (Maxwell-Boltzmann, Fermi-Dirac, Bose-Einstein), the difference between maximization of the number of arrangements of elements, and averaging in the Darwin-Fowler method. Significant applications to solids, radiation and electrons in metals are treated in separate chapters, as well as Bose-Eins...

  13. Elementary statistical physics

    CERN Document Server

    Kittel, C

    1965-01-01

    This book is intended to help physics students attain a modest working knowledge of several areas of statistical mechanics, including stochastic processes and transport theory. The areas discussed are among those forming a useful part of the intellectual background of a physicist.

  14. A modern course in statistical physics

    CERN Document Server

    Reichl, Linda E

    2016-01-01

    "A Modern Course in Statistical Physics" is a textbook that illustrates the foundations of equilibrium and non-equilibrium statistical physics, and the universal nature of thermodynamic processes, from the point of view of contemporary research problems. The book treats such diverse topics as the microscopic theory of critical phenomena, superfluid dynamics, quantum conductance, light scattering, transport processes, and dissipative structures, all in the framework of the foundations of statistical physics and thermodynamics. It shows the quantum origins of problems in classical statistical physics. One focus of the book is fluctuations that occur due to the discrete nature of matter, a topic of growing importance for nanometer scale physics and biophysics. Another focus concerns classical and quantum phase transitions, in both monatomic and mixed particle systems. This fourth edition extends the range of topics considered to include, for example, entropic forces, electrochemical processes in biological syste...

  15. Limiting processes in non-equilibrium classical statistical mechanics

    International Nuclear Information System (INIS)

    Jancel, R.

    1983-01-01

After a recall of the basic principles of statistical mechanics and the results of ergodic theory, the passage to the thermodynamic limit and its link with transport theory near equilibrium are analyzed. The fundamental problems posed by the description of non-equilibrium macroscopic systems are investigated and the kinetic methods are stated. The problems of non-equilibrium statistical mechanics are analyzed: irreversibility and coarse-graining, macroscopic variables and kinetic description, autonomous reduced descriptions, limit processes, BBGKY hierarchy, limit theorems. [fr]

  16. Non-equilibrium thermodynamics and physical kinetics

    CERN Document Server

    Bikkin, Halid

    2014-01-01

    This graduate textbook covers contemporary directions of non-equilibrium statistical mechanics as well as classical methods of kinetics. With one of the main propositions being to avoid terms such as "obviously" and "it is easy to show", this treatise is an easy-to-read introduction into this traditional, yet vibrant field.

  17. Equilibrium statistical mechanics on correlated random graphs

    Science.gov (United States)

    Barra, Adriano; Agliari, Elena

    2011-02-01

Biological and social networks have recently attracted great attention from physicists. Among several aspects, two main ones may be stressed: a non-trivial topology of the graph describing the mutual interactions between agents and, typically, imitative, weighted, interactions. Despite such aspects being widely accepted and empirically confirmed, the schemes currently exploited in order to generate the expected topology are based on a priori assumptions and, in most cases, implement constant intensities for links. Here we propose a simple shift [-1,+1] → [0,+1] in the definition of patterns in a Hopfield model: a straightforward effect is the conversion of frustration into dilution. In fact, we show that by varying the bias of the pattern distribution, the network topology (generated by the reciprocal affinities among agents, i.e. the Hebbian rule) crosses various well-known regimes, ranging from fully connected, to an extreme dilution scenario, then to completely disconnected. These features, as well as small-world properties, are, in this context, emergent and no longer imposed a priori. The model is also investigated throughout from a thermodynamic perspective: the Ising model defined on the resulting graph is analytically solved (at a replica symmetric level) by extending the double stochastic stability technique, and presented together with its fluctuation theory for a picture of criticality. Overall, our findings show that, at least at equilibrium, dilution (of whatever kind) simply decreases the strength of the coupling felt by the spins, but leaves the paramagnetic/ferromagnetic flavors unchanged. The main difference with respect to previous investigations is that, within our approach, replicas do not appear: instead of (multi)-overlaps as order parameters, we introduce a class of magnetizations on all the possible subgraphs belonging to the main one investigated: as a consequence, for these objects a closure for a self-consistent relation is achieved.

  18. Equilibrium statistical mechanics on correlated random graphs

    International Nuclear Information System (INIS)

    Barra, Adriano; Agliari, Elena

    2011-01-01

    Biological and social networks have recently attracted great attention from physicists. Among several aspects, two main ones may be stressed: a non-trivial topology of the graph describing the mutual interactions between agents and, typically, imitative, weighted interactions. Although such aspects are widely accepted and empirically confirmed, the schemes currently exploited to generate the expected topology are based on a priori assumptions and, in most cases, implement constant intensities for links. Here we propose a simple shift [-1,+1]→[0,+1] in the definition of patterns in a Hopfield model: a straightforward effect is the conversion of frustration into dilution. In fact, we show that by varying the bias of the pattern distribution, the network topology (generated by the reciprocal affinities among agents, i.e. the Hebbian rule) crosses various well-known regimes, ranging from fully connected, through an extreme dilution scenario, to completely disconnected. These features, as well as small-world properties, are, in this context, emergent and no longer imposed a priori. The model is also investigated from a thermodynamic perspective: the Ising model defined on the resulting graph is analytically solved (at the replica-symmetric level) by extending the double stochastic stability technique, and presented together with its fluctuation theory for a picture of criticality. Overall, our findings show that, at least at equilibrium, dilution (of whatever kind) simply decreases the strength of the coupling felt by the spins, but leaves the paramagnetic/ferromagnetic flavors unchanged. The main difference with respect to previous investigations is that, within our approach, replicas do not appear: instead of (multi-)overlaps as order parameters, we introduce a class of magnetizations defined on all the possible subgraphs of the main one investigated; as a consequence, a closed self-consistent relation is achieved for these objects.
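
The pattern shift described in this abstract can be sketched numerically. The snippet below is a minimal illustration, not the authors' code: with patterns drawn from {0, 1}, every Hebbian coupling is non-negative (no frustration), and a link is present only when two agents share an active pattern, so the bias p of the pattern distribution tunes the dilution of the graph. The model sizes and bias values are illustrative choices.

```python
import random

def hebbian_couplings(n_agents, n_patterns, p, seed=0):
    """Hebbian couplings J_ij = sum_mu xi_i^mu * xi_j^mu for binary
    patterns xi in {0, 1} (the shifted convention of the abstract).

    Every J_ij is non-negative, so frustration is impossible; a link
    between i and j exists only if they share at least one active
    pattern, so the bias p of the pattern distribution tunes the
    dilution of the resulting graph.
    """
    rng = random.Random(seed)
    xi = [[1 if rng.random() < p else 0 for _ in range(n_patterns)]
          for _ in range(n_agents)]
    couplings = {}
    for i in range(n_agents):
        for j in range(i + 1, n_agents):
            w = sum(xi[i][mu] * xi[j][mu] for mu in range(n_patterns))
            if w > 0:
                couplings[(i, j)] = w
    return couplings

n = 60
max_links = n * (n - 1) // 2
sparse = hebbian_couplings(n, n_patterns=5, p=0.05)  # near-disconnected regime
dense = hebbian_couplings(n, n_patterns=5, p=0.9)    # near-fully-connected regime
print(len(sparse) / max_links, len(dense) / max_links)
```

Sweeping p between these extremes traverses the dilution regimes the abstract describes.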

  19. Statistical mechanics out of equilibrium: the irreversibility

    International Nuclear Information System (INIS)

    Alvarez Estrada, R. F.

    2001-01-01

    A Round Table on the issue of irreversibility and related matters took place during the last (20th) Statistical Mechanics Conference, held in Paris (July 1998). This article tries to provide a view (necessarily limited and, hence, incomplete) of some approaches to the subject: the one based upon deterministic chaos (currently an area of very active research) and the classical interpretation due to Boltzmann. An attempt has been made to write this article in a self-contained way, and to avoid a technical presentation wherever possible. (Author) 29 refs.

  20. Evolution and non-equilibrium physics

    DEFF Research Database (Denmark)

    Becker, Nikolaj; Sibani, Paolo

    2014-01-01

    We argue that the stochastic dynamics of interacting agents which replicate, mutate and die constitutes a non-equilibrium physical process akin to aging in complex materials. Specifically, our study uses extensive computer simulations of the Tangled Nature Model (TNM) of biological evolution...

  1. Statistical physics of vaccination

    Science.gov (United States)

    Wang, Zhen; Bauch, Chris T.; Bhattacharyya, Samit; d'Onofrio, Alberto; Manfredi, Piero; Perc, Matjaž; Perra, Nicola; Salathé, Marcel; Zhao, Dawei

    2016-12-01

    Historically, infectious diseases caused considerable damage to human societies, and they continue to do so today. To help reduce their impact, mathematical models of disease transmission have been studied to help understand disease dynamics and inform prevention strategies. Vaccination, one of the most important preventive measures of modern times, is of great interest both theoretically and empirically. And in contrast to traditional approaches, recent research increasingly explores the pivotal implications of individual behavior and heterogeneous contact patterns in populations. Our report reviews the developmental arc of theoretical epidemiology with emphasis on vaccination, as it led from classical models assuming homogeneously mixing (mean-field) populations and ignoring human behavior, to recent models that account for behavioral feedback and/or population spatial/social structure. Many of the methods used originated in statistical physics, such as lattice and network models, and their associated analytical frameworks. Similarly, the feedback loop between vaccinating behavior and disease propagation forms a coupled nonlinear system with analogs in physics. We also review the new paradigm of digital epidemiology, wherein sources of digital data such as online social media are mined for high-resolution information on epidemiologically relevant individual behavior. Armed with the tools and concepts of statistical physics, and further assisted by new sources of digital data, models that capture nonlinear interactions between behavior and disease dynamics offer a novel way of modeling real-world phenomena, and can help improve health outcomes. We conclude the review by discussing open problems in the field and promising directions for future research.
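
The mean-field starting point of the theory reviewed here can be made concrete with a standard textbook relation, not taken from this review's own models: for a homogeneously mixing SIR population with a vaccinated fraction, the classical final-size equation shows how coverage above the herd-immunity threshold p_c = 1 - 1/R0 suppresses the outbreak. All parameter values below are illustrative.

```python
import math

def final_epidemic_size(r0, coverage, steps=10000):
    """Iterate the classical SIR final-size relation for a homogeneously
    mixing population in which a fraction `coverage` is vaccinated:
        z = (1 - coverage) * (1 - exp(-r0 * z)),
    where z is the fraction of the population ultimately infected."""
    z = 1.0 - coverage  # start the fixed-point iteration from the worst case
    for _ in range(steps):
        z = (1.0 - coverage) * (1.0 - math.exp(-r0 * z))
    return z

r0 = 3.0
p_c = 1.0 - 1.0 / r0                              # herd-immunity threshold, 2/3 for R0 = 3
below = final_epidemic_size(r0, coverage=0.4)     # under-vaccinated: sizeable outbreak
above = final_epidemic_size(r0, coverage=0.8)     # above threshold: outbreak dies out
print(p_c, below, above)
```

Behavioral-feedback and network models, the subject of the review, modify exactly this baseline.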

  2. SRB states and nonequilibrium statistical mechanics close to equilibrium

    OpenAIRE

    Gallavotti, Giovanni; Ruelle, David

    1996-01-01

    Nonequilibrium statistical mechanics close to equilibrium is studied using SRB states and a formula for their derivatives with respect to parameters. We write general expressions for the thermodynamic fluxes (or currents) and the transport coefficients, generalizing previous results. In this framework we give a general proof of the Onsager reciprocity relations.

  3. Statistical equilibrium and symplectic geometry in general relativity

    International Nuclear Information System (INIS)

    Iglesias, P.

    1981-09-01

    A geometrical construction is given of the statistical equilibrium states of a system of particles in a gravitational field in general relativity. By a method of localization variables, the expression of the thermodynamic values is given, and this description is shown to be compatible with a macroscopic model of a relativistic continuous medium for a given value of the free-energy function.

  4. Statistical Physics of Colloidal Dispersions.

    Science.gov (United States)

    Canessa, E.

    Available from UMI in association with The British Library. Requires signed TDF. This thesis is concerned with the equilibrium statistical mechanics of colloidal dispersions, which represent useful model systems for the study of condensed matter physics; namely, charge-stabilized colloidal dispersions and polymer-stabilized colloidal dispersions. A one-component macroparticle approach is adopted in order to treat the macroscopic and microscopic properties of these systems in a simple and comprehensive manner. The thesis opens with a description of the nature of the colloidal state before reviewing some basic definitions and theory in Chapter II. In Chapter III a variational theory of phase equilibria based on the Gibbs-Bogolyubov inequality is applied to sterically stabilized colloidal dispersions. Hard spheres are chosen as the reference system for the disordered phases, while an Einstein model is used for the ordered phases. The new choice of pair potential, taken for mathematical convenience, is a superposition of two Yukawa functions. By matching a double Yukawa potential to the van der Waals attractive potential at different temperatures and introducing a purely temperature-dependent coefficient to the repulsive part, a rich variety of observed phase separation phenomena is qualitatively described. The behaviour of the potential is found to be consistent with a small decrease of the polymer layer thickness with increasing temperature. Using the same concept of a collapse transition, the non-monotonic second virial coefficient is also explained and quantified. It is shown that a reduction of the effective macroparticle diameter with increasing temperature can only be partially examined from the point of view of a (binary-)polymer solution theory. This chapter concludes with a description of the observed, reversible, depletion flocculation behaviour. This is accomplished by using the variational formalism and by invoking the double Yukawa potential to allow...
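
The double Yukawa pair potential mentioned in the abstract is easy to sketch. The functional form below is a common parametrization and all parameter values are illustrative, not the thesis's fitted values: a short-ranged repulsive Yukawa term plus a longer-ranged attractive one give a repulsive core and an attractive well.

```python
import math

def double_yukawa(r, A=2.0, a=10.0, B=1.0, b=2.0, sigma=1.0, eps=1.0):
    """Double Yukawa pair potential (illustrative parameters):
        V(r) = eps * [A*exp(-a*(r/sigma - 1)) - B*exp(-b*(r/sigma - 1))] / (r/sigma).
    The first (fast-decaying) term is repulsive, the second attractive."""
    x = r / sigma
    return eps * (A * math.exp(-a * (x - 1.0)) - B * math.exp(-b * (x - 1.0))) / x

# Repulsive at contact (r = sigma), attractive well at larger separations.
v_contact = double_yukawa(1.0)  # A - B = 1.0 > 0: repulsive
v_well = min(double_yukawa(1.0 + 0.01 * k) for k in range(1, 300))
print(v_contact, v_well)
```

Varying the prefactors with temperature, as the thesis does, reshapes the well and drives the phase behaviour described above.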

  5. Introduction to statistical physics

    CERN Document Server

    Huang, Kerson

    2009-01-01

    A Macroscopic View of Matter; Viewing the World at Different Scales; Thermodynamics; The Thermodynamic Limit; Thermodynamic Transformations; Classic Ideal Gas; First Law of Thermodynamics; Magnetic Systems; Heat and Entropy; The Heat Equations; Applications to Ideal Gas; Carnot Cycle; Second Law of Thermodynamics; Absolute Temperature; Temperature as Integrating Factor; Entropy; Entropy of Ideal Gas; The Limits of Thermodynamics; Using Thermodynamics; The Energy Equation; Some Measurable Coefficients; Entropy and Loss; TS Diagram; Condition for Equilibrium; Helmholtz Free Energy; Gibbs Potential; Maxwell Relations; Chemical Pote...
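
Two of the standard results listed in this table of contents (Carnot efficiency and the entropy of an ideal gas) can be illustrated with a short calculation; the numbers below are illustrative, not taken from the book.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def carnot_efficiency(t_hot, t_cold):
    """Maximum efficiency of a heat engine between two reservoirs
    (second law): eta = 1 - T_cold / T_hot."""
    return 1.0 - t_cold / t_hot

def isothermal_entropy_change(n_mol, v1, v2):
    """Entropy change of an ideal gas expanding isothermally from V1 to V2:
    dS = n R ln(V2 / V1)."""
    return n_mol * R * math.log(v2 / v1)

eta = carnot_efficiency(500.0, 300.0)          # 0.4 for these reservoirs
ds = isothermal_entropy_change(1.0, 1.0, 2.0)  # positive for an expansion
print(eta, ds)
```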

  6. Planar-channeling spatial density under statistical equilibrium

    International Nuclear Information System (INIS)

    Ellison, J.A.; Picraux, S.T.

    1978-01-01

    The phase-space density for planar channeled particles has been derived for the continuum model under statistical equilibrium. This is used to obtain the particle spatial probability density as a function of incident angle. The spatial density is shown to depend on only two parameters, a normalized incident angle and a normalized planar spacing. This normalization is used to obtain, by numerical calculation, a set of universal curves for the spatial density and also for the channeled-particle wavelength as a function of amplitude. Using these universal curves, the statistical-equilibrium spatial density and the channeled-particle wavelength can be easily obtained for any case for which the continuum model can be applied. Also, a new one-parameter analytic approximation to the spatial density is developed. This parabolic approximation is shown to give excellent agreement with the exact calculations.
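
The continuum-model oscillation underlying these universal curves can be sketched in the simplest (harmonic) approximation; parameters and units below are hypothetical, not the paper's. For a harmonic planar potential the transverse equation of motion pv x''(z) = -k x gives an amplitude-independent wavelength lambda = 2*pi*sqrt(pv/k); it is the anharmonicity of the real continuum potential that makes the wavelength depend on amplitude, which is what the paper's curves capture.

```python
import math

def channel_wavelength(pv, k):
    """Wavelength of the transverse oscillation in a harmonic continuum
    planar potential U(x) = k x^2 / 2 (hypothetical parameters):
    pv * x''(z) = -k x  =>  lambda = 2*pi*sqrt(pv / k)."""
    return 2.0 * math.pi * math.sqrt(pv / k)

def measured_wavelength(pv, k, x0, dz=1e-4, n=400000):
    """Integrate pv*x'' = -k*x (semi-implicit Euler) and measure the
    distance in z between two successive downward zero crossings."""
    x, v = x0, 0.0
    crossings = []
    for i in range(n):
        v += (-k * x / pv) * dz
        x_new = x + v * dz
        if x > 0.0 >= x_new:  # downward zero crossing
            crossings.append(i * dz)
            if len(crossings) == 2:
                break
        x = x_new
    return crossings[1] - crossings[0]

lam_exact = channel_wavelength(pv=1.0, k=4.0)   # equals pi for these values
lam_num = measured_wavelength(1.0, 4.0, x0=0.1)
print(lam_exact, lam_num)
```

Replacing the harmonic potential with an anharmonic one in `measured_wavelength` reproduces the amplitude dependence qualitatively.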

  7. Nonequilibrium statistical physics

    CERN Document Server

    Röpke, Gerd

    2013-01-01

    Authored by one of the top theoretical physicists in Germany, and a well-known authority in the field, this is the only coherent presentation of the subject suitable for masters and PhD students, as well as postdocs in physics and related disciplines. Starting from a general discussion of the nonequilibrium state, different standard approaches such as master equations, and kinetic and linear response theory, are derived under special assumptions. This allows for an insight into the problems of nonequilibrium physics, a discussion of the limits, and suggestions for improvements. Applications...

  8. Statistical physics and condensed matter

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    This document is divided into 4 sections: 1) General aspects of statistical physics. The themes include: possible geometrical structures of thermodynamics, the thermodynamical foundation of quantum measurement, transport phenomena (kinetic theory, hydrodynamics and turbulence) and out-of-equilibrium systems (stochastic dynamics and turbulence). The techniques involved here are typical of applied analysis: stability criteria, mode decomposition, shocks and stochastic equations. 2) Disordered, glassy and granular systems: statics and dynamics. The complexity of these systems can be studied through the structure of their phase space. The geometry of this phase space is studied in several works: the overlap distribution can now be computed with very high precision; the boundary energy between low-lying states does not behave as in ordinary systems; and the Edwards hypothesis of equi-probability of low-lying metastable states is invalidated. The phenomenon of aging, characteristic of glassy dynamics, is studied in several models. The dynamics of biological systems or of fracture is shown to bear some resemblance to that of disordered systems. 3) Quantum systems. The themes include: mesoscopic superconductors, the supersymmetric approach to strongly correlated electrons, quantum criticality and heavy-fermion compounds, optical sum-rule violation in the cuprates, heat capacity of lattice spin models from high-temperature series expansion, the Lieb-Schultz-Mattis theorem in dimension larger than one, the quantum Hall effect, Bose-Einstein condensation and the multiple-spin exchange model on the triangular lattice. 4) Soft condensed matter and biological systems. Path-integral representations are invaluable for describing polymers, proteins and self-avoiding membranes. Using these methods, problems as diverse as the titration of a weak poly-acid by a strong base, the denaturation transition of DNA and bridge-hopping in conducting polymers have been addressed. The problems of RNA folding...

  9. Statistical physics and condensed matter

    International Nuclear Information System (INIS)

    2003-01-01

    This document is divided into 4 sections: 1) General aspects of statistical physics. The themes include: possible geometrical structures of thermodynamics, the thermodynamical foundation of quantum measurement, transport phenomena (kinetic theory, hydrodynamics and turbulence) and out-of-equilibrium systems (stochastic dynamics and turbulence). The techniques involved here are typical of applied analysis: stability criteria, mode decomposition, shocks and stochastic equations. 2) Disordered, glassy and granular systems: statics and dynamics. The complexity of these systems can be studied through the structure of their phase space. The geometry of this phase space is studied in several works: the overlap distribution can now be computed with very high precision; the boundary energy between low-lying states does not behave as in ordinary systems; and the Edwards hypothesis of equi-probability of low-lying metastable states is invalidated. The phenomenon of aging, characteristic of glassy dynamics, is studied in several models. The dynamics of biological systems or of fracture is shown to bear some resemblance to that of disordered systems. 3) Quantum systems. The themes include: mesoscopic superconductors, the supersymmetric approach to strongly correlated electrons, quantum criticality and heavy-fermion compounds, optical sum-rule violation in the cuprates, heat capacity of lattice spin models from high-temperature series expansion, the Lieb-Schultz-Mattis theorem in dimension larger than one, the quantum Hall effect, Bose-Einstein condensation and the multiple-spin exchange model on the triangular lattice. 4) Soft condensed matter and biological systems. Path-integral representations are invaluable for describing polymers, proteins and self-avoiding membranes. Using these methods, problems as diverse as the titration of a weak poly-acid by a strong base, the denaturation transition of DNA and bridge-hopping in conducting polymers have been addressed. The problem of RNA folding has...

  10. Thermal equilibrium and statistical thermometers in special relativity.

    Science.gov (United States)

    Cubero, David; Casado-Pascual, Jesús; Dunkel, Jörn; Talkner, Peter; Hänggi, Peter

    2007-10-26

    There is an intense debate in the recent literature about the correct generalization of Maxwell's velocity distribution in special relativity. The most frequently discussed candidate distributions include the Jüttner function as well as modifications thereof. Here we report results from fully relativistic one-dimensional molecular dynamics simulations that resolve the ambiguity. The numerical evidence unequivocally favors the Jüttner distribution. Moreover, our simulations illustrate that the concept of "thermal equilibrium" extends naturally to special relativity only if a many-particle system is spatially confined. They make evident that "temperature" can be statistically defined and measured in an observer frame independent way.
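
The distinction this paper resolves is easy to visualize numerically. The sketch below is illustrative, not the authors' molecular dynamics: in units m = c = k_B = 1, the one-dimensional Juettner distribution f(p) ∝ exp(-gamma(p)/theta) with gamma(p) = sqrt(1 + p^2) is compared with the non-relativistic Maxwell form, with normalization done by simple trapezoidal quadrature.

```python
import math

def juttner_1d(p, theta=1.0):
    """Unnormalized 1D Juettner distribution in units m = c = k_B = 1:
    f(p) ∝ exp(-gamma(p) / theta), gamma(p) = sqrt(1 + p^2)."""
    return math.exp(-math.sqrt(1.0 + p * p) / theta)

def maxwell_1d(p, theta=1.0):
    """Unnormalized non-relativistic Maxwell distribution, f(p) ∝ exp(-p^2 / (2 theta))."""
    return math.exp(-p * p / (2.0 * theta))

def normalize(f, theta, pmax=60.0, n=200001):
    """Return a normalized version of f via trapezoidal quadrature on [-pmax, pmax]."""
    dp = 2.0 * pmax / (n - 1)
    vals = [f(-pmax + k * dp, theta) for k in range(n)]
    z = (sum(vals) - 0.5 * (vals[0] + vals[-1])) * dp
    return lambda p: f(p, theta) / z

theta = 1.0  # k_B T / (m c^2) = 1: strongly relativistic regime
fj = normalize(juttner_1d, theta)
fm = normalize(maxwell_1d, theta)
# The Juettner distribution decays only exponentially in |p|, so its
# tails are much heavier than the Gaussian Maxwell tails.
print(fj(5.0), fm(5.0))
```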

  11. Quantum physics and statistical physics. 5. ed.

    International Nuclear Information System (INIS)

    Alonso, Marcelo; Finn, Edward J.

    2012-01-01

    Through its logical and uniform presentation, this recognized introduction to modern physics treats both experimental and theoretical aspects. The first part of the book deals with quantum mechanics and its applications to atoms, molecules, nuclei, solids, and elementary particles. Statistical physics, comprising classical statistics, thermodynamics, and quantum statistics, is the theme of the second part. Alonso and Finn avoid complicated mathematical developments; with numerous sketches and diagrams, as well as many problems and examples, they familiarize the reader early and easily with the concepts of modern physics.

  12. Statistical methods in radiation physics

    CERN Document Server

    Turner, James E; Bogard, James S

    2012-01-01

    This statistics textbook, with particular emphasis on radiation protection and dosimetry, deals with statistical solutions to problems inherent in health physics measurements and decision making. The authors begin with a description of our current understanding of the statistical nature of physical processes at the atomic level, including radioactive decay and interactions of radiation with matter. Examples are taken from problems encountered in health physics, and the material is presented such that health physicists and most other nuclear professionals will more readily understand the application of statistical principles in the familiar context of the examples. Problems are presented at the end of each chapter, with solutions to selected problems provided online. In addition, numerous worked examples are included throughout the text.

  13. From statistical mechanics out of equilibrium to transport equations

    International Nuclear Information System (INIS)

    Balian, R.

    1995-01-01

    These lecture notes give a synthetic view of the foundations of non-equilibrium statistical mechanics. The purpose is to establish the transport equations satisfied by the relevant variables, starting from the microscopic dynamics. The Liouville representation is introduced, and a projection associates, with any density operator and for a given choice of relevant observables, a reduced density operator. An exact integro-differential equation for the relevant variables is thereby derived. A short-memory approximation then yields the transport equations. A relevant entropy, which characterizes the coarseness of the description, is associated with each level of description. As an illustration, the classical gas, with its three levels of description and with the Chapman-Enskog method, is discussed. (author). 3 figs., 5 refs.
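
The structure of this projection scheme can be summarized schematically; the notation below is assumed for illustration and is not copied from the lecture notes. Relevant observables A_i define the reduced density operator, eliminating the irrelevant part yields an exact equation with memory, and the short-memory limit closes it into Markovian transport equations.

```latex
% Relevant variables and reduced density operator (projection P):
a_i(t) = \operatorname{Tr}\,[D(t)\,A_i] = \operatorname{Tr}\,[\bar D(t)\,A_i],
\qquad \bar D = \mathcal P D .

% Exact integro-differential equation after eliminating Q = 1 - P:
\frac{d a_i}{dt}
  = \operatorname{Tr}\!\left[\bar D(t)\,\mathcal L A_i\right]
  + \int_0^t dt'\, K_i\!\left(t, t'; \bar D(t')\right).

% Short-memory approximation: the kernel is sharply peaked at t' = t,
% closing the hierarchy into Markovian transport equations:
\frac{d a_i}{dt} \simeq \operatorname{Tr}\!\left[\bar D(t)\,\mathcal L A_i\right]
  + \mathcal K_i\!\left(\bar D(t)\right).
```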

  14. Phase Equilibrium, Chemical Equilibrium, and a Test of the Third Law: Experiments for Physical Chemistry.

    Science.gov (United States)

    Dannhauser, Walter

    1980-01-01

    Described is an experiment designed to provide an experimental basis for a unifying point of view (utilizing theoretical framework and chemistry laboratory experiments) for physical chemistry students. Three experiments are described: phase equilibrium, chemical equilibrium, and a test of the third law of thermodynamics. (Author/DS)
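
The thermodynamic relations unifying such experiments, the link between the equilibrium constant and the standard Gibbs energy, and the van 't Hoff temperature dependence, can be illustrated with a short calculation; the K, T and dH values below are illustrative, not from the article.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def gibbs_from_k(k_eq, t):
    """Standard Gibbs energy from an equilibrium constant: dG0 = -R T ln K."""
    return -R * t * math.log(k_eq)

def vant_hoff_k(k1, t1, t2, dh):
    """van 't Hoff relation, assuming dH0 constant over [t1, t2]:
    ln(K2/K1) = -(dH0 / R) * (1/T2 - 1/T1)."""
    return k1 * math.exp(-(dh / R) * (1.0 / t2 - 1.0 / t1))

dg = gibbs_from_k(10.0, 298.15)                   # negative: products favored
k2 = vant_hoff_k(10.0, 298.15, 310.0, dh=-50e3)   # exothermic: K falls on heating
print(dg, k2)
```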

  15. Brownian quasi-particles in statistical physics

    International Nuclear Information System (INIS)

    Tellez-Arenas, A.; Fronteau, J.; Combis, P.

    1979-01-01

    The idea of a Brownian quasi-particle and the associated differentiable flow (with nonselfadjoint forces) are used here in the context of a stochastic description of the approach towards statistical equilibrium. We show that this quasi-particle flow acquires, at equilibrium, the principal properties of a conservative Hamiltonian flow. Thus the model of Brownian quasi-particles permits us to establish a link between the stochastic description and the Gibbs description of statistical equilibrium
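
A minimal stochastic illustration of approach to statistical equilibrium, in the spirit of (but not taken from) this paper's quasi-particle description, is an Ornstein-Uhlenbeck velocity process: started far from equilibrium, its second moment relaxes to the equipartition value. All parameters are illustrative.

```python
import math
import random

def ou_second_moment(gamma=1.0, temp=1.0, dt=5e-3, steps=2000,
                     n_particles=1000, seed=2):
    """Euler-Maruyama integration of dv = -gamma*v dt + sqrt(2*gamma*T) dW
    for an ensemble of independent particles. Returns the final ensemble
    second moment <v^2>, which relaxes to the equipartition value T."""
    rng = random.Random(seed)
    amp = math.sqrt(2.0 * gamma * temp * dt)
    vs = [5.0] * n_particles  # start far from equilibrium
    for _ in range(steps):
        vs = [v - gamma * v * dt + amp * rng.gauss(0.0, 1.0) for v in vs]
    return sum(v * v for v in vs) / n_particles

m2 = ou_second_moment()
print(m2)  # relaxes to the equipartition value <v^2> = T = 1
```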

  16. Statistical methods for physical science

    CERN Document Server

    Stanford, John L

    1994-01-01

    This volume of Methods of Experimental Physics provides an extensive introduction to probability and statistics in many areas of the physical sciences, with an emphasis on the emerging area of spatial statistics. The scope of topics covered is wide-ranging: the text discusses a variety of the most commonly used classical methods and addresses newer methods that are applicable or potentially important. The chapter authors motivate readers with their insightful discussions, augmenting their material with key features such as an examination of basic probability, including coverage of standard distributions, time s...

  17. Introduction to mathematical statistical physics

    CERN Document Server

    Minlos, R A

    1999-01-01

    This book presents a mathematically rigorous approach to the main ideas and phenomena of statistical physics. The introduction addresses the physical motivation, focussing on the basic concept of modern statistical physics, that is the notion of Gibbsian random fields. Properties of Gibbsian fields are analyzed in two ranges of physical parameters: "regular" (corresponding to high-temperature and low-density regimes) where no phase transition is exhibited, and "singular" (low temperature regimes) where such transitions occur. Next, a detailed approach to the analysis of the phenomena of phase transitions of the first kind, the Pirogov-Sinai theory, is presented. The author discusses this theory in a general way and illustrates it with the example of a lattice gas with three types of particles. The conclusion gives a brief review of recent developments arising from this theory. The volume is written for the beginner, yet advanced students will benefit from it as well. The book will serve nicely as a supplement...

  18. Nonequilibrium statistical physics a modern perspective

    CERN Document Server

    Livi, Roberto

    2017-01-01

    Statistical mechanics has been proven to be successful at describing physical systems at thermodynamic equilibrium. Since most natural phenomena occur in nonequilibrium conditions, the present challenge is to find suitable physical approaches for such conditions: this book provides a pedagogical pathway that explores various perspectives. The use of clear language, and explanatory figures and diagrams to describe models, simulations and experimental findings makes the book a valuable resource for undergraduate and graduate students, and also for lecturers organizing teaching at varying levels of experience in the field. Written in three parts, it covers basic and traditional concepts of nonequilibrium physics, modern aspects concerning nonequilibrium phase transitions, and application-orientated topics from a modern perspective. A broad range of topics is covered, including Langevin equations, Lévy processes, directed percolation, kinetic roughening and pattern formation.

  19. Statistical equilibrium of copper in the solar atmosphere

    International Nuclear Information System (INIS)

    Shi, J. R.; Mashonkina, L.; Zhao, G.; Gehren, T.; Zeng, J. L.

    2014-01-01

    Non-local thermodynamic equilibrium (NLTE) line formation for neutral copper in one-dimensional solar atmospheres is presented for an atomic model including 96 terms of Cu I and the ground state of Cu II. Accurate oscillator strengths for all the line transitions in the model atom, and photoionization cross sections, were calculated using the R-matrix method in the Russell-Saunders coupling scheme. The main NLTE mechanism for Cu I is ultraviolet overionization. We find that NLTE leads to systematically depleted total absorption in the Cu I lines and, accordingly, positive abundance corrections. Inelastic collisions with neutral hydrogen atoms produce minor effects on the statistical equilibrium of Cu I in the solar atmosphere. For the solar Cu I lines, the departures from LTE are found to be small, with a mean NLTE abundance correction of ∼0.01 dex. It was found that the six low-excitation lines, with excitation energy of the lower level E_exc ≤ 1.64 eV, give a 0.14 dex lower mean solar abundance compared to that from the six E_exc > 3.7 eV lines when applying the experimental gf-values of Kock and Richter. Without the two strong resonance transitions, the solar mean NLTE abundance from 10 lines of Cu I is log ε_☉(Cu) = 4.19 ± 0.10, which is consistent within the error bars with the meteoritic value 4.25 ± 0.05 of Lodders et al. The discrepancy between the E_exc = 1.39-1.64 eV and E_exc > 3.7 eV lines can be removed when the calculated gf-values are adopted, and a mean solar abundance of log ε_☉(Cu) = 4.24 ± 0.08 is derived.

  20. Conceptual Integration of Chemical Equilibrium by Prospective Physical Sciences Teachers

    Science.gov (United States)

    Ganaras, Kostas; Dumon, Alain; Larcher, Claudine

    2008-01-01

    This article describes an empirical study concerning the mastering of the chemical equilibrium concept by prospective physical sciences teachers. The main objective was to check whether the concept of chemical equilibrium had become an integrating and unifying concept for them, that is to say an operational and functional knowledge to explain and…

  1. Statistical mechanics and the physics of fluids

    CERN Document Server

    Tosi, Mario

    This volume collects the lecture notes of a course on statistical mechanics, held at Scuola Normale Superiore di Pisa for third-to-fifth year students in physics and chemistry. Three main themes are covered in the book. The first part gives a compact presentation of the foundations of statistical mechanics and their connections with thermodynamics. Applications to ideal gases of material particles and of excitation quanta are followed by a brief introduction to a real classical gas and to a weakly coupled classical plasma, and by a broad overview on the three states of matter. The second part is devoted to fluctuations around equilibrium and their correlations. Coverage of liquid structure and critical phenomena is followed by a discussion of irreversible processes as exemplified by diffusive motions and by the dynamics of density and heat fluctuations. Finally, the third part is an introduction to some advanced themes: supercooling and the glassy state, non-Newtonian fluids including polymers and liquid cryst...

  2. Problems of a Statistical Ensemble Theory for Systems Far from Equilibrium

    Science.gov (United States)

    Ebeling, Werner

    The development of a general statistical physics of nonequilibrium systems was one of the main unfinished tasks of statistical physics of the 20th century. The aim of this work is the study of a special class of nonequilibrium systems for which the formulation of an ensemble theory of some generality is possible. These are the so-called canonical-dissipative systems, where the driving terms are determined by invariants of motion. We construct canonical-dissipative systems which are ergodic on certain surfaces of the phase plane. These systems may be described by a non-equilibrium microcanonical ensemble, corresponding to an equal distribution on the target surface. Next we construct and solve Fokker-Planck equations; this leads to a kind of canonical-dissipative ensemble. In the last part we discuss the theoretical problem of how to define bifurcations in the framework of nonequilibrium statistics, and several possible applications.
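
A textbook member of the class described here is easy to simulate; the equations and parameters below are an illustrative example, not Ebeling's specific systems. For H = (p^2 + q^2)/2 and a dissipative term proportional to (H - E0), the drift vanishes on the surface H = E0, so trajectories settle onto that surface and the stationary ensemble is an equal-weight distribution smeared around it by the noise.

```python
import math
import random

def canonical_dissipative_mean_energy(e0=1.0, gamma=0.5, noise=0.02,
                                      dt=1e-3, steps=200000, seed=1):
    """Euler-Maruyama integration of a canonical-dissipative oscillator:
        dq = p dt,
        dp = [-q - gamma*(H - e0)*p] dt + sqrt(2*noise) dW,
    with H = (p^2 + q^2)/2. The dissipative drift vanishes on the target
    surface H = e0. Returns the time-averaged energy over the second half
    of the run, which should sit near e0."""
    rng = random.Random(seed)
    q, p = 1.5, 0.0
    amp = math.sqrt(2.0 * noise * dt)
    h_sum, n_avg = 0.0, 0
    for step in range(steps):
        h = 0.5 * (p * p + q * q)
        if step >= steps // 2:
            h_sum += h
            n_avg += 1
        q += p * dt
        p += (-q - gamma * (h - e0) * p) * dt + amp * rng.gauss(0.0, 1.0)
    return h_sum / n_avg

h_mean = canonical_dissipative_mean_energy()
print(h_mean)  # settles near the target surface H = e0 = 1.0
```

The corresponding Fokker-Planck equation has the stationary solution P ∝ exp(-(gamma/(2*noise))*(H - e0)^2), the "canonical-dissipative ensemble" of the abstract.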

  3. Statistical methods in physical mapping

    International Nuclear Information System (INIS)

    Nelson, D.O.

    1995-05-01

    One of the great success stories of modern molecular genetics has been the ability of biologists to isolate and characterize the genes responsible for serious inherited diseases like fragile X syndrome, cystic fibrosis and myotonic muscular dystrophy. This dissertation concentrates on constructing high-resolution physical maps. It demonstrates how probabilistic modeling and statistical analysis can aid molecular geneticists in the tasks of planning, execution, and evaluation of physical maps of chromosomes and large chromosomal regions. The dissertation is divided into six chapters. Chapter 1 provides an introduction to the field of physical mapping, describing the role of physical mapping in gene isolation and in past efforts at mapping chromosomal regions. The next two chapters review and extend known results on predicting progress in large mapping projects. Such predictions help project planners decide between various approaches and tactics for mapping large regions of the human genome. Chapter 2 shows how probability models have been used in the past to predict progress in mapping projects. Chapter 3 presents new results, based on stationary point process theory, for progress measures for mapping projects based on directed mapping strategies. Chapter 4 describes in detail the construction of an initial high-resolution physical map for human chromosome 19. This chapter introduces the probability and statistical models involved in map construction in the context of a large, ongoing physical mapping project. Chapter 5 concentrates on one such model, the trinomial model. This chapter contains new results on the large-sample behavior of this model, including distributional results, asymptotic moments, and detection error rates. In addition, it contains an optimality result concerning experimental procedures based on the trinomial model. The last chapter explores unsolved problems and describes future work.
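
The kind of progress prediction discussed in Chapters 2 and 3 can be illustrated with the classical Lander-Waterman-style expectations for a random-clone project; this is a standard textbook model with illustrative numbers, not necessarily the dissertation's own models.

```python
import math

def mapping_progress(g, l, n):
    """Classical Lander-Waterman-style expectations for a random-clone
    mapping project: genome length g, clone length l, n clones, so the
    redundancy of coverage is c = n*l/g.
    Returns (expected fraction of the genome covered, expected number
    of islands), ignoring the overlap-detection threshold."""
    c = n * l / g
    covered = 1.0 - math.exp(-c)
    islands = n * math.exp(-c)
    return covered, islands

# Adding clones pushes coverage toward 1 while islands merge and decline.
cov1, isl1 = mapping_progress(g=3.0e9, l=4.0e4, n=150000)  # c = 2
cov2, isl2 = mapping_progress(g=3.0e9, l=4.0e4, n=375000)  # c = 5
print(cov1, isl1, cov2, isl2)
```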

  4. Statistical methods in physical mapping

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, David O. [Univ. of California, Berkeley, CA (United States)

    1995-05-01

    One of the great success stories of modern molecular genetics has been the ability of biologists to isolate and characterize the genes responsible for serious inherited diseases like fragile X syndrome, cystic fibrosis and myotonic muscular dystrophy. This dissertation concentrates on constructing high-resolution physical maps. It demonstrates how probabilistic modeling and statistical analysis can aid molecular geneticists in the tasks of planning, execution, and evaluation of physical maps of chromosomes and large chromosomal regions. The dissertation is divided into six chapters. Chapter 1 provides an introduction to the field of physical mapping, describing the role of physical mapping in gene isolation and in past efforts at mapping chromosomal regions. The next two chapters review and extend known results on predicting progress in large mapping projects. Such predictions help project planners decide between various approaches and tactics for mapping large regions of the human genome. Chapter 2 shows how probability models have been used in the past to predict progress in mapping projects. Chapter 3 presents new results, based on stationary point process theory, for progress measures for mapping projects based on directed mapping strategies. Chapter 4 describes in detail the construction of an initial high-resolution physical map for human chromosome 19. This chapter introduces the probability and statistical models involved in map construction in the context of a large, ongoing physical mapping project. Chapter 5 concentrates on one such model, the trinomial model. This chapter contains new results on the large-sample behavior of this model, including distributional results, asymptotic moments, and detection error rates. In addition, it contains an optimality result concerning experimental procedures based on the trinomial model. The last chapter explores unsolved problems and describes future work.

  5. Statistics for High Energy Physics

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    The lectures emphasize the frequentist approach used for the Dark Matter search and for the Higgs search, discovery and measurements of its properties. An emphasis is put on hypothesis testing using the asymptotic formulae formalism and its derivation, and on the derivation of the trial factor formulae in one and two dimensions. Various test statistics and their applications are discussed. Some keywords: Profile Likelihood, Neyman Pearson, Feldman Cousins, Coverage, CLs, Nuisance Parameters Impact, Look Elsewhere Effect... Selected bibliography: G. J. Feldman and R. D. Cousins, "A unified approach to the classical statistical analysis of small signals," Phys. Rev. D 57, 3873 (1998). A. L. Read, "Presentation of search results: the CL(s) technique," J. Phys. G 28, 2693 (2002). G. Cowan, K. Cranmer, E. Gross and O. Vitells, "Asymptotic formulae for likelihood-based tests of new physics," Eur. Phys. J. C 71, 1554 (2011); Erratum: [Eur. Phys. J. C 73...
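The asymptotic formulae cited above (Cowan, Cranmer, Gross, Vitells) reduce, for a single-bin counting experiment with known background, to a closed-form discovery significance. A minimal sketch, assuming observed count n and expected background b (the function name and test values are illustrative, not from the lectures):

```python
import math

def discovery_significance(n_obs, b):
    """Asymptotic discovery significance for a counting experiment with
    known background b: q0 = -2 ln lambda(0) evaluated at the maximum
    likelihood estimate, with Z = sqrt(q0) in the asymptotic limit."""
    if n_obs <= b:
        return 0.0  # no upward fluctuation: no evidence for a signal
    q0 = 2.0 * (n_obs * math.log(n_obs / b) - (n_obs - b))
    return math.sqrt(q0)

# 15 events observed over an expected background of 10:
print(round(discovery_significance(15, 10), 2))  # -> 1.47
```

The same q0 statistic carries over to full profile-likelihood fits with nuisance parameters, which is where the asymptotic formalism discussed in the lectures earns its keep.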

  6. Non-equilibrium statistical theory about microscopic fatigue cracks of metal in magnetic field

    International Nuclear Information System (INIS)

    Zhao-Long, Liu; Hai-Yun, Hu; Tian-You, Fan; Xiu-San, Xing

    2010-01-01

    This paper develops a non-equilibrium statistical fatigue damage theory to study the statistical behaviour of micro-cracks in metals in a magnetic field. The one-dimensional homogeneous crack system is chosen for study. To investigate the effect of a magnetic field on the statistical distribution of micro-cracks in the system, theoretical analyses of the micro-crack evolution equation, the average micro-crack length, the micro-crack density distribution function, and the fatigue fracture probability have been performed. The derived results relate the changes of these quantities to the applied magnetic field and to the magnetic and mechanical properties of the metal. The theory gives a theoretical explanation of the changes in fatigue damage due to magnetic fields observed in experiments, and presents an analytic approach to studying the fatigue damage of metals in a magnetic field. (cross-disciplinary physics and related areas of science and technology)

  7. Thermodynamics and statistical physics. 2. rev. ed.

    International Nuclear Information System (INIS)

    Schnakenberg, J.

    2002-01-01

    This textbook covers the following topics: thermodynamic systems and equilibrium, irreversible thermodynamics, thermodynamic potentials, stability, thermodynamic processes, ideal systems, real gases and phase transformations, magnetic systems and the Landau model, low-temperature thermodynamics, canonical ensembles, statistical theory, quantum statistics, fermions and bosons, kinetic theory, Bose-Einstein condensation, and the photon gas.

  8. Statistical forces from close-to-equilibrium media

    Czech Academy of Sciences Publication Activity Database

    Basu, U.; Maes, C.; Netočný, Karel

    2015-01-01

    Roč. 17, Nov (2015), s. 115006 ISSN 1367-2630 Institutional support: RVO:68378271 Keywords : stochastic thermodynamics * nonequilibrium steady states * active matter Subject RIV: BE - Theoretical Physics Impact factor: 3.570, year: 2015

  9. Thermodynamic formalism the mathematical structures of equilibrium statistical mechanics

    CERN Document Server

    Ruelle, David

    2004-01-01

    Reissued in the Cambridge Mathematical Library, this classic book outlines the theory of thermodynamic formalism which was developed to describe the properties of certain physical systems consisting of a large number of subunits. Background material on physics has been collected in appendices to help the reader. Supplementary work is provided in the form of exercises and problems that were "open" at the original time of writing.

  10. Non-equilibrium statistical thermodynamics of neutron gas in reactor

    International Nuclear Information System (INIS)

    Hayasaka, Hideo

    1977-01-01

    The thermodynamic structures of non-equilibrium steady states of highly rarefied neutron gas in various media are considered for the irreversible processes owing to creative and destructive reactions of neutrons with nuclei of these media and supply from the external sources. Under the so-called clean and cold condition in reactor, the medium is regarded virtually as offering the different chemical potential fields for each subsystem of a steady neutron gas system. The fluctuations around a steady state are considered in a Markovian-Gaussian process. The generalized Einstein relations are derived for stationary neutron gas systems. The forces and flows of neutron gases in a medium are defined upon the general stationary solution of the Fokker-Planck equation. There exist the symmetry of the kinetic coefficients, and the minimum entropy production upon neutron-nuclear reactions. The distribution functions in various media are determined by each corresponding extremum condition under the vanishing of changes of the respective total entropies in the Gibbs equation. (auth.)
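As a schematic illustration of the stationary Fokker-Planck solution the abstract builds on, consider the one-dimensional equation with potential U(x) and constant diffusion coefficient D (notation assumed here, not taken from the paper):

```latex
\partial_t p(x,t) = \partial_x\!\left[U'(x)\,p\right] + D\,\partial_x^2 p ,
\qquad
J_s = -U'(x)\,p_s - D\,\partial_x p_s = 0
\;\Longrightarrow\;
p_s(x) = \frac{1}{Z}\,e^{-U(x)/D}, \quad Z = \int e^{-U(x')/D}\,\mathrm{d}x' .
```

Setting the stationary flux J_s to zero yields the exponential form, and identifying D with mobility times temperature is the content of the Einstein relation that the paper generalizes to stationary neutron gas systems.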

  11. Statistical and thermal physics with computer applications

    CERN Document Server

    Gould, Harvey

    2010-01-01

    This textbook carefully develops the main ideas and techniques of statistical and thermal physics and is intended for upper-level undergraduate courses. The authors each have more than thirty years' experience in teaching, curriculum development, and research in statistical and computational physics. Statistical and Thermal Physics begins with a qualitative discussion of the relation between the macroscopic and microscopic worlds and incorporates computer simulations throughout the book to provide concrete examples of important conceptual ideas. Unlike many contemporary texts on the

  12. Coronal emission-line polarization from the statistical equilibrium of magnetic sublevels. I. Fe XIII

    International Nuclear Information System (INIS)

    House, L.L.

    1977-01-01

    A general formulation for the polarization of coronal emission lines is presented, and the physics is illustrated through application of the formulation to the lines of Fe XIII at 10747 and 10798 A. The goal is to present a foundation for the determination of the orientation of coronal magnetic fields from emission-line polarization measurements. The physics of emission-line polarization is discussed using the statistical equilibrium equations for the magnetic sublevels of a coronal ion. The formulation of these equations, which describe the polarization of the radiation field in terms of Stokes parameters, is presented, and the various rate parameters, both radiative and collisional, are considered. The emission Stokes vector is constructed from the solution of the equilibrium equations for a point in the corona where the magnetic field has an arbitrary orientation. On the basis of a model, a computer code for the calculation of emission-line polarization is briefly described and illustrated with a number of sample calculations for Fe XIII. Calculations are carried out for three-dimensional models that demonstrate the physics of the formation of emission-line polarization and illustrate how the degree of polarization and angle of polarization and their variations over the corona are related to the density and magnetic field structure. The models considered range from simple cases in which the density distribution with height is spherically symmetric and the field is radial or dipolar to a complex case in which both the density and magnetic field distributions are derived from realistic three-dimensional distributions for the 1973 eclipse on the basis of K-coronameter measurements for the density and potential-field extrapolation of surface magnetic fields in the corona

  13. A statistical method for evaluation of the experimental phase equilibrium data of simple clathrate hydrates

    DEFF Research Database (Denmark)

    Eslamimanesh, Ali; Gharagheizi, Farhad; Mohammadi, Amir H.

    2012-01-01

    We, herein, present a statistical method for diagnostics of the outliers in phase equilibrium data (dissociation data) of simple clathrate hydrates. The applied algorithm is performed on the basis of the Leverage mathematical approach, in which the statistical Hat matrix, the Williams Plot, and the residuals of a selected correlation are used. A correlation in exponential form is used to represent/predict the hydrate dissociation pressures for three-phase equilibrium conditions (liquid water/ice-vapor-hydrate). The investigated hydrate formers are methane, ethane, propane, carbon dioxide, nitrogen, and hydrogen sulfide. It is interpreted from the obtained results...
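The Leverage approach named in the abstract rests on the Hat matrix of an ordinary least-squares fit: leverages come from its diagonal, and a Williams plot flags points by leverage and standardized residual. A minimal numerical sketch (the thresholds, toy data, and function name below are illustrative assumptions, not the paper's hydrate correlation):

```python
import numpy as np

def leverage_diagnostics(X, y):
    """Hat-matrix (Leverage) diagnostics for an OLS fit.
    X: (n, p) design matrix (include a column of ones for the intercept).
    Flags points with leverage above the common warning value 3p/n or
    with standardized residual outside +/-3 (Williams plot bounds)."""
    n, p = X.shape
    H = X @ np.linalg.inv(X.T @ X) @ X.T      # Hat matrix; trace(H) = p
    h = np.diag(H)                            # leverages
    resid = y - H @ y                         # OLS residuals
    r = resid / np.sqrt(resid.var(ddof=p))    # standardized residuals
    flagged = (h > 3.0 * p / n) | (np.abs(r) > 3.0)
    return h, r, flagged

# Toy data: a clean linear trend with one corrupted measurement.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * x + 0.01 * rng.standard_normal(20)
y[7] += 1.0                                   # inject an outlier
X = np.column_stack([np.ones_like(x), x])
h, r, flagged = leverage_diagnostics(X, y)
print(flagged[7])  # -> True
```

The corrupted point stands out through its standardized residual rather than its leverage, which is exactly the distinction the Williams plot is designed to display.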

  14. The Non-Equilibrium Statistical Distribution Function for Electrons and Holes in Semiconductor Heterostructures in Steady-State Conditions

    Directory of Open Access Journals (Sweden)

    Krzysztof Józwikowska

    2015-06-01

    The main goal of this work is to determine a statistical non-equilibrium distribution function for electrons and holes in semiconductor heterostructures under steady-state conditions. Based on the postulates of local equilibrium, as well as on the integral form of the weighted Gyarmati's variational principle in the force representation, using an alternative method, we have derived general expressions which have the form of the Fermi–Dirac distribution function with four additional components. The physical interpretation of these components is carried out in this paper. Some numerical results for the non-equilibrium distribution function of electrons in HgCdTe structures are also presented.
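For reference, the equilibrium Fermi-Dirac function that the derived distribution extends can be sketched directly; a minimal example (the energy, chemical potential, and kT values below are illustrative, not taken from the paper):

```python
import math

def fermi_dirac(E, mu, kT):
    """Equilibrium Fermi-Dirac occupation f(E) = 1 / (exp((E - mu)/kT) + 1).
    The paper's non-equilibrium result reduces to this form when its four
    additional, gradient-driven components vanish."""
    return 1.0 / (math.exp((E - mu) / kT) + 1.0)

# At the chemical potential the occupation is exactly one half
# (energies in eV, kT = 0.0259 eV at room temperature):
print(fermi_dirac(0.1, 0.1, 0.0259))  # -> 0.5
```

Well above mu the occupation falls off exponentially, and well below it approaches unity, which is the limiting behaviour any steady-state correction must respect.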

  15. Improvement of characteristic statistic algorithm and its application on equilibrium cycle reloading optimization

    International Nuclear Information System (INIS)

    Hu, Y.; Liu, Z.; Shi, X.; Wang, B.

    2006-01-01

    A brief introduction to the characteristic statistic algorithm (CSA), a new global optimization algorithm for solving the problem of PWR in-core fuel management optimization, is given in the paper. CSA is modified by the adoption of a back-propagation neural network and fast local adjustment. The modified CSA is then applied to PWR equilibrium cycle reloading optimization, and the corresponding optimization code CSA-DYW is developed. CSA-DYW is used to optimize the 18-month equilibrium reloading cycle of the Daya Bay nuclear plant Unit 1 reactor. The results show that CSA-DYW has high efficiency and good global performance on PWR equilibrium cycle reloading optimization. (authors)

  16. Statistical physics of human beings in games: Controlled experiments

    International Nuclear Information System (INIS)

    Liang Yuan; Huang Ji-Ping

    2014-01-01

    It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems. (topical review - statistical physics and complex systems)

  17. Vol. 3: Statistical Physics and Phase Transitions

    International Nuclear Information System (INIS)

    Sitenko, A.

    1993-01-01

    Problems of modern physics and the situation with physical research in Ukraine are considered. Programme of the conference includes scientific and general problems. Its proceedings are published in 6 volumes. The papers presented in this volume refer to statistical physics and phase transition theory

  18. A statistical investigation of the effects of edge localized modes on the equilibrium reconstruction in JET

    International Nuclear Information System (INIS)

    Murari, A; Peluso, E; Gaudio, P; Gelfusa, M; Maviglia, F; Hawkes, N

    2012-01-01

    The configuration of magnetic fields is an essential ingredient of tokamak physics. In modern day devices, the magnetic topology is normally derived from equilibrium codes, which solve the Grad–Shafranov equation with constraints imposed by the available measurements. On JET, the main code used for this purpose is EFIT and the more commonly used diagnostics are external pick-up coils. Both the code and the measurements present worse performance during edge localized modes (ELMs). To quantify this aspect, various statistical indicators, based on the values of the residuals and their probability distribution, are defined and calculated. They all show that the quality of EFIT reconstructions is clearly better in the absence of ELMs. To investigate the possible causes of the detrimental effects of ELMs on the reconstruction, the pick-up coils are characterized individually and both the spatial distribution and time behaviour of their residuals are analysed in detail. The coils with a faster time response are the ones reproduced less well by EFIT. The constraints of current and pressure at the separatrix are also varied but the effects of such modifications do not result in decisive improvements in the quality of the reconstructions. The interpretation of this experimental evidence is not absolutely compelling but strongly indicative of deficiencies in the physics model on which the JET reconstruction code is based. (paper)

  19. Reconstructing Macroeconomics Based on Statistical Physics

    Science.gov (United States)

    Aoki, Masanao; Yoshikawa, Hiroshi

    We believe that the time has come to integrate the new approach based on statistical physics, or econophysics, into macroeconomics. Toward this goal, there must be more dialogue between physicists and economists. In this paper, we argue that there is no reason why the methods of statistical physics, so successful in many fields of natural science, cannot be usefully applied to macroeconomics, which is meant to analyze the macroeconomy comprising a large number of economic agents. It is, in fact, weird to regard the macroeconomy as a homothetic enlargement of the representative micro agent. We trust the bright future of the new approach to macroeconomics based on statistical physics.

  20. Understanding search trees via statistical physics

    Indian Academy of Sciences (India)

    We study the m-ary search tree model (where m stands for the number of branches of the search tree), an important problem for data storage in computer science, using a variety of statistical physics techniques that allow us to obtain exact asymptotic results.

  1. Statistical physics including applications to condensed matter

    CERN Document Server

    Hermann, Claudine

    2005-01-01

    Statistical Physics bridges the properties of a macroscopic system and the microscopic behavior of its constituting particles, otherwise impossible due to the giant magnitude of Avogadro's number. Numerous systems of today's key technologies -- as e.g. semiconductors or lasers -- are macroscopic quantum objects; only statistical physics allows for understanding their fundamentals. Therefore, this graduate text also focuses on particular applications such as the properties of electrons in solids with applications, and radiation thermodynamics and the greenhouse effect.

  2. Science Academies' Refresher Course in Statistical Physics

    Indian Academy of Sciences (India)

    The Course is aimed at college teachers of statistical physics at BSc/MSc level. ... teachers, with at least a masters degree in Physics/Mathematics/Engineering are ... Topics: There will be six courses dealing with basic principles and general ...

  3. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  4. Statistics for Physical Sciences An Introduction

    CERN Document Server

    Martin, Brian

    2012-01-01

    Statistical Methods for the Physical Sciences is an informal, relatively short, but systematic, guide to the more commonly used ideas and techniques in statistical analysis, as used in physical sciences, together with explanations of their origins. It steers a path between the extremes of a recipe of methods with a collection of useful formulas, and a full mathematical account of statistics, while at the same time developing the subject in a logical way. The book can be read in its entirety by anyone with a basic exposure to mathematics at the level of a first-year undergraduate student

  5. Statistical physics of human beings in games: Controlled experiments

    Science.gov (United States)

    Liang, Yuan; Huang, Ji-Ping

    2014-07-01

    It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems.

  6. Statistical analysis of the equilibrium configurations of the W7-X stellarator

    Energy Technology Data Exchange (ETDEWEB)

    Sengupta, A [Max-Planck-Institut fuer Plasmaphysik, Euratom Association, Greifswald (Germany); Geiger, J [Max-Planck-Institut fuer Plasmaphysik, Euratom Association, Greifswald (Germany); Mc Carthy, P J [Department of Physics, University College Cork, Association EURATOM-DCU, Cork (Ireland)

    2007-05-15

    Equilibrium magnetic configurations of W7-X stellarator plasma were analysed in this study. The statistical method of function parametrization was used to recover the physical properties of the magnetic configurations, such as the flux surface geometry, the magnetic field and the iota profile from simulated experimental data. The study was carried out with a net toroidal current. Idealized 'measurements' were first used to recover the configuration. These 'measurements' were then perturbed with noise and the effect of this perturbation on the recovered configuration parameters was estimated. The noise was scanned over a range large enough to encompass that expected in the actual experiment. In the process, it was possible to ascertain the limit of tolerable noise that can be allowed in the inputs so as not to significantly perturb the outputs recovered with noiseless 'measurements'. Generally, a cubic polynomial model was found to be necessary for noise levels below 10%. For higher noise levels, a quadratic polynomial performed as well as the cubic. The noise level of 10% was also the approximate limit up to which the recovery with ideal measurements was generally reproduced. For the flux geometry recovery, however, the quadratic model performed similarly to the cubic for any value of noise, with the latter model proving to be significantly better only for the noiseless case. Also, with noisy predictors the recovery error for the flux surfaces increases linearly with effective radius from the plasma core up to the edge.

  7. Statistical thermodynamics of association colloids : the equilibrium structure of micelles, vesicles, and bilayer membranes

    NARCIS (Netherlands)

    Leermakers, F.A.M.

    1988-01-01

    The aim of the present study was to unravel the general equilibrium physical properties of lipid bilayer membranes. We consider four major questions:
    1. What determines the morphology of the association colloids (micelles, membranes, vesicles) in general?
    2. Do the

  8. Equilibrium statistical mechanics for self-gravitating systems: local ergodicity and extended Boltzmann-Gibbs/White-Narayan statistics

    Science.gov (United States)

    He, Ping

    2012-01-01

    The long-standing puzzle surrounding the statistical mechanics of self-gravitating systems has not yet been solved successfully. We formulate a systematic theoretical framework of entropy-based statistical mechanics for spherically symmetric collisionless self-gravitating systems. We use an approach that is very different from that of the conventional statistical mechanics of short-range interaction systems. We demonstrate that the equilibrium states of self-gravitating systems consist of both mechanical and statistical equilibria, with the former characterized by a series of velocity-moment equations and the latter by statistical equilibrium equations, which should be derived from the entropy principle. The velocity-moment equations of all orders are derived from the steady-state collisionless Boltzmann equation. We point out that the ergodicity is invalid for the whole self-gravitating system, but it can be re-established locally. Based on the local ergodicity, using Fermi-Dirac-like statistics, with the non-degenerate condition and the spatial independence of the local microstates, we rederive the Boltzmann-Gibbs entropy. This is consistent with the validity of the collisionless Boltzmann equation, and should be the correct entropy form for collisionless self-gravitating systems. Apart from the usual constraints of mass and energy conservation, we demonstrate that the series of moment or virialization equations must be included as additional constraints on the entropy functional when performing the variational calculus; this is an extension to the original prescription by White & Narayan. Any possible velocity distribution can be produced by the statistical-mechanical approach that we have developed with the extended Boltzmann-Gibbs/White-Narayan statistics. Finally, we discuss the questions of negative specific heat and ensemble inequivalence for self-gravitating systems.
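Schematically, the constrained entropy extremization described above takes the following form (notation assumed here; the functionals C_i stand for the moment or virialization constraints of the extended White & Narayan prescription):

```latex
S[f] = -\int f(\mathbf{x},\mathbf{v}) \ln f(\mathbf{x},\mathbf{v})\,
       \mathrm{d}^3x\,\mathrm{d}^3v ,
\qquad
\delta\Big( S[f] - \alpha M[f] - \beta E[f] - \sum_i \lambda_i C_i[f] \Big) = 0 ,
```

with alpha, beta, and lambda_i the Lagrange multipliers for mass, energy, and the additional moment equations; dropping the C_i terms recovers the original White & Narayan variational problem.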

  9. Statistical physics and computational methods for evolutionary game theory

    CERN Document Server

    Javarone, Marco Alberto

    2018-01-01

    This book presents an introduction to Evolutionary Game Theory (EGT) which is an emerging field in the area of complex systems attracting the attention of researchers from disparate scientific communities. EGT allows one to represent and study several complex phenomena, such as the emergence of cooperation in social systems, the role of conformity in shaping the equilibrium of a population, and the dynamics in biological and ecological systems. Since EGT models belong to the area of complex systems, statistical physics constitutes a fundamental ingredient for investigating their behavior. At the same time, the complexity of some EGT models, such as those realized by means of agent-based methods, often require the implementation of numerical simulations. Therefore, beyond providing an introduction to EGT, this book gives a brief overview of the main statistical physics tools (such as phase transitions and the Ising model) and computational strategies for simulating evolutionary games (such as Monte Carlo algor...

  10. Physical phenomena in a low-temperature non-equilibrium plasma and in MHD generators with non-equilibrium conductivity

    International Nuclear Information System (INIS)

    Velikhov, E.P.; Golubev, V.S.; Dykhne, A.M.

    1976-01-01

    The paper assesses the position in 1975 of theoretical and experimental work on the physics of a magnetohydrodynamic generator with non-equilibrium plasma conductivity. This research started at the beginning of the 1960s; as work on the properties of thermally non-equilibrium plasma in magnetic fields and also in MHD generator ducts progressed, a number of phenomena were discovered and investigated that had either been unknown in plasma physics or had remained uninvestigated until that time: ionization instability and ionization turbulence of plasma in a magnetic field, acoustic instability of a plasma with anisotropic conductivity, the non-equilibrium ionization wave and the energy balance of a non-equilibrium plasma. At the same time, it was discovered what physical requirements an MHD generator with non-equilibrium conductivity must satisfy to achieve high efficiency in converting the thermal or kinetic energy of the gas flow into electric energy. The experiments on MHD power generation with thermally non-equilibrium plasma carried out up to 1975 indicated that it should be possible to achieve conversion efficiencies of up to 20-30%. (author)

  11. Pre-equilibrium assumptions and statistical model parameters effects on reaction cross-section calculations

    International Nuclear Information System (INIS)

    Avrigeanu, M.; Avrigeanu, V.

    1992-02-01

    A systematic study of the effects of statistical model parameters and semi-classical pre-equilibrium emission models has been carried out for the (n,p) reactions on the 56 Fe and 60 Co target nuclei. The results obtained by using various assumptions within a given pre-equilibrium emission model differ among themselves more than the results of different models used under similar conditions. The necessity of using realistic level-density formulas is emphasized, especially in connection with pre-equilibrium emission models (i.e. with the exciton state density expression), while basic support could be found only by replacing the Williams exciton state density formula with a realistic one. (author). 46 refs, 12 figs, 3 tabs

  12. Statistical and physical evolution of QSO's

    International Nuclear Information System (INIS)

    Caditz, D.; Petrosian, V.

    1989-09-01

    The relationship between the physical evolution of discrete extragalactic sources, the statistical evolution of the observed population of sources, and the cosmological model is discussed. Three simple forms of statistical evolution: pure luminosity evolution (PLE), pure density evolution (PDE), and generalized luminosity evolution (GLE), are considered in detail together with what these forms imply about the physical evolution of individual sources. Two methods are used to analyze the statistical evolution of the observed distribution of QSO's (quasars) from combined flux limited samples. It is shown that both PLE and PDE are inconsistent with the data over the redshift range 0 less than z less than 2.2, and that a more complicated form of evolution such as GLE is required, independent of the cosmological model. This result is important for physical models of AGN, and in particular, for the accretion disk model which recent results show may be inconsistent with PLE

  13. Probability and statistics in particle physics

    International Nuclear Information System (INIS)

    Frodesen, A.G.; Skjeggestad, O.

    1979-01-01

    Probability theory is entered into at an elementary level and given a simple and detailed exposition. The material on statistics has been organised with an eye to the experimental physicist's practical need, which is likely to be statistical methods for estimation or decision-making. The book is intended for graduate students and research workers in experimental high energy and elementary particle physics, and numerous examples from these fields are presented. (JIW)

  14. Science Academies' Refresher Course in Statistical Physics

    Indian Academy of Sciences (India)

    The Course is aimed at college teachers of statistical physics at BSc/MSc level. It will cover basic principles and techniques, in a pedagogical manner, through lectures and tutorials, with illustrative problems. Some advanced topics, and common difficulties faced by students will also be discussed. College/University ...

  15. Statistical physics of pairwise probability models

    DEFF Research Database (Denmark)

    Roudi, Yasser; Aurell, Erik; Hertz, John

    2009-01-01

    Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying...

  16. Topics in statistical and theoretical physics

    CERN Document Server

    Dobrushin, R L; Shubin, M A

    1996-01-01

    This is the second of two volumes dedicated to the scientific heritage of F. A. Berezin (1931-1980). Before his untimely death, Berezin had an important influence on physics and mathematics, discovering new ideas in mathematical physics, representation theory, analysis, geometry, and other areas of mathematics. His crowning achievements were the introduction of a new notion of deformation quantization and Grassmannian analysis ("supermathematics"). Collected here are papers by many of his colleagues and others who worked in related areas, representing a wide spectrum of topics in statistical a

  17. Conference: Statistical Physics and Biological Information

    International Nuclear Information System (INIS)

    Gross, David J.; Hwa, Terence

    2001-01-01

    In the spring of 2001, the Institute for Theoretical Physics ran a 6 month scientific program on Statistical Physics and Biological Information. This program was organized by Walter Fitch (UC Irvine), Terence Hwa (UC San Diego), Luca Peliti (University Federico II, Naples), Gary Stormo (Washington University School of Medicine) and Chao Tang (NEC). Overall scientific supervision was provided by David Gross, Director, ITP. The ITP has an online conference/program proceedings which consists of audio and transparencies of almost all of the talks held during this program. Over 100 talks are available on the site at http://online.kitp.ucsb.edu/online/infobio01/

  19. Statistical and thermal physics an introduction

    CERN Document Server

    Hoch, Michael JR

    2011-01-01

    "When I started reading Michael J.R. Hoch's book Statistical and Thermal Physics: An Introduction I thought to myself that this is another book the same as a large group of others with similar content. … But during my reading this unjustified belief changed. … The main reason for this change was the way of information presentation: … the way of presentation is designed so that the reader receives only the information that is necessary to give the essence of the problem. … this book will provide an introduction to the subject especially for those who are interested in basic or applied physics."

  20. Statistical Issues in Searches for New Physics

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Given the cost, both financial and, even more importantly, in terms of human effort, of building High Energy Physics accelerators and detectors and running them, it is important to use good statistical techniques in analysing data. This talk covers some of the statistical issues that arise in searches for New Physics. They include topics such as: whether we should insist on the 5 sigma criterion for discovery claims; the relative merits of a Raster Scan versus a "2-D" approach; why P(A|B) is not the same as P(B|A); the meaning of p-values; an example of a problematic likelihood; what Wilks' Theorem is and when it does not apply; how to deal with the "Look Elsewhere Effect"; dealing with systematics such as background parametrisation; coverage (what it is, and whether a given method has the correct coverage); and the use of p0 vs. p1 plots.
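The point that P(A|B) differs from P(B|A) can be made concrete with Bayes' theorem. The sketch below uses invented numbers (prior, power, and a roughly "5 sigma" one-sided tail probability) purely for illustration; it is not drawn from the talk itself.

```python
# Toy illustration that P(A|B) is not P(B|A), in a discovery-search setting.
# All numbers are invented for illustration.
p_signal = 1e-4                # prior probability that new physics is present, P(S)
p_exceed_given_signal = 0.5    # P(data exceeds threshold | signal): the power
p_exceed_given_noise = 2.9e-7  # P(data exceeds threshold | background only), ~5 sigma one-sided

# Total probability of exceeding the threshold (law of total probability).
p_exceed = (p_exceed_given_signal * p_signal
            + p_exceed_given_noise * (1.0 - p_signal))

# Bayes' theorem: P(signal | exceed), which is generally NOT P(exceed | signal).
p_signal_given_exceed = p_exceed_given_signal * p_signal / p_exceed
```

With these numbers P(exceed | signal) = 0.5 while P(signal | exceed) comes out close to 1, showing how strongly the two conditional probabilities can differ.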

  1. Statistical physics of medical ultrasonic images

    International Nuclear Information System (INIS)

    Wagner, R.F.; Insana, M.F.; Brown, D.G.; Smith, S.W.

    1987-01-01

    The physical and statistical properties of backscattered signals in medical ultrasonic imaging are reviewed in terms of: 1) the radiofrequency signal; 2) the envelope (video or magnitude) signal; and 3) the density of samples in simple and in compounded images. There is a wealth of physical information in backscattered signals in medical ultrasound. This information is contained in the radiofrequency spectrum - which is not typically displayed to the viewer - as well as in the higher statistical moments of the envelope or video signal - which are not readily accessed by the human viewer of typical B-scans. This information may be extracted from the detected backscattered signals by straightforward signal processing techniques at low resolution

  2. Fluctuations of physical values in statistical mechanics

    International Nuclear Information System (INIS)

    Zaripov, R.G.

    1999-01-01

    New matrix inequalities for the bounds on the measurement accuracy of physical values in an ensemble of quantum systems were obtained. The measurement of a multidimensional thermodynamic parameter is estimated. The matrix inequalities obtained are quantum analogs of the Cramér-Rao information inequalities of mathematical statistics. The quantity of information in a quantum mechanical measurement, connected with the bounds on jointly measurable values in one macroscopic experiment, was determined. The lower bound on the variance of the estimate of a multidimensional quantum mechanical parameter was found. (author)
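For orientation, the scalar one-parameter form of the classical Cramér-Rao inequality that such matrix inequalities generalize can be written as follows (standard textbook form, not taken from the abstract):

```latex
% Classical one-parameter Cramer-Rao bound: for an unbiased estimator
% \hat{\theta} of a parameter \theta with likelihood p(x;\theta),
\operatorname{Var}\big(\hat{\theta}\big) \;\ge\; \frac{1}{I(\theta)},
\qquad
I(\theta) \;=\; \mathbb{E}\!\left[\left(\frac{\partial \ln p(x;\theta)}{\partial \theta}\right)^{\!2}\right],
% where I(\theta) is the Fisher information; the quantum analogs replace
% I(\theta) by a quantum Fisher information matrix.
```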

  3. Physics of future equilibrium state of nuclear energy utilization

    International Nuclear Information System (INIS)

    Sekimoto, H.

    1994-01-01

    The governing equations for the future equilibrium nuclear state are presented and their characteristics are discussed. These equations are solved for several typical cases. In the present study of the equilibrium state, two coincidences are found. The first is a coincidence in the neutron balance of the nuclides satisfying the equilibrium condition: the finite neutron multiplication factor is near unity. The second is a coincidence in the toxicity: the toxicity of the long-life fission products produced is near that of the natural fuel incinerated. (author). 2 refs., 2 tabs., 4 figs

  4. College Physical Chemistry Students' Conceptions of Equilibrium and Fundamental Thermodynamics.

    Science.gov (United States)

    Thomas, Peter L.; Schwenz, Richard W.

    1998-01-01

    Focuses on many alternative conceptions and nonconceptions about material related to equilibrium and thermodynamics. Uses interviews and compares the concepts from these with those expressed by experts in textbooks. (DDR)

  5. Statistical physics of an anyon gas

    International Nuclear Information System (INIS)

    Dasnieres de Veigy, A.

    1994-01-01

    In quantum two-dimensional physics, anyons are particles which have an intermediate statistics between Bose-Einstein and Fermi-Dirac statistics. The wave amplitude can change by an arbitrary phase under particle exchanges. Contrary to bosons or fermions, the permutation group cannot uniquely characterize this phase and one must introduce the braid group. It is shown that the statistical "interaction" is equivalent to an Aharonov-Bohm interaction which derives from a Chern-Simons lagrangian. The main subject of this thesis is the thermodynamics of an anyon gas. Since the complete spectrum of N anyons seems out of reach, we have done a perturbative computation of the equation of state at second order near Bose or Fermi statistics. Ultraviolet divergences are avoided by noticing that the short-range singularities of the statistical interaction force the wave functions to vanish when two particles approach each other (statistical exclusion). The gas is confined in a harmonic well in order to obtain the thermodynamic limit as the harmonic attraction goes to zero. Infrared divergences thus cancel in this limit and a finite virial expansion is obtained. The complexity of the anyon model appears in this result. We have also computed the equation of state of an anyon gas in a magnetic field strong enough to project the system onto its degenerate ground state. This result concerns anyons with any statistics. One then finds an exclusion principle generalizing the Pauli principle to anyons. On the other hand, we have defined a model of two-dimensional particles topologically interacting at a distance. The anyon model is recovered as a particular case where all particles are identical. (orig.)

  6. Infinite-mode squeezed coherent states and non-equilibrium statistical mechanics (phase-space-picture approach)

    International Nuclear Information System (INIS)

    Yeh, L.

    1992-01-01

    The phase-space-picture approach to quantum non-equilibrium statistical mechanics via the characteristic function of infinite-mode squeezed coherent states is introduced. We use quantum Brownian motion as an example to show how this approach provides an interesting geometrical interpretation of quantum non-equilibrium phenomena.

  7. PREFACE: Statistical Physics of Complex Fluids

    Science.gov (United States)

    Golestanian, R.; Khajehpour, M. R. H.; Kolahchi, M. R.; Rouhani, S.

    2005-04-01

    The field of complex fluids is a rapidly developing, highly interdisciplinary field that brings together people from a plethora of backgrounds such as mechanical engineering, chemical engineering, materials science, applied mathematics, physics, chemistry and biology. In this melting pot of science, the traditional boundaries of various scientific disciplines have been set aside. It is this very property of the field that has guaranteed its richness and prosperity since the final decade of the 20th century and into the 21st. The C3 Commission of the International Union of Pure and Applied Physics (IUPAP), which is the commission for statistical physics that organizes the international STATPHYS conferences, encourages various, more focused, satellite meetings to complement the main event. For the STATPHYS22 conference in Bangalore (July 2004), Iran was recognized by the STATPHYS22 organizers as suitable to host such a satellite meeting and the Institute for Advanced Studies in Basic Sciences (IASBS) was chosen to be the site of this meeting. It was decided to organize a meeting in the field of complex fluids, which is a fairly developed field in Iran. This international meeting, and an accompanying summer school, were intended to boost international connections for both the research groups working in Iran, and several other groups working in the Middle East, South Asia and North Africa. The meeting, entitled `Statistical Physics of Complex Fluids' was held at the Institute for Advanced Studies in Basic Sciences (IASBS) in Zanjan, Iran, from 27 June to 1 July 2004. The main topics discussed at the meeting included: biological statistical physics, wetting and microfluidics, transport in complex media, soft and granular matter, and rheology of complex fluids. At this meeting, 22 invited lectures by eminent scientists were attended by 107 participants from different countries. 
The poster session consisted of 45 presentations which, in addition to the main topics of the

  8. Statistical physics of hard optimization problems

    International Nuclear Information System (INIS)

    Zdeborova, L.

    2009-01-01

    Optimization is fundamental in many areas of science, from computer science and information theory to engineering and statistical physics, as well as to biology or social sciences. It typically involves a large number of variables and a cost function depending on these variables. Optimization problems in the non-deterministic polynomial (NP)-complete class are particularly difficult; it is believed that the number of operations required to minimize the cost function is, in the most difficult cases, exponential in the system size. However, even in an NP-complete problem the practically arising instances might, in fact, be easy to solve. The principal question we address in this article is: how to recognize if an NP-complete constraint satisfaction problem is typically hard, and what are the main reasons for this? We adopt approaches from the statistical physics of disordered systems, in particular the cavity method developed originally to describe glassy systems. We describe new properties of the space of solutions in two of the most studied constraint satisfaction problems - random satisfiability and random graph coloring. We suggest a relation between the existence of the so-called frozen variables and the algorithmic hardness of a problem. Based on these insights, we introduce a new class of problems which we named "locked" constraint satisfaction, where the statistical description is easily solvable, but from the algorithmic point of view they are even more challenging than the canonical satisfiability.

  9. Statistical physics of hard optimization problems

    International Nuclear Information System (INIS)

    Zdeborova, L.

    2009-01-01

    Optimization is fundamental in many areas of science, from computer science and information theory to engineering and statistical physics, as well as to biology or social sciences. It typically involves a large number of variables and a cost function depending on these variables. Optimization problems in the non-deterministic polynomial-complete class are particularly difficult; it is believed that the number of operations required to minimize the cost function is, in the most difficult cases, exponential in the system size. However, even in a non-deterministic polynomial-complete problem the practically arising instances might, in fact, be easy to solve. The principal question we address in the article is: how to recognize if a non-deterministic polynomial-complete constraint satisfaction problem is typically hard, and what are the main reasons for this? We adopt approaches from the statistical physics of disordered systems, in particular the cavity method developed originally to describe glassy systems. We describe new properties of the space of solutions in two of the most studied constraint satisfaction problems - random satisfiability and random graph coloring. We suggest a relation between the existence of the so-called frozen variables and the algorithmic hardness of a problem. Based on these insights, we introduce a new class of problems which we named 'locked' constraint satisfaction, where the statistical description is easily solvable, but from the algorithmic point of view they are even more challenging than the canonical satisfiability (Authors)

  10. Statistical physics of hard optimization problems

    Science.gov (United States)

    Zdeborová, Lenka

    2009-06-01

    Optimization is fundamental in many areas of science, from computer science and information theory to engineering and statistical physics, as well as to biology or social sciences. It typically involves a large number of variables and a cost function depending on these variables. Optimization problems in the non-deterministic polynomial (NP)-complete class are particularly difficult; it is believed that the number of operations required to minimize the cost function is, in the most difficult cases, exponential in the system size. However, even in an NP-complete problem the practically arising instances might, in fact, be easy to solve. The principal question we address in this article is: how to recognize if an NP-complete constraint satisfaction problem is typically hard, and what are the main reasons for this? We adopt approaches from the statistical physics of disordered systems, in particular the cavity method developed originally to describe glassy systems. We describe new properties of the space of solutions in two of the most studied constraint satisfaction problems - random satisfiability and random graph coloring. We suggest a relation between the existence of the so-called frozen variables and the algorithmic hardness of a problem. Based on these insights, we introduce a new class of problems which we named "locked" constraint satisfaction, where the statistical description is easily solvable, but from the algorithmic point of view they are even more challenging than the canonical satisfiability.

  11. Networking—a statistical physics perspective

    International Nuclear Information System (INIS)

    Yeung, Chi Ho; Saad, David

    2013-01-01

    Networking encompasses a variety of tasks related to the communication of information on networks; it has a substantial economic and societal impact on a broad range of areas including transportation systems, wired and wireless communications and a range of Internet applications. As transportation and communication networks become increasingly complex, the ever increasing demand for congestion control, higher traffic capacity, quality of service, robustness and reduced energy consumption requires new tools and methods to meet these conflicting requirements. The new methodology should serve for gaining better understanding of the properties of networking systems at the macroscopic level, as well as for the development of new principled optimization and management algorithms at the microscopic level. Methods of statistical physics seem best placed to provide new approaches as they have been developed specifically to deal with nonlinear large-scale systems. This review aims at presenting an overview of tools and methods that have been developed within the statistical physics community and that can be readily applied to address the emerging problems in networking. These include diffusion processes, methods from disordered systems and polymer physics, and probabilistic inference, which have direct relevance to network routing, file and frequency distribution, the exploration of network structures and vulnerability, and various other practical networking applications. (topical review)

  12. Networking—a statistical physics perspective

    Science.gov (United States)

    Yeung, Chi Ho; Saad, David

    2013-03-01

    Networking encompasses a variety of tasks related to the communication of information on networks; it has a substantial economic and societal impact on a broad range of areas including transportation systems, wired and wireless communications and a range of Internet applications. As transportation and communication networks become increasingly complex, the ever increasing demand for congestion control, higher traffic capacity, quality of service, robustness and reduced energy consumption requires new tools and methods to meet these conflicting requirements. The new methodology should serve for gaining better understanding of the properties of networking systems at the macroscopic level, as well as for the development of new principled optimization and management algorithms at the microscopic level. Methods of statistical physics seem best placed to provide new approaches as they have been developed specifically to deal with nonlinear large-scale systems. This review aims at presenting an overview of tools and methods that have been developed within the statistical physics community and that can be readily applied to address the emerging problems in networking. These include diffusion processes, methods from disordered systems and polymer physics, and probabilistic inference, which have direct relevance to network routing, file and frequency distribution, the exploration of network structures and vulnerability, and various other practical networking applications.

  13. Quantum theoretical physics is statistical and relativistic

    International Nuclear Information System (INIS)

    Harding, C.

    1980-01-01

    A new theoretical framework for quantum mechanics is presented. It is based on strictly deterministic behavior of single systems. The conventional QM equation, however, is found to describe the statistical results of many classical systems. It will be seen, moreover, that a rigorous synthesis of our theory requires relativistic kinematics. So QM is not only a classical statistical theory; it is, of necessity, a relativistic theory. The equation of the theory does not just duplicate QM; it indicates an inherent nonlinearity in QM which is subject to experimental verification. It is shown, therefore, that conventional QM is a corollary of classical deterministic principles. It is suggested that this concept of nature conflicts with that prevalent in modern physics. (author)

  14. Non-equilibrium physics at a holographic chiral phase transition

    Energy Technology Data Exchange (ETDEWEB)

    Evans, Nick; Kim, Keun-young [Southampton Univ. (United Kingdom). School of Physics and Astronomy; Kavli Institute for Theoretical Physics China, Beijing (China); Kalaydzhyan, Tigran; Kirsch, Ingo [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2010-11-15

    The D3/D7 system holographically describes an N=2 gauge theory which spontaneously breaks a chiral symmetry by the formation of a quark condensate in the presence of a magnetic field. At finite temperature it displays a first order phase transition. We study out of equilibrium dynamics associated with this transition by placing probe D7 branes in a geometry describing a boost-invariant expanding or contracting plasma. We use an adiabatic approximation to track the evolution of the quark condensate in a heated system and reproduce the phase structure expected from equilibrium dynamics. We then study solutions of the full partial differential equation that describes the evolution of out of equilibrium configurations to provide a complete description of the phase transition including describing aspects of bubble formation. (orig.)

  15. Statistical physics of interacting neural networks

    Science.gov (United States)

    Kinzel, Wolfgang; Metzler, Richard; Kanter, Ido

    2001-12-01

    Recent results on the statistical physics of time series generation and prediction are presented. A neural network is trained on quasi-periodic and chaotic sequences, and the overlaps with the sequence generator, as well as the prediction errors, are calculated numerically. For each network there exists a sequence for which it completely fails to make predictions. Two interacting networks show a transition to perfect synchronization. A pool of interacting networks shows good coordination in the minority game, a model of competition in a closed market. Finally, as a demonstration, a perceptron predicts bit sequences produced by human beings.

  16. Computer program determines chemical composition of physical system at equilibrium

    Science.gov (United States)

    Kwong, S. S.

    1966-01-01

    FORTRAN 4 digital computer program calculates equilibrium composition of complex, multiphase chemical systems. This is a free energy minimization method with solution of the problem reduced to mathematical operations, without concern for the chemistry involved. Also certain thermodynamic properties are determined as byproducts of the main calculations.
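The free-energy-minimization principle behind such a program can be sketched for the simplest possible case, a single ideal isomerization A ⇌ B at fixed temperature, where the equilibrium composition minimizes the total Gibbs energy. This is an illustration of the principle only, not the NASA program's actual algorithm, and the thermodynamic numbers are invented.

```python
import math

# Equilibrium composition of an ideal reaction A <-> B by brute-force
# Gibbs free energy minimization (illustrative numbers throughout).
R = 8.314      # gas constant, J/(mol K)
T = 1000.0     # temperature, K (illustrative)
dG0 = -5000.0  # standard Gibbs free energy of reaction, J/mol (illustrative)

def gibbs(x):
    """Total Gibbs energy (per mole, up to an additive constant) when a
    fraction x of A has converted to B, using ideal-mixture chemical
    potentials mu_i = g_i + R*T*ln(n_i)."""
    a, b = 1.0 - x, x
    g = b * dG0
    for n in (a, b):
        if n > 0.0:
            g += R * T * n * math.log(n)
    return g

# Minimize over a fine grid of conversion fractions (0, 1) exclusive.
xs = [i / 10000 for i in range(1, 10000)]
x_eq = min(xs, key=gibbs)

# Analytic check: at the minimum, x_eq/(1 - x_eq) = K = exp(-dG0/(R*T)).
K = math.exp(-dG0 / (R * T))
x_analytic = K / (1.0 + K)
```

Real multiphase systems replace the one-dimensional grid with constrained minimization over all species amounts, subject to element-conservation constraints, which is exactly the "mathematical operations without concern for the chemistry" the abstract refers to.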

  17. Classical and Quantum Models in Non-Equilibrium Statistical Mechanics: Moment Methods and Long-Time Approximations

    Directory of Open Access Journals (Sweden)

    Ramon F. Alvarez-Estrada

    2012-02-01

    Full Text Available We consider non-equilibrium open statistical systems, subject to potentials and to external "heat baths" (hb) at thermal equilibrium at temperature T (either with ab initio dissipation or without it). Boltzmann's classical equilibrium distributions generate, as Gaussian weight functions in momenta, orthogonal polynomials in momenta (the position-independent Hermite polynomials Hn's). The moments of non-equilibrium classical distributions, implied by the Hn's, fulfill a hierarchy: for long times, the lowest moment dominates the evolution towards thermal equilibrium, either with dissipation or without it (but under a certain approximation). We revisit that hierarchy, whose solution depends on operator continued fractions. We review our generalization of that moment method to classical closed many-particle interacting systems with neither a hb nor ab initio dissipation: with initial states describing thermal equilibrium at T at large distances but non-equilibrium at finite distances, the moment method yields, approximately, irreversible thermalization of the whole system at T, for long times. Generalizations to non-equilibrium quantum interacting systems meet additional difficulties. Three of them are: (i) equilibrium distributions (represented through Wigner functions) are neither Gaussian in momenta nor known in closed form; (ii) they may depend on dissipation; and (iii) the orthogonal polynomials in momenta generated by them depend also on positions. We generalize the moment method, dealing with (i), (ii) and (iii), to some non-equilibrium one-particle quantum interacting systems. Open problems are discussed briefly.
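Schematically (in our own notation, not necessarily the paper's), the momentum moments generated by the Hermite polynomials can be written as:

```latex
% The equilibrium Gaussian weight in momentum generates the Hermite
% polynomials H_n; a non-equilibrium distribution W(x,p,t) is then
% characterized by the hierarchy of moments
W_{n}(x,t) \;\propto\; \int_{-\infty}^{+\infty} \mathrm{d}p\,
H_{n}\!\left(\frac{p}{\sqrt{2mk_{B}T}}\right) W(x,p,t),
\qquad n = 0,1,2,\dots
% These moments fulfill a coupled hierarchy of evolution equations; for long
% times the lowest moment W_0 dominates the approach to thermal equilibrium.
```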

  18. Statistical physics of crime: a review.

    Science.gov (United States)

    D'Orsogna, Maria R; Perc, Matjaž

    2015-03-01

    Containing the spread of crime in urban societies remains a major challenge. Empirical evidence suggests that, if left unchecked, crimes may be recurrent and proliferate. On the other hand, eradicating a culture of crime may be difficult, especially under extreme social circumstances that impair the creation of a shared sense of social responsibility. Although our understanding of the mechanisms that drive the emergence and diffusion of crime is still incomplete, recent research highlights applied mathematics and methods of statistical physics as valuable theoretical resources that may help us better understand criminal activity. We review different approaches aimed at modeling and improving our understanding of crime, focusing on the nucleation of crime hotspots using partial differential equations, self-exciting point processes and agent-based modeling, adversarial evolutionary games, and the network science behind the formation of gangs and large-scale organized crime. We emphasize that the statistical physics of crime can relevantly inform the design of successful crime prevention strategies, as well as improve the accuracy of expectations about how different policing interventions should impact malicious human activity that deviates from social norms. We also outline possible directions for future research, related to the effects of social and coevolving networks and to the hierarchical growth of criminal structures due to self-organization. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Study of energy fluctuation effect on the statistical mechanics of equilibrium systems

    International Nuclear Information System (INIS)

    Lysogorskiy, Yu V; Wang, Q A; Tayurskii, D A

    2012-01-01

    This work is devoted to modeling the effect of energy fluctuations on the behavior of small classical thermodynamic systems. It is known that when an equilibrium system gets smaller and smaller, one of the major quantities that becomes more and more uncertain is its internal energy. These increasing fluctuations can considerably modify the original statistics. The present model considers the effect of such energy fluctuations and is based on an overlap between the Boltzmann-Gibbs statistics and the statistics of the fluctuation. Within this "overlap statistics", we studied the effects of several types of energy fluctuations on the probability distribution, internal energy and heat capacity. It was shown that the fluctuations can considerably change the temperature dependence of the internal energy and heat capacity in the low energy range and at low temperatures. In particular, it was found that, due to the lower energy limit of the systems, the fluctuations reduce the probability of the low energy states close to the lowest energy and increase the total average energy. This energy increase is larger at lower temperatures, making a negative heat capacity possible in this case.

  20. The universal statistical distributions of the affinity, equilibrium constants, kinetics and specificity in biomolecular recognition.

    Directory of Open Access Journals (Sweden)

    Xiliang Zheng

    2015-04-01

    Full Text Available We uncovered the universal statistical laws for the biomolecular recognition/binding process. We quantified the statistical energy landscapes for binding, from which we can characterize the distributions of the binding free energy (affinity, the equilibrium constants, the kinetics and the specificity by exploring the different ligands binding with a particular receptor. The results of the analytical studies are confirmed by the microscopic flexible docking simulations. The distribution of binding affinity is Gaussian around the mean and becomes exponential near the tail. The equilibrium constants of the binding follow a log-normal distribution around the mean and a power law distribution in the tail. The intrinsic specificity for biomolecular recognition measures the degree of discrimination of native versus non-native binding and the optimization of which becomes the maximization of the ratio of the free energy gap between the native state and the average of non-native states versus the roughness measured by the variance of the free energy landscape around its mean. The intrinsic specificity obeys a Gaussian distribution near the mean and an exponential distribution near the tail. Furthermore, the kinetics of binding follows a log-normal distribution near the mean and a power law distribution at the tail. Our study provides new insights into the statistical nature of thermodynamics, kinetics and function from different ligands binding with a specific receptor or equivalently specific ligand binding with different receptors. The elucidation of distributions of the kinetics and free energy has guiding roles in studying biomolecular recognition and function through small-molecule evolution and chemical genetics.

  1. Statistical Physics Approaches to RNA Editing

    Science.gov (United States)

    Bundschuh, Ralf

    2012-02-01

    The central dogma of molecular biology states that DNA is transcribed base by base into RNA, which is in turn translated into proteins. However, some organisms edit their RNA before translation by inserting, deleting, or substituting individual bases or short stretches of bases. In many instances the mechanisms by which an organism recognizes the positions at which to edit, or by which it performs the actual editing, are unknown. One model system that stands out by its very high editing rate, with on average one out of 25 bases being edited, is the Myxomycetes, a class of slime molds. In this talk we will show how computational methods and concepts from statistical physics can be used to analyze DNA and protein sequence data to predict editing sites in these slime molds and to guide experiments that identified previously unknown types of editing as well as the complete set of editing events in the slime mold Physarum polycephalum.

  2. Statistical Physics Approaches to Microbial Ecology

    Science.gov (United States)

    Mehta, Pankaj

    The unprecedented ability to quantitatively measure and probe complex microbial communities has renewed interest in identifying the fundamental ecological principles governing community ecology in microbial ecosystems. Here, we present work from our group and others showing how ideas from statistical physics can help us uncover these ecological principles. Two major lessons emerge from this work. First, large ecosystems with many species often display new, emergent ecological behaviors that are absent in small ecosystems with just a few species. To paraphrase Nobel laureate Phil Anderson, "More is Different", especially in community ecology. Second, the lack of trophic layer separation in microbial ecology fundamentally distinguishes microbial ecology from classical paradigms of community ecology and leads to qualitatively different rules for community assembly in microbes. I illustrate these ideas using both theoretical modeling and novel experiments on large microbial ecosystems performed by our collaborators (Joshua Goldford and Alvaro Sanchez). Work supported by Simons Investigator in MMLS and NIH R35 GM119461.

  3. Modular reweighting software for statistical mechanical analysis of biased equilibrium data

    Science.gov (United States)

    Sindhikara, Daniel J.

    2012-07-01

    Here a simple, useful, modular approach and software suite designed for statistical reweighting and analysis of equilibrium ensembles is presented. Statistical reweighting is useful and sometimes necessary for the analysis of equilibrium enhanced sampling methods, such as umbrella sampling or replica exchange, and also in experimental cases where biasing factors are explicitly known. Essentially, statistical reweighting allows extrapolation of data from one or more equilibrium ensembles to another. Here, the fundamental separable steps of statistical reweighting are broken up into modules, allowing for application to the general case and avoiding the black-box nature of some “all-inclusive” reweighting programs. Additionally, the included programs are, by design, written with few dependencies. The compilers required are either pre-installed on most systems or freely available for download with minimal trouble. Examples of the use of this suite applied to umbrella sampling and replica exchange molecular dynamics simulations are shown, along with advice on how to apply it in the general case. New version program summary. Program title: Modular reweighting version 2. Catalogue identifier: AEJH_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJH_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License, version 3. No. of lines in distributed program, including test data, etc.: 179 118. No. of bytes in distributed program, including test data, etc.: 8 518 178. Distribution format: tar.gz. Programming language: C++, Python 2.6+, Perl 5+. Computer: Any. Operating system: Any. RAM: 50-500 MB. Supplementary material: An updated version of the original manuscript (Comput. Phys. Commun. 182 (2011) 2227) is available. Classification: 4.13. Catalogue identifier of previous version: AEJH_v1_0. Journal reference of previous version: Comput. Phys. Commun. 182 (2011) 2227. Does the new
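
    The core operation the abstract describes, extrapolating averages from a biased equilibrium ensemble back to the unbiased one, can be sketched in a few lines. This is a minimal illustration, not the Modular reweighting suite itself; the double-well potential, umbrella parameters, and Metropolis sampler below are all assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 1.0                      # inverse temperature (assumed units)
k, x0 = 5.0, 1.0                # assumed umbrella spring constant and center

# Unbiased potential U(x) = x^4 - 2x^2; the biased ensemble adds 0.5*k*(x-x0)^2
U = lambda x: x**4 - 2.0 * x**2
U_bias = lambda x: 0.5 * k * (x - x0) ** 2

# Draw samples from the *biased* distribution with a simple Metropolis chain
x, samples = x0, []
for step in range(60_000):
    x_new = x + rng.normal(scale=0.3)
    dE = (U(x_new) + U_bias(x_new)) - (U(x) + U_bias(x))
    if dE < 0 or rng.random() < np.exp(-beta * dE):
        x = x_new
    samples.append(x)
samples = np.array(samples[5_000:])         # discard burn-in

# Reweighting step: undo the known bias, w_i proportional to exp(+beta*U_bias(x_i))
logw = beta * U_bias(samples)
logw -= logw.max()                          # for numerical stability
w = np.exp(logw)
w /= w.sum()

# Unbiased average of an observable, here <x^2>
mean_x2_unbiased = np.sum(w * samples**2)
print(mean_x2_unbiased)
```

    The same pattern generalizes to multiple biased ensembles, which is the modular multi-ensemble case the suite targets.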

  4. A new equation of state Based on Nuclear Statistical Equilibrium for Core-Collapse Simulations

    Science.gov (United States)

    Furusawa, Shun; Yamada, Shoichi; Sumiyoshi, Kohsuke; Suzuki, Hideyuki

    2012-09-01

    We calculate a new equation of state for baryons at sub-nuclear densities for use in core-collapse simulations of massive stars. The formulation is based on the nuclear statistical equilibrium description and the liquid drop approximation of nuclei. The model free energy to minimize is calculated by relativistic mean field theory for nucleons and by the mass formula for nuclei with atomic numbers up to ~ 1000. We have also taken into account the pasta phase. We find that the free energy and other thermodynamical quantities are not very different from those given by the standard EOSs that adopt the single-nucleus approximation. On the other hand, the average mass is systematically different, which may have an important effect on the rates of electron captures and coherent neutrino scattering on nuclei in supernova cores.

  5. Statistical physics of pairwise probability models

    Directory of Open Access Journals (Sweden)

    Yasser Roudi

    2009-11-01

    Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models.
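
    A pairwise maximum-entropy (Ising-type) model of the kind discussed can be fit from means and pairwise correlations alone. The sketch below is a toy illustration with assumed synthetic data in place of recorded neural activity: for a tiny system, fields and couplings are fit by exact gradient ascent on the log-likelihood. For realistic system sizes, the approximate inference methods studied in the paper replace the exact enumeration used here.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
N = 3                                     # tiny system so enumeration is exact

# Assumed "true" model generating the synthetic data
true_h = rng.normal(size=N)
true_J = np.triu(rng.normal(scale=0.5, size=(N, N)), 1)
states = np.array(list(itertools.product([-1, 1], repeat=N)))

def probs(h, J):
    # P(s) proportional to exp(h.s + sum_{i<j} J_ij s_i s_j)
    E = states @ h + np.einsum('si,ij,sj->s', states, J, states)
    p = np.exp(E)
    return p / p.sum()

p_data = probs(true_h, true_J)
mean_data = p_data @ states                                   # <s_i>
corr_data = np.einsum('s,si,sj->ij', p_data, states, states)  # <s_i s_j>

# Fit h, J by gradient ascent on the log-likelihood (moment matching)
h, J = np.zeros(N), np.zeros((N, N))
for it in range(5000):
    p = probs(h, J)
    h += 0.1 * (mean_data - p @ states)
    J += 0.1 * np.triu(corr_data - np.einsum('s,si,sj->ij', p, states, states), 1)

print(np.abs(mean_data - probs(h, J) @ states).max())  # residual moment mismatch
```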

  6. Non-equilibrium phenomena in confined soft matter irreversible adsorption, physical aging and glass transition at the nanoscale

    CERN Document Server

    2015-01-01

    This book presents cutting-edge experimental and computational results and provides comprehensive coverage on the impact of non-equilibrium structure and dynamics on the properties of soft matter confined to the nanoscale. The book is organized into three main sections: · Equilibration and physical aging: by treating non-equilibrium phenomena with the formal methodology of statistical physics in bulk, the analysis of the kinetics of equilibration sheds new light on the physical origin of the non-equilibrium character of thin polymer films. Both the impact of sample preparation and that of interfacial interactions are analyzed using a large set of experiments. A historical overview of the investigation of the non-equilibrium character of thin polymer films is also presented. Furthermore, the discussion focuses on how interfaces and geometrical confinement perturb the pathways and kinetics of equilibration of soft glasses (a process of tremendous technological interest). · Irr...

  7. Statistical physics, seismogenesis, and seismic hazard

    Science.gov (United States)

    Main, Ian

    1996-11-01

    The scaling properties of earthquake populations show remarkable similarities to those observed at or near the critical point of other composite systems in statistical physics. This has led to the development of a variety of different physical models of seismogenesis as a critical phenomenon, involving locally nonlinear dynamics, with simplified rheologies exhibiting instability or avalanche-type behavior, in a material composed of a large number of discrete elements. In particular, it has been suggested that earthquakes are an example of a "self-organized critical phenomenon" analogous to a sandpile that spontaneously evolves to a critical angle of repose in response to the steady supply of new grains at the summit. In this stationary state of marginal stability the distribution of avalanche energies is a power law, equivalent to the Gutenberg-Richter frequency-magnitude law, and the behavior is relatively insensitive to the details of the dynamics. Here we review the results of some of the composite physical models that have been developed to simulate seismogenesis on different scales during (1) dynamic slip on a preexisting fault, (2) fault growth, and (3) fault nucleation. The individual physical models share some generic features, such as a dynamic energy flux applied by tectonic loading at a constant strain rate, strong local interactions, and fluctuations generated either dynamically or by fixed material heterogeneity, but they differ significantly in the details of the assumed dynamics and in the methods of numerical solution. However, all exhibit critical or near-critical behavior, quantitatively consistent with many of the observed fractal or multifractal scaling laws of brittle faulting and earthquakes, including the Gutenberg-Richter law. Some of the results are sensitive to the details of the dynamics and hence are not strict examples of self-organized criticality. Nevertheless, the results of these different physical models share some
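
    The sandpile analogy in the abstract can be made concrete with a minimal Bak-Tang-Wiesenfeld-style cellular automaton: grains are dropped at random sites, sites holding four or more grains topple to their neighbours, and the resulting avalanche sizes are broadly distributed. The grid size, drive, and threshold below are illustrative assumptions, not parameters from any of the reviewed models.

```python
import numpy as np

rng = np.random.default_rng(2)
L = 20
grid = rng.integers(0, 4, size=(L, L))    # heights 0..3; toppling threshold is 4

def topple(grid):
    """Relax the pile; grains falling off the open boundary are lost."""
    size = 0
    while True:
        unstable = np.argwhere(grid >= 4)
        if len(unstable) == 0:
            return size                    # avalanche size = number of topplings
        for i, j in unstable:
            grid[i, j] -= 4
            size += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < L and 0 <= nj < L:
                    grid[ni, nj] += 1

avalanches = []
for t in range(20_000):
    i, j = rng.integers(0, L, size=2)
    grid[i, j] += 1                        # steady supply of grains, one at a time
    avalanches.append(topple(grid))

avalanches = np.array(avalanches)
print(avalanches.max(), (avalanches == 0).mean())
```

    In the self-organized stationary state, a histogram of the nonzero avalanche sizes approximates a power law, the analogue of the Gutenberg-Richter frequency-magnitude law.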

  8. The scientific way of thinking in statistics, statistical physics and quantum mechanics

    OpenAIRE

    Săvoiu, Gheorghe

    2008-01-01

    This paper focuses on the way of thinking in both classical and modern Physics and Statistics, Statistical Mechanics or Statistical Physics and Quantum Mechanics. These different statistical ways of thinking and their specific methods have generated new fields for new activities and new scientific disciplines, like Econophysics (between Economics and Physics), Sociophysics (between Sociology and Physics), Mediaphysics (between all media and communication sciences), etc. After describing some r...

  10. Nuclear Statistical Equilibrium for compact stars: modelling the nuclear energy functional

    International Nuclear Information System (INIS)

    Aymard, Francois

    2015-01-01

    The core collapse supernova is one of the most powerful known phenomena in the universe. It results from the explosion of very massive stars after they have burnt all their fuel. The hot compact remnant, the so-called proto-neutron star, cools down to become an inert catalyzed neutron star. The dynamics and structure of compact stars, that is, core-collapse supernovae, proto-neutron stars and neutron stars, are still not fully understood and are currently under active research, in association with astrophysical observations and nuclear experiments. One of the key components for modelling compact stars is the Equation of State. The task of computing a complete, realistic, consistent Equation of State for all such stars is challenging because a wide range of densities, proton fractions and temperatures is spanned. This thesis deals with the microscopic modelling of the structure and internal composition of baryonic matter with nucleonic degrees of freedom in compact stars, in order to obtain a realistic unified Equation of State. In particular, we are interested in a formalism which can be applied both at sub-saturation and super-saturation densities, and which gives in the zero temperature limit results compatible with the microscopic Hartree-Fock-Bogoliubov theory with modern realistic effective interactions constrained on experimental nuclear data. For this purpose, we present, for sub-saturated matter, a Nuclear Statistical Equilibrium model which corresponds to a statistical superposition of finite configurations, the so-called Wigner-Seitz cells. Each cell contains a nucleus, or cluster, embedded in a homogeneous electron gas as well as a homogeneous neutron and proton gas. Within each cell, we investigate the different components of the nuclear energy of clusters in interaction with gases. The use of the nuclear mean-field theory for the description of both the clusters and the nucleon gas allows a theoretical consistency with the treatment at saturation

  11. A variational method in out-of-equilibrium physical systems.

    Science.gov (United States)

    Pinheiro, Mario J

    2013-12-09

    We propose a new variational principle for out-of-equilibrium dynamic systems, fundamentally based on the method of Lagrange multipliers applied to the total entropy of an ensemble of particles. However, we use the fundamental equation of thermodynamics in differential forms, considering U and S as 0-forms. We obtain a set of two first-order differential equations that reveal the same formal symplectic structure shared by classical mechanics, fluid mechanics and thermodynamics. From this approach a topological torsion current emerges, built from the vector potential components Aj (gravitational and/or electromagnetic) and the components ωk of the angular velocity ω of the accelerated frame. We derive a special form of the Umov-Poynting theorem for rotating gravito-electromagnetic systems. The variational method is then applied to clarify the working mechanism of particular devices.

  12. The non-equilibrium statistical mechanics of a simple geophysical fluid dynamics model

    Science.gov (United States)

    Verkley, Wim; Severijns, Camiel

    2014-05-01

    Lorenz [1] has devised a dynamical system that has proved to be very useful as a benchmark system in geophysical fluid dynamics. The system in its simplest form consists of a periodic array of variables that can be associated with an atmospheric field on a latitude circle. The system is driven by a constant forcing, is damped by linear friction and has a simple advection term that causes the model to behave chaotically if the forcing is large enough. Our aim is to predict the statistics of Lorenz' model on the basis of a given average value of its total energy - obtained from a numerical integration - and the assumption of statistical stationarity. Our method is the principle of maximum entropy [2] which in this case reads: the information entropy of the system's probability density function shall be maximal under the constraints of normalization, a given value of the average total energy and statistical stationarity. Statistical stationarity is incorporated approximately by using `stationarity constraints', i.e., by requiring that the average first and possibly higher-order time-derivatives of the energy are zero in the maximization of entropy. The analysis [3] reveals that, if the first stationarity constraint is used, the resulting probability density function rather accurately reproduces the statistics of the individual variables. If the second stationarity constraint is used as well, the correlations between the variables are also reproduced quite adequately. The method can be generalized straightforwardly and holds the promise of a viable non-equilibrium statistical mechanics of the forced-dissipative systems of geophysical fluid dynamics. [1] E.N. Lorenz, 1996: Predictability - A problem partly solved, in Proc. Seminar on Predictability (ECMWF, Reading, Berkshire, UK), Vol. 1, pp. 1-18. [2] E.T. Jaynes, 2003: Probability Theory - The Logic of Science (Cambridge University Press, Cambridge). [3] W.T.M. Verkley and C.A. Severijns, 2014: The maximum entropy
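
    The benchmark system described above (a periodic array of variables with constant forcing, linear friction and an advection term) is straightforward to integrate numerically. The sketch below integrates the Lorenz '96 equations with a fourth-order Runge-Kutta scheme and estimates the average total energy that serves as the constraint in the maximum-entropy analysis; the number of variables, forcing value, and step size are illustrative choices, not those of the paper.

```python
import numpy as np

K, F, dt = 40, 8.0, 0.005       # assumed: number of variables, forcing, time step

def l96(x):
    # dx_k/dt = (x_{k+1} - x_{k-2}) x_{k-1} - x_k + F, with periodic indices
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x):
    k1 = l96(x)
    k2 = l96(x + 0.5 * dt * k1)
    k3 = l96(x + 0.5 * dt * k2)
    k4 = l96(x + dt * k3)
    return x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# Small random perturbation of the unstable fixed point x_k = F
x = F + 0.01 * np.random.default_rng(3).standard_normal(K)
for _ in range(4000):                      # spin-up onto the attractor
    x = rk4_step(x)

energies = []
for _ in range(20_000):
    x = rk4_step(x)
    energies.append(0.5 * np.sum(x**2))    # total energy E = (1/2) sum_k x_k^2

print(np.mean(energies))                   # the constraint used in the maxent analysis
```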

  13. Statistical equilibrium calculations for silicon in early-type model stellar atmospheres

    International Nuclear Information System (INIS)

    Kamp, L.W.

    1976-02-01

    Line profiles of 36 multiplets of silicon (Si) II, III, and IV were computed for a grid of model atmospheres covering the range from 15,000 to 35,000 K in effective temperature and 2.5 to 4.5 in log(gravity). The computations involved simultaneous solution of the steady-state statistical equilibrium equations for the populations and of the equation of radiative transfer in the lines. The variables were linearized, and successive corrections were computed until a minimal accuracy of 1/1000 in the line intensities was reached. The common assumption of local thermodynamic equilibrium (LTE) was dropped. The model atmospheres used also were computed by non-LTE methods. Some effects that were incorporated into the calculations were the depression of the continuum by free electrons, hydrogen and ionized helium line blocking, and auto-ionization and dielectronic recombination, which were later found to be insignificant. Use of radiation damping and detailed electron (quadratic Stark) damping constants had small but significant effects on the strong resonance lines of Si III and IV. For weak and intermediate-strength lines, large differences with respect to LTE computations, the results of which are also presented, were found in line shapes and strengths. For the strong lines the differences are generally small, except for the models at the hot, low-gravity extreme of the range. These computations should be useful in the interpretation of the spectra of stars in the spectral range B0--B5, luminosity classes III, IV, and V.

  14. Statistical Physics of Complex Substitutive Systems

    Science.gov (United States)

    Jin, Qing

    Diffusion processes are central to human interactions. Despite extensive studies that span multiple disciplines, our knowledge is limited to spreading processes in non-substitutive systems. Yet, a considerable number of ideas, products, and behaviors spread by substitution; to adopt a new one, agents must give up an existing one. This captures the spread of scientific constructs--forcing scientists to choose, for example, a deterministic or probabilistic worldview, as well as the adoption of durable items, such as mobile phones, cars, or homes. In this dissertation, I develop a statistical physics framework to describe, quantify, and understand substitutive systems. By empirically exploring three collected high-resolution datasets pertaining to such systems, I build a mechanistic model describing substitutions, which not only analytically predicts the universal macroscopic phenomenon discovered in the collected datasets, but also accurately captures the trajectories of individual items in a complex substitutive system, demonstrating a high degree of regularity and universality in substitutive systems. I also discuss the origins and insights of the parameters in the substitution model and possible generalization form of the mathematical framework. The systematical study of substitutive systems presented in this dissertation could potentially guide the understanding and prediction of all spreading phenomena driven by substitutions, from electric cars to scientific paradigms, and from renewable energy to new healthy habits.

  15. Statistical physics of media processes: Mediaphysics

    Science.gov (United States)

    Kuznetsov, Dmitri V.; Mandel, Igor

    2007-04-01

    The processes of mass communications in complicated social or sociobiological systems such as marketing, economics, politics, animal populations, etc., as a subject for the special scientific subbranch "mediaphysics", are considered in relation to sociophysics. A new statistical physics approach to analyze these phenomena is proposed. A keystone of the approach is an analysis of population distribution between two or many alternatives: brands, political affiliations, or opinions. Relative distances between a state of a "person's mind" and the alternatives are measures of propensity to buy (to affiliate, or to have a certain opinion). The distribution of population by those relative distances is time dependent and affected by external (economic, social, marketing, natural) and internal (influential propagation of opinions, "word of mouth", etc.) factors, considered as fields. Specifically, the interaction and opinion-influence field can be generalized to incorporate important elements of Ising-spin-based sociophysical models and kinetic-equation ones. The distributions were described by a Schrödinger-type equation in terms of Green's functions. The developed approach has been applied to a real mass-media efficiency problem for a large company and generally demonstrated very good results despite low initial correlations of factors and the target variable.

  16. Stochastic Spatial Models in Ecology: A Statistical Physics Approach

    Science.gov (United States)

    Pigolotti, Simone; Cencini, Massimo; Molina, Daniel; Muñoz, Miguel A.

    2017-11-01

    Ecosystems display a complex spatial organization. Ecologists have long tried to characterize them by looking at how different measures of biodiversity change across spatial scales. Ecological neutral theory has provided simple predictions accounting for general empirical patterns in communities of competing species. However, while neutral theory in well-mixed ecosystems is mathematically well understood, spatial models still present several open problems, limiting the quantitative understanding of spatial biodiversity. In this review, we discuss the state of the art in spatial neutral theory. We emphasize the connection between spatial ecological models and the physics of non-equilibrium phase transitions and how concepts developed in statistical physics translate in population dynamics, and vice versa. We focus on non-trivial scaling laws arising at the critical dimension D = 2 of spatial neutral models, and their relevance for biological populations inhabiting two-dimensional environments. We conclude by discussing models incorporating non-neutral effects in the form of spatial and temporal disorder, and analyze how their predictions deviate from those of purely neutral theories.
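
    A minimal instance of the spatial neutral dynamics discussed in this review is the two-dimensional voter model: at each step a random individual dies and is replaced by a copy of a random nearest neighbour, so all species are demographically equivalent. The lattice size, initial species number, and step count below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
L, steps = 32, 200_000

# Each site holds a species label; neutrality means no species has an advantage
lattice = rng.integers(0, 50, size=(L, L))   # 50 initial species

for _ in range(steps):
    i, j = rng.integers(0, L, size=2)
    di, dj = ((1, 0), (-1, 0), (0, 1), (0, -1))[rng.integers(0, 4)]
    # Death + replacement by a copy of a random neighbour (periodic boundaries)
    lattice[i, j] = lattice[(i + di) % L, (j + dj) % L]

richness = len(np.unique(lattice))
print(richness)
```

    Without speciation or immigration, diversity decays by coarsening toward monodominance; at the critical dimension D = 2 discussed in the review, this decay is characteristically slow (logarithmic).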

  17. Phase equilibrium and physical properties of biobased ionic liquid mixtures.

    Science.gov (United States)

    Toledo Hijo, Ariel A C; Maximo, Guilherme J; Cunha, Rosiane L; Fonseca, Felipe H S; Cardoso, Lisandro P; Pereira, Jorge F B; Costa, Mariana C; Batista, Eduardo A C; Meirelles, Antonio J A

    2018-02-28

    Protic ionic liquid crystals (PILCs) obtained from natural sources are promising compounds due to their peculiar properties and sustainable appeal. However, obtaining PILCs with higher thermal and mechanical stabilities for product and process design is in demand and studies on such approaches using this new IL generation are still scarce. In this context, this work discloses an alternative way for tuning the physicochemical properties of ILCs by mixing PILs. New binary mixtures of PILs derived from fatty acids and 2-hydroxy ethylamines have been synthesized here and investigated through the characterization of the solid-solid-[liquid crystal]-liquid thermodynamic equilibrium and their rheological and critical micellar concentration profiles. The mixtures presented a marked nonideal melting profile with the formation of solid solutions. This work revealed an improvement of the PILCs' properties based on a significant increase in the ILC temperature domain and the obtainment of more stable mesophases at high temperatures when compared to pure PILs. In addition, mixtures of PILs also showed significant changes in their non-Newtonian and viscosity profile up to 100 s^-1, as well as mechanical stability over a wide temperature range. The enhancement of the physicochemical properties of PILs here disclosed by such an approach leads to more new possibilities of their industrial application at high temperatures.

  18. Tropical limit and a micro-macro correspondence in statistical physics

    Science.gov (United States)

    Angelelli, Mario

    2017-10-01

    Tropical mathematics is used to establish a correspondence between certain microscopic and macroscopic objects in statistical models. Tropical algebra gives a common framework for macrosystems (subsets) and their elementary constituents (elements) that is well-behaved with respect to composition. This kind of connection is studied with maps that preserve a monoid structure. The approach highlights an underlying order relation that is explored through the concepts of filter and ideal. Particular attention is paid to asymmetry and duality between max- and min-criteria. Physical implementations are presented through simple examples in thermodynamics and non-equilibrium physics. The phenomenon of ultrametricity, the notion of tropical equilibrium and the role of ground energy in non-equilibrium models are discussed. Tropical symmetry, i.e. idempotence, is investigated.
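
    The idempotence ("tropical symmetry") mentioned in the abstract is easy to exhibit concretely in the min-plus semiring, where "addition" is min (so a ⊕ a = a) and "multiplication" is ordinary +. The small graph used below is an assumed example; tropical matrix multiplication then composes shortest paths, a simple instance of the micro-macro composition discussed in the paper.

```python
# Min-plus (tropical) semiring: "addition" = min, "multiplication" = +.
# Idempotence: a ⊕ a = min(a, a) = a.
INF = float('inf')

def tropical_matmul(A, B):
    """(A ⊗ B)[i][j] = min_k (A[i][k] + B[k][j]) -- composes shortest paths."""
    n = len(A)
    return [[min(A[i][k] + B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Edge weights of an assumed 3-node graph (INF = no direct edge);
# the tropical square D ⊗ D gives best costs over paths of up to two steps.
D = [[0, 1, INF],
     [INF, 0, 2],
     [1, INF, 0]]
D2 = tropical_matmul(D, D)
print(D2)   # D2[0][2] == 3: the composed route 0 -> 1 -> 2
```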

  19. Statistical spectroscopic studies in nuclear structure physics

    International Nuclear Information System (INIS)

    Halemane, T.R.

    1979-01-01

    The spectral distribution theory establishes the centroid and width of the energy spectrum as quantities of fundamental importance and gives credence to a geometry associated with averages of the product of pairs of operators acting within a model space. Utilizing this fact and partitioning the model space according to different group symmetries, simple and physically meaningful expansions are obtained for the model interactions. In the process, a global measure for the goodness of group symmetries is also developed. This procedure could eventually lead to a new way of constructing model interactions for nuclear structure studies. Numerical results for six (ds)-shell interactions and for scalar-isospin, configuration-isospin, space symmetry, supermultiplet and SU(e) x SU(4) group structures are presented. The notion of simultaneous propagation of operator averages in the irreps of two or more groups (not necessarily commuting) is also introduced. The non-energy-weighted sum rule (NEWSR) for electric and magnetic multipole excitations in the (ds)-shell nuclei 20Ne, 24Mg, 28Si, 32S, and 36Ar is evaluated. A generally applicable procedure for evaluating the eigenvalue bound to the NEWSR is presented and numerical results obtained for the said excitations and nuclei. Comparisons are made with experimental data and shell-model results. Further, a general theory is given for the linear-energy-weighted sum rule (LEWSR). When the Hamiltonian is one-body, this has a very simple form (expressible in terms of occupancies) and amounts to an extension of the Kurath sum rule to other types of excitations and to arbitrary one-body Hamiltonians. Finally, we develop a statistical approach to perturbation theory and inverse-energy-weighted sum rules, and indicate some applications

  20. Lattice ellipsoidal statistical BGK model for thermal non-equilibrium flows

    Science.gov (United States)

    Meng, Jianping; Zhang, Yonghao; Hadjiconstantinou, Nicolas G.; Radtke, Gregg A.; Shan, Xiaowen

    2013-03-01

    A thermal lattice Boltzmann model is constructed on the basis of the ellipsoidal statistical Bhatnagar-Gross-Krook (ES-BGK) collision operator via the Hermite moment representation. The resulting lattice ES-BGK model uses a single distribution function and features an adjustable Prandtl number. Numerical simulations show that using a moderate discrete velocity set, this model can accurately recover steady and transient solutions of the ES-BGK equation in the slip-flow and early transition regimes in the small Mach number limit that is typical of microscale problems of practical interest. In the transition regime in particular, comparisons with numerical solutions of the ES-BGK model, direct Monte Carlo and low-variance deviational Monte Carlo simulations show good accuracy for values of the Knudsen number up to approximately 0.5. On the other hand, highly non-equilibrium phenomena characterized by high Mach numbers, such as viscous heating and force-driven Poiseuille flow for large values of the driving force, are more difficult to capture quantitatively in the transition regime using discretizations chosen with computational efficiency in mind such as the one used here, although improved accuracy is observed as the number of discrete velocities is increased.

  1. Conceptual developments of non-equilibrium statistical mechanics in the early days of Japan

    Science.gov (United States)

    Ichiyanagi, Masakazu

    1995-11-01

    This paper reviews the research in nonequilibrium statistical mechanics made in Japan in the period between 1930 and 1960. Nearly thirty years have passed since the discovery of the exact formula for the electrical conductivity. With the rise of the linear response theory, the methods and results of which are quickly grasped by anyone, its rationale was pushed aside and even at the stage where the formulation was still incomplete some authors hurried to make physical applications. Such an attitude robbed it of most of its interest for the average physicist, who would approach an understanding of some basic concept, not through abstract and logical analysis but by simply increasing his technical experiences with the concept. The purpose of this review is to rescue the linear response theory from being labeled a mathematical tool and to show that it has considerable physical content. Many key papers, originally written in Japanese, are reproduced.

  2. Statistical physics approaches to Alzheimer's disease

    Science.gov (United States)

    Peng, Shouyong

    Alzheimer's disease (AD) is the most common cause of late life dementia. In the brain of an AD patient, neurons are lost and spatial neuronal organizations (microcolumns) are disrupted. An adequate quantitative analysis of microcolumns requires that we automate the neuron recognition stage in the analysis of microscopic images of human brain tissue. We propose a recognition method based on statistical physics. Specifically, Monte Carlo simulations of an inhomogeneous Potts model are applied for image segmentation. Unlike most traditional methods, this method improves the recognition of overlapped neurons, and thus improves the overall recognition percentage. Although the exact causes of AD are unknown, as experimental advances have revealed the molecular origin of AD, they have continued to support the amyloid cascade hypothesis, which states that early stages of aggregation of amyloid beta (Abeta) peptides lead to neurodegeneration and death. X-ray diffraction studies reveal the common cross-beta structural features of the final stable aggregates, amyloid fibrils. Solid-state NMR studies also reveal structural features for some well-ordered fibrils. But currently there is no feasible experimental technique that can reveal the exact structure or the precise dynamics of assembly and thus help us understand the aggregation mechanism. Computer simulation offers a way to understand the aggregation mechanism on the molecular level. Because traditional all-atom continuous molecular dynamics simulations are not fast enough to investigate the whole aggregation process, we apply coarse-grained models and discrete molecular dynamics methods to increase the simulation speed. First we use a coarse-grained two-bead (two beads per amino acid) model. Simulations show that peptides can aggregate into multilayer beta-sheet structures, which agree with X-ray diffraction experiments. To better represent the secondary structure transition happening during aggregation, we refine the
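
    The segmentation idea described here, Monte Carlo simulation of a Potts model whose labels play the role of spins, can be sketched in a few lines. The synthetic image, class means, and coupling strength below are assumptions for illustration only; the inhomogeneous couplings of the actual method (which depend on local image content) are replaced here by a uniform neighbour coupling plus a simple data term.

```python
import numpy as np

rng = np.random.default_rng(4)
q = 2                         # number of labels (e.g. "neuron" vs background)

# Assumed synthetic "image": a bright square on a dark background, plus noise
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
img += 0.3 * rng.standard_normal(img.shape)

labels = rng.integers(0, q, size=img.shape)
beta, mu = 2.0, (0.0, 1.0)    # assumed coupling strength and per-label means

def energy_at(i, j, s):
    # Data term pulls the label toward the closer class mean; the Potts term
    # favors agreement with the four nearest neighbours (periodic boundaries).
    e = (img[i, j] - mu[s]) ** 2
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        ni, nj = (i + di) % 32, (j + dj) % 32
        e -= beta if labels[ni, nj] == s else 0.0
    return e

# Metropolis sweeps: propose a random label, accept with exp(-dE)
for sweep in range(30):
    for i in range(32):
        for j in range(32):
            s_new = rng.integers(0, q)
            dE = energy_at(i, j, s_new) - energy_at(i, j, labels[i, j])
            if dE < 0 or rng.random() < np.exp(-dE):
                labels[i, j] = s_new

print(labels[10:22, 10:22].mean())   # interior of the square -> mostly label 1
```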

  3. A statistical physics of stationary and metastable states

    International Nuclear Information System (INIS)

    Cabo, A; González, A; Curilef, S; Cabo-Bizet, N G; Vera, C A

    2011-01-01

    We present a generalization of Gibbs statistical mechanics designed to describe a general class of stationary and metastable equilibrium states. It is assumed that the physical system maximizes the entropy functional S subject to the standard conditions plus an extra conserved constraint function F, imposed to force the system to remain in the metastable configuration. After requiring additivity for two quasi-independent subsystems, and the commutation of the new constraint with the density matrix ρ, it is argued that F should be a homogeneous function of ρ, at least for systems in which the spectrum is sufficiently dense to be considered as continuous. Therefore, surprisingly, the analytic form of F turns out to be of the kind F(p_i) = p_i^q, where the p_i are the eigenvalues of the density matrix and q is a real number to be determined. Thus, the discussion identifies the physical relevance of Lagrange multiplier constraints of the Tsallis kind and their q parameter, as enforced by the additivity of the constraint F which fixes the metastable state. An approximate analytic solution for the probability density is found for q close to unity. The procedure is applied to describe the results from the plasma experiment of Huang and Driscoll. For small and medium values of the radial distance, the measured density is predicted with a precision similar to that achieved by minimal enstrophy and Tsallis procedures. Also, the particle density is predicted at all the radial positions. Thus, the discussion gives a solution to the conceptual difficulties of the two above mentioned approaches as applied to this problem, which both predict a non-analytic abrupt vanishing of the density above a critical radial distance

  4. Spectral-Lagrangian methods for collisional models of non-equilibrium statistical states

    International Nuclear Information System (INIS)

    Gamba, Irene M.; Tharkabhushanam, Sri Harsha

    2009-01-01

    We propose a new spectral-Lagrangian deterministic solver for the non-linear Boltzmann transport equation (BTE) in d dimensions for variable hard sphere (VHS) collision kernels with conservative or non-conservative binary interactions. The method is based on symmetries of the Fourier transform of the collision integral, where the complexity of its computation is reduced to a separate integral over the unit sphere S^(d-1). The conservation of moments is enforced by Lagrangian constraints. The resulting scheme, implemented in free space, is very versatile and adjusts in a very simple manner to several cases that involve energy dissipation due to local micro-reversibility (inelastic interactions) or elastic models of slowing-down processes. Our simulations are benchmarked with available exact self-similar solutions, exact moment equations and analytical estimates for the homogeneous Boltzmann equation, both for elastic and inelastic VHS interactions. Benchmarking of the simulations involves the selection of a time self-similar rescaling of the numerical distribution function, which is performed using the continuous spectrum of the equation for Maxwell molecules as studied first in Bobylev et al. [A.V. Bobylev, C. Cercignani, G. Toscani, Proof of an asymptotic property of self-similar solutions of the Boltzmann equation for granular materials, Journal of Statistical Physics 111 (2003) 403-417] and generalized to a wide range of related models in Bobylev et al. [A.V. Bobylev, C. Cercignani, I.M. Gamba, On the self-similar asymptotics for generalized non-linear kinetic Maxwell models, Communications in Mathematical Physics, in press]. The method also produces accurate results in the case of inelastic diffusive Boltzmann equations for hard spheres (inelastic collisions under a thermal bath), where overpopulated non-Gaussian exponential tails have been conjectured in computations by stochastic methods [T.V. Noije, M. Ernst, Velocity distributions in homogeneously

  5. Physics colloquium: Electron counting in quantum dots in and out of equilibrium

    CERN Multimedia

    Geneva University

    2011-01-01

    GENEVA UNIVERSITY, Ecole de physique, Département de physique nucléaire et corpusculaire, 24, quai Ernest-Ansermet, 1211 Genève 4. Tel.: (022) 379 62 73, Fax: (022) 379 69 92. Monday 31 October 2011, 17:00 - Ecole de Physique, Auditoire Stueckelberg. PHYSICS COLLOQUIUM « Electron counting in quantum dots in and out of equilibrium », Prof. Klaus Ensslin, Solid State Physics Laboratory, ETH Zurich, 8093 Zurich, Switzerland. Electron transport through quantum dots is governed by Coulomb blockade. Using a nearby quantum point contact, the time-dependent charge flow through quantum dots can be monitored at the level of single electrons. In this way, electron transport has been investigated in equilibrium as well as out of equilibrium. Recently it has become possible to experimentally verify the fluctuation theorem. The talk will also address electron counting experiments in graphene. Refreshments will follow ...

  6. Nonextensive statistical mechanics and high energy physics

    Directory of Open Access Journals (Sweden)

    Tsallis Constantino

    2014-04-01

    The use of the celebrated Boltzmann-Gibbs entropy and statistical mechanics is justified for ergodic-like systems. In contrast, complex systems typically require more powerful theories. We will provide a brief introduction to nonadditive entropies (characterized by indices like q which, in the q → 1 limit, recover the standard Boltzmann-Gibbs entropy) and the associated nonextensive statistical mechanics. We then present some recent applications to systems such as high-energy collisions, black holes and others. In addition, we clarify and illustrate the neat distinction that exists between Lévy distributions and q-exponential ones, a point which occasionally causes some confusion in the literature, very particularly in the LHC literature.
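    The q-exponential function at the heart of nonextensive statistical mechanics has a compact closed form, and the q → 1 limit recovering the ordinary exponential is easy to check numerically. A minimal sketch (the cutoff convention for a non-positive base follows the usual Tsallis definition):

```python
import math

def q_exponential(x, q):
    """Tsallis q-exponential: e_q(x) = [1 + (1-q)x]_+^{1/(1-q)}, with e_1(x) = exp(x).

    For q > 1 and x < 0 this decays as a power law, i.e. with a much
    heavier tail than exp(x).
    """
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    if base <= 0.0:
        return 0.0  # the "[.]_+" cutoff
    return base ** (1.0 / (1.0 - q))
```

    The heavy power-law tail for q > 1 is precisely what makes q-exponentials superficially resemble Lévy distributions in log-log plots, even though the two families are mathematically distinct.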

  7. On fractional spin symmetries and statistical physics

    International Nuclear Information System (INIS)

    Saidi, E.H.

    1995-09-01

    The partition function Z and the quantum distribution of systems Σ of identical particles of fractional spin s = 1/k mod 1, k ≥ 2, generalizing the well-known Bose and Fermi ones, are derived. The generalized Sommerfeld expansion of the distribution around T = 0 K is given. The low-temperature analysis of the statistical systems Σ is made, and known results are recovered. (author). 26 refs, 6 figs
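    One standard way to interpolate between the Bose and Fermi distributions is Gentile-type intermediate statistics, where at most k particles may occupy a state. The sketch below illustrates that interpolation; it is not necessarily the exact distribution derived in this paper, whose fractional-spin construction is different in detail.

```python
import math

def gentile_occupation(x, k):
    """Mean occupation for Gentile-type intermediate statistics with at most
    k particles per state, as a function of x = (E - mu)/(k_B T), x != 0.

    k = 1 recovers Fermi-Dirac; k -> infinity recovers Bose-Einstein.
    Shown for illustration only.
    """
    return 1.0 / (math.exp(x) - 1.0) - (k + 1.0) / (math.exp((k + 1.0) * x) - 1.0)
```

    For k = 1 the two terms combine algebraically to 1/(e^x + 1), the Fermi-Dirac form; for large k the second term is exponentially suppressed (for x > 0), leaving the Bose-Einstein form.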

  8. Statistical and particle physics: Common problems and techniques

    International Nuclear Information System (INIS)

    Bowler, K.C.; Mc Kane, A.J.

    1984-01-01

    These proceedings contain statistical-mechanical studies in condensed matter physics; interfacial problems in statistical physics; string theory; general Monte Carlo methods and their application to lattice gauge theories; topological excitations in field theory; phase-transformation kinetics; and studies of chaotic systems

  9. Statistics a guide to the use of statistical methods in the physical sciences

    CERN Document Server

    Barlow, Roger J

    1989-01-01

    The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A.C. Phillips Computing for Scienti

  10. New Directions in Statistical Physics: Econophysics, Bioinformatics, and Pattern Recognition

    International Nuclear Information System (INIS)

    Grassberger, P

    2004-01-01

    This book contains 18 contributions from different authors. Its subtitle 'Econophysics, Bioinformatics, and Pattern Recognition' says more precisely what it is about: not so much about central problems of conventional statistical physics like equilibrium phase transitions and critical phenomena, but about its interdisciplinary applications. After a long period of specialization, physicists have, over the last few decades, found more and more satisfaction in breaking out of the limitations set by the traditional classification of sciences. Indeed, this classification had never been strict, and physicists in particular had always ventured into other fields. Helmholtz, in the middle of the 19th century, had considered himself a physicist when working on physiology, stressing that the physics of animate nature is as much a legitimate field of activity as the physics of inanimate nature. Later, Max Delbrueck and Francis Crick did for experimental biology what Schroedinger did for its theoretical foundation. And many of the experimental techniques used in chemistry, biology, and medicine were developed by a steady stream of talented physicists who left their proper discipline to venture out into the wider world of science. The development we have witnessed over the last thirty years or so is different. It started with neural networks, where methods could be applied which had been developed for spin glasses, but today's list includes vehicular traffic (driven lattice gases), geology (self-organized criticality), economics (fractal stochastic processes and large-scale simulations), engineering (dynamical chaos), and many others. By staying within physics departments, these activities have transformed the physics curriculum and the view physicists have of themselves. In many departments there are now courses on econophysics or on biological physics, and some universities offer degrees in the physics of traffic or in econophysics. In order to document this change of attitude

  11. Statistical analysis of the equilibrium configurations of the W7-X stellarator using function parameterization

    International Nuclear Information System (INIS)

    Mc Carthy, P.J.; Sengupta, A.; Geiger, J.; Werner, A.

    2005-01-01

    The W7-X stellarator, under construction at IPP-Greifswald, is being designed to demonstrate the steady-state capability of fusion devices. Due to the pulse lengths involved, real-time monitoring and control of the discharges is a crucial issue in steady-state operation. For W7-X, we have planned a sequence of in-depth analyses of the magnetic configurations which, ultimately, will lead to a proper understanding of plasma equilibrium, stability and transport. It should also provide insight into the parameterization of the various plasma-related quantities, which is important from the point of view of real-time studies. The first step in our sequence of analyses involved a study of the vacuum configuration of W7-X, including the detectable magnetic islands. We now proceed to the scenario at finite beta, considering full magnetohydrodynamic (MHD) equilibria based on vmec2000 calculations. A database of order 10000 equilibria was calculated on the same parameter space for the coil current ratios. The parameters which were varied randomly and independently consist of the external coil current ratios (6), the parameters of the profiles (as functions of normalised toroidal flux) of plasma pressure and toroidal current (4+4) and the plasma size (a_eff), which is required to vary the plasma volume. A statistical analysis, using Function Parametrization (FP), was performed on a sample of well-converged equilibria. The plasma parameters were varied to allow a good FP for the expected values in W7-X, i.e. volume-averaged beta up to 5% and a toroidal net current of up to ±50 kA for a mean field strength of about 2 T throughout the database. The profiles were chosen as a sequence of polynomials with the property that the addition of a higher-order polynomial would not change the lower-order volume-averaged moments of the resulting profile. The aim of this was to avoid cross-correlations in the independent input parameters for the database generation. However, some restrictions
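    Function Parametrization is, at its core, statistical regression from code inputs (coil-current ratios, profile coefficients) to derived plasma quantities, fitted over a database of precomputed equilibria so that the mapping can later be evaluated in real time. A generic least-squares stand-in for a single scalar input and output is sketched below; the polynomial form, degree and names are illustrative, not the W7-X implementation.

```python
import numpy as np

def fit_function_parameterization(inputs, outputs, degree=2):
    """Least-squares polynomial regression from an equilibrium-code input
    to a derived plasma quantity. A toy stand-in for Function
    Parametrization (FP), which in practice regresses over ~10^4
    multi-parameter equilibria with more careful statistics."""
    x = np.asarray(inputs, dtype=float).ravel()
    y = np.asarray(outputs, dtype=float)
    design = np.vander(x, degree + 1)  # columns x^degree, ..., x^0
    coeffs, *_ = np.linalg.lstsq(design, y, rcond=None)
    # the returned closure is cheap to evaluate: the point of FP
    return lambda xn: np.vander(np.atleast_1d(np.asarray(xn, float)), degree + 1) @ coeffs
```

    The design trade-off FP exploits is that the expensive physics (here, the equilibrium code) is run offline to build the database, while the fitted mapping is fast enough for real-time monitoring.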

  12. Safety by statistics? A critical view on statistical methods applied in health physics

    International Nuclear Information System (INIS)

    Kraut, W.

    2016-01-01

    The only proper way to describe uncertainties in health physics is by statistical means. But statistics can never replace your personal evaluation of effects, nor can it transmute randomness into certainty like an 'uncertainty laundry'. The paper discusses these problems in routine practical work.

  13. Statistical Physics and Light-Front Quantization

    Energy Technology Data Exchange (ETDEWEB)

    Raufeisen, J

    2004-08-12

    Light-front quantization has important advantages for describing relativistic statistical systems, particularly systems for which boost invariance is essential, such as the fireball created in a heavy-ion collision. In this paper the authors develop light-front field theory at finite temperature and density with special attention to quantum chromodynamics. They construct the most general form of the statistical operator allowed by the Poincare algebra and show that there are no zero-mode related problems when describing phase transitions. They then demonstrate a direct connection between densities in light-front thermal field theory and the parton distributions measured in hard scattering experiments. The approach thus generalizes the concept of a parton distribution to finite temperature. In light-front quantization, the gauge-invariant Green's functions of a quark in a medium can be defined in terms of just 2-component spinors and have a much simpler spinor structure than the equal-time fermion propagator. From the Green's function, the authors introduce the new concept of a light-front density matrix, whose matrix elements are related to forward and to off-diagonal parton distributions. Furthermore, they explain how thermodynamic quantities can be calculated in discretized light-cone quantization, which is applicable at high chemical potential and is not plagued by fermion-doubling problems.

  14. A study of complex particle emission in the pre-equilibrium statistical model

    International Nuclear Information System (INIS)

    Miao Rongzhi; Wu Guohua

    1986-01-01

    The concept of a quasi-composite system in the process of pre-equilibrium emission is presented in this paper. On the basis of the principle of detailed balance, the existence of the factor [γ_β ω(π_β, 0, ν_β, 0, E-U) g_π,ν] has been proved, taking into account the distinguishability between protons and neutrons. A formula for the rate of complex-particle emission in the pre-equilibrium process is then obtained. The theoretical calculations fit the experimental data quite well; in particular, in the high-energy part of the energy spectrum the agreement is much better than before

  15. Frontier of plasma physics. 'Research network on non-equilibrium and extreme state plasmas'

    International Nuclear Information System (INIS)

    Itoh, Sanae-I.; Fujisawa, Akihide; Kodama, Ryosuke; Sato, Motoyasu; Tanaka, Kazuo A.; Hatakeyama, Rikizo; Itoh, Kimitaka

    2011-01-01

    Plasma physics and fusion science have been applied to a wide variety of plasmas such as nuclear fusion plasmas, high-energy-density plasmas, processing plasmas and nanobio-plasmas. They are pioneering science and technology frontiers such as new energy sources and new functional materials. A large project, 'Research network on non-equilibrium and extreme state plasmas', is being proposed to reassess individual plasma researches from a common view of non-equilibrium extreme plasmas and to promote collaboration among plasma researchers all over the country. In the present review, recent collaborative works related to this project are introduced. (T.I.)

  16. The efficiency of driving chemical reactions by a physical non-equilibrium is kinetically controlled.

    Science.gov (United States)

    Göppel, Tobias; Palyulin, Vladimir V; Gerland, Ulrich

    2016-07-27

    An out-of-equilibrium physical environment can drive chemical reactions into thermodynamically unfavorable regimes. Under prebiotic conditions such a coupling between physical and chemical non-equilibria may have enabled the spontaneous emergence of primitive evolutionary processes. Here, we study the coupling efficiency within a theoretical model that is inspired by recent laboratory experiments, but focuses on generic effects arising whenever reactant and product molecules have different transport coefficients in a flow-through system. In our model, the physical non-equilibrium is represented by a drift-diffusion process, which is a valid coarse-grained description for the interplay between thermophoresis and convection, as well as for many other molecular transport processes. As a simple chemical reaction, we consider a reversible dimerization process, which is coupled to the transport process by different drift velocities for monomers and dimers. Within this minimal model, the coupling efficiency between the non-equilibrium transport process and the chemical reaction can be analyzed in all parameter regimes. The analysis shows that the efficiency depends strongly on the Damköhler number, a parameter that measures the relative timescales associated with the transport and reaction kinetics. Our model and results will be useful for a better understanding of the conditions for which non-equilibrium environments can provide a significant driving force for chemical reactions in a prebiotic setting.
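    The reversible dimerization at the core of this model can be sketched without the transport part: a well-mixed toy version, integrated by forward Euler, plus a Damköhler-number helper of the advective form Da = rate · L / v. The rate constants and the precise Damköhler definition here are hypothetical conventions for illustration, not the paper's.

```python
def damkoehler(rate, length, velocity):
    """Da = (transport time L/v) / (reaction time 1/rate), assumed advective form.
    Da >> 1: reaction-dominated; Da << 1: transport-dominated."""
    return rate * length / velocity

def relax_dimerization(m0, k_on, k_off, dt=1e-3, steps=20000):
    """Forward-Euler relaxation of well-mixed reversible dimerization
    2M <-> D (transport deliberately omitted from this toy version).

    Each dimerization event consumes two monomers, so total monomer
    mass m + 2d is conserved exactly by the update below.
    """
    m, d = m0, 0.0
    for _ in range(steps):
        flux = k_on * m * m - k_off * d  # net dimerization rate
        m -= 2.0 * flux * dt
        d += flux * dt
    return m, d
```

    At long times the toy system reaches detailed balance, d/m² = k_on/k_off; the drift-diffusion coupling in the paper is exactly what pushes the steady state away from this equilibrium ratio.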

  17. Methods of contemporary mathematical statistical physics

    CERN Document Server

    2009-01-01

    This volume presents a collection of courses introducing the reader to the recent progress with attention being paid to laying solid grounds and developing various basic tools. An introductory chapter on lattice spin models is useful as a background for other lectures of the collection. The topics include new results on phase transitions for gradient lattice models (with introduction to the techniques of the reflection positivity), stochastic geometry reformulation of classical and quantum Ising models, the localization/delocalization transition for directed polymers. A general rigorous framework for theory of metastability is presented and particular applications in the context of Glauber and Kawasaki dynamics of lattice models are discussed. A pedagogical account of several recently discussed topics in nonequilibrium statistical mechanics with an emphasis on general principles is followed by a discussion of kinetically constrained spin models that are reflecting important peculiar features of glassy dynamic...

  18. Self-organization of grafted polyelectrolyte layers via the coupling of chemical equilibrium and physical interactions.

    Science.gov (United States)

    Tagliazucchi, Mario; de la Cruz, Mónica Olvera; Szleifer, Igal

    2010-03-23

    The competition between chemical equilibrium, for example, protonation, and physical interactions determines the molecular organization and functionality of biological and synthetic systems. Charge regulation by displacement of acid-base equilibrium induced by changes in the local environment provides a feedback mechanism that controls the balance between electrostatic, van der Waals and steric interactions and molecular organization. Which strategies do responsive systems follow to globally optimize chemical equilibrium and physical interactions? We address this question by theoretically studying model layers of end-grafted polyacids. These layers spontaneously form self-assembled aggregates, presenting domains of controlled local pH and whose morphologies can be manipulated by the composition of the solution in contact with the film. Charge regulation stabilizes micellar domains over a wide range of pH by reducing the local charge in the aggregate at the cost of chemical free energy and gaining in hydrophobic interactions. This balance determines the boundaries between different aggregate morphologies. We show that a qualitatively new form of organization arises from the coupling between physical interactions and protonation equilibrium. This optimization strategy presents itself with polyelectrolytes coexisting in two different and well-defined protonation states. Our results underline the need to consider the coupling between chemical equilibrium and physical interactions due to their highly nonadditive behavior. The predictions provide guidelines for the creation of responsive polymer layers presenting self-organized patterns with functional properties, and they give insights into the understanding of competing interactions in highly inhomogeneous and constrained environments, such as those relevant in nanotechnology and those responsible for the function of biological cells.

  19. Statistical physics, neural networks, brain studies

    International Nuclear Information System (INIS)

    Toulouse, G.

    1999-01-01

    An overview of some aspects of a vast domain, located at the crossroads of physics, biology and computer science is presented: (1) During the last fifteen years, physicists advancing along various pathways have come into contact with biology (computational neurosciences) and engineering (formal neural nets). (2) This move may actually be viewed as one component in a larger picture. A prominent trend of recent years, observable over many countries, has been the establishment of interdisciplinary centers devoted to the study of: cognitive sciences; natural and artificial intelligence; brain, mind and behaviour; perception and action; learning and memory; robotics; man-machine communication, etc. What are the promising lines of development? What opportunities for physicists? An attempt will be made to address such questions and related issues

  20. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a large variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...

  1. What can we learn from noise? - Mesoscopic nonequilibrium statistical physics.

    Science.gov (United States)

    Kobayashi, Kensuke

    2016-01-01

    Mesoscopic systems - small electric circuits working in the quantum regime - offer us a unique experimental stage to explore quantum transport in a tunable and precise way. The purpose of this review is to show how they can contribute to statistical physics. We introduce the significance of fluctuation, or equivalently noise, as noise measurement enables us to address the fundamental aspects of a physical system. The significance of the fluctuation theorem (FT) in statistical physics is noted. We explain what information can be deduced from current noise measurements in mesoscopic systems. As an important application of noise measurement to statistical physics, we describe our experimental work on the current and current noise in an electron interferometer, which is the first experimental test of the FT in the quantum regime. Our attempt will shed new light on the research field of mesoscopic quantum statistical physics.

  2. Symmetry, Invariance and Ontology in Physics and Statistics

    Directory of Open Access Journals (Sweden)

    Julio Michael Stern

    2011-09-01

    This paper has three main objectives: (a) discuss the formal analogy between some important symmetry-invariance arguments used in physics, probability and statistics; specifically, we will focus on Noether's theorem in physics, the maximum entropy principle in probability theory, and de Finetti-type theorems in Bayesian statistics; (b) discuss the epistemological and ontological implications of these theorems, as they are interpreted in physics and statistics; specifically, we will focus on the positivist (in physics) or subjective (in statistics) interpretations vs. the objective interpretations that are suggested by symmetry and invariance arguments; (c) introduce the cognitive constructivism epistemological framework as a solution that overcomes the realism-subjectivism dilemma and its pitfalls. The work of the physicist and philosopher Max Born will be particularly important in our discussion.

  3. Statistical physics and thermodynamics an introduction to key concepts

    CERN Document Server

    Rau, Jochen

    2017-01-01

    Statistical physics and thermodynamics describe the behaviour of systems on the macroscopic scale. Their methods are applicable to a wide range of phenomena: from refrigerators to the interior of stars, from chemical reactions to magnetism. Indeed, of all physical laws, the laws of thermodynamics are perhaps the most universal. This text provides a concise yet thorough introduction to the key concepts which underlie statistical physics and thermodynamics. It begins with a review of classical probability theory and quantum theory, as well as a careful discussion of the notions of information and entropy, prior to embarking on the development of statistical physics proper. The crucial steps leading from the microscopic to the macroscopic domain are rendered transparent. In particular, the laws of thermodynamics are shown to emerge as natural consequences of the statistical framework. While the emphasis is on clarifying the basic concepts, the text also contains many applications and classroom-tested exercises,...

  4. 1. Warsaw School of Statistical Physics - Poster Abstracts

    International Nuclear Information System (INIS)

    2005-01-01

    The abstracts of the posters presented during the '1st Warsaw School of Statistical Physics', held in Kazimierz Dolny, Poland, are presented. They cover different aspects of statistical processes, such as diffusion and fluid hydrodynamics, as well as modern quantum-mechanical methods for their solution

  5. Physics of far-from-equilibrium systems and self-organization

    International Nuclear Information System (INIS)

    Nicolis, G.

    1993-01-01

    The status of self-organization phenomena from the standpoint of the physical sciences is analyzed. Non-linear dynamics and the presence of constraints maintaining the system far from equilibrium are shown to be the basic mechanisms involved in the emergence of these phenomena. Some particularly representative experiments are first presented - thermal convection (Bénard problem), chemical reactions, biological systems - and their explanation through concepts such as order, disorder, non-linearity, irreversibility, stability, bifurcation and symmetry breaking. It is then shown how the self-organization paradigm allows the modelling of problems outside the traditional realm of the physical sciences. 29 figs., 27 refs

  6. Correlated randomness: Some examples of exotic statistical physics

    Indian Academy of Sciences (India)

    Journal of Physics, May 2005, pp. 645–660. Correlated randomness: some examples of exotic statistical physics ... The key idea is that scale invariance is a statement not about algebraic ... Very recently an article appeared in Phys. Rev. ... One quarter of any newspaper with a financial section is filled with economic fluc-.

  7. Introduction to modern theoretical physics. Volume II. Quantum theory and statistical physics

    International Nuclear Information System (INIS)

    Harris, E.G.

    1975-01-01

    The topics discussed include the history and principles, some solvable problems, and symmetry in quantum mechanics, interference phenomena, approximation methods, some applications of nonrelativistic quantum mechanics, relativistic wave equations, quantum theory of radiation, second quantization, elementary particles and their interactions, thermodynamics, equilibrium statistical mechanics and its applications, the kinetic theory of gases, and collective phenomena

  8. The equilibrium theory of inhomogeneous polymers (international series of monographs on physics)

    CERN Document Server

    Fredrickson, Glenn

    2013-01-01

    The Equilibrium Theory of Inhomogeneous Polymers provides an introduction to the field-theoretic methods and computer simulation techniques that are used in the design of structured polymeric fluids. By such methods, the principles that dictate equilibrium self-assembly in systems ranging from block and graft copolymers, to polyelectrolytes, liquid crystalline polymers, and polymer nanocomposites can be established. Building on an introductory discussion of single-polymer statistical mechanics, the book provides a detailed treatment of analytical and numerical techniques for addressing the conformational properties of polymers subjected to spatially-varying potential fields. This problem is shown to be central to the field-theoretic description of interacting polymeric fluids, and models for a number of important polymer systems are elaborated. Chapter 5 serves to unify and expound the topic of self-consistent field theory, which is a collection of analytical and numerical techniques for obtaining solutions o...

  9. Toward a Multi-scale Phase Transition Kinetics Methodology: From Non-Equilibrium Statistical Mechanics to Hydrodynamics

    Science.gov (United States)

    Belof, Jonathan; Orlikowski, Daniel; Wu, Christine; McLaughlin, Keith

    2013-06-01

    Shock and ramp compression experiments are allowing us to probe condensed matter under extreme conditions where phase transitions and other non-equilibrium aspects can now be directly observed, but first-principles simulation of kinetics remains a challenge. A multi-scale approach is presented here, with non-equilibrium statistical mechanical quantities calculated by molecular dynamics (MD) and then leveraged to inform a classical nucleation and growth kinetics model at the hydrodynamic scale. Of central interest is the free energy barrier for the formation of a critical nucleus, with direct NEMD presenting the challenge of the relatively long timescales necessary to resolve nucleation. Rather than attempt to resolve the time-dependent nucleation sequence directly, the methodology derived here is built upon the non-equilibrium work theorem in order to bias the formation of a critical nucleus and thus construct the nucleation and growth rates. Having determined these kinetic terms from MD, a hydrodynamics implementation of Kolmogorov-Johnson-Mehl-Avrami (KJMA) kinetics and metastability is applied to the dynamic compressive freezing of water and compared with recent ramp compression experiments [Dolan et al., Nature (2007)]. Lawrence Livermore National Laboratory is operated by Lawrence Livermore National Security, LLC, for the U.S. Department of Energy, National Nuclear Security Administration under Contract DE-AC52-07NA27344.
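    The KJMA kinetics mentioned here has a compact closed form for the transformed phase fraction; a minimal sketch (this particular (kt)^n parameterization and the parameter values are illustrative conventions, not the authors' hydrodynamic implementation):

```python
import math

def kjma_fraction(t, k, n):
    """Kolmogorov-Johnson-Mehl-Avrami transformed fraction
    X(t) = 1 - exp(-(k t)^n): k sets the rate scale and n the
    nucleation-and-growth (Avrami) exponent."""
    return 1.0 - math.exp(-((k * t) ** n))
```

    X rises sigmoidally from 0 toward 1, which is what lets a hydrodynamic code track how much of a cell has transformed once the MD-derived nucleation and growth rates fix k and n.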

  10. Stochastic optimal control as non-equilibrium statistical mechanics: calculus of variations over density and current

    Science.gov (United States)

    Chernyak, Vladimir Y.; Chertkov, Michael; Bierkens, Joris; Kappen, Hilbert J.

    2014-01-01

    In stochastic optimal control (SOC) one minimizes the average cost-to-go, which consists of the cost-of-control (amount of effort), the cost-of-space (where one wants the system to be) and the target cost (where one wants the system to arrive), for a system participating in forced and controlled Langevin dynamics. We extend the SOC problem by introducing an additional cost-of-dynamics, characterized by a vector potential. We propose a derivation of the generalized gauge-invariant Hamilton-Jacobi-Bellman equation as a variation over density and current, suggest a hydrodynamic interpretation and discuss examples, e.g., ergodic control of a particle-within-a-circle, illustrating non-equilibrium space-time complexity.

  11. Statistical Mechanics of the Human Placenta: A Stationary State of a Near-Equilibrium System in a Linear Regime.

    Science.gov (United States)

    Lecarpentier, Yves; Claes, Victor; Hébert, Jean-Louis; Krokidis, Xénophon; Blanc, François-Xavier; Michel, Francine; Timbely, Oumar

    2015-01-01

    All near-equilibrium systems under a linear regime evolve to stationary states in which there is a constant entropy production rate. In an open chemical system that exchanges matter and energy with the exterior, we can identify both the energy and entropy flows associated with the exchange of matter and energy. This can be achieved by applying statistical mechanics (SM), which links the microscopic properties of a system to its bulk properties. In the case of contractile tissues such as the human placenta, Huxley's equations offer a phenomenological formalism for applying SM. SM was investigated in human placental stem villi (PSV) (n = 40). PSV were stimulated by means of KCl exposure (n = 20) and tetanic electrical stimulation (n = 20). This made it possible to determine statistical entropy (S), internal energy (E), affinity (A), thermodynamic force (A/T) (T: temperature), thermodynamic flow (v) and entropy production rate (A/T × v). We found that PSV operated near equilibrium, i.e., A ≪ 2500 J/mol, and in a stationary linear regime, i.e., A/T varied linearly with v. As v was dramatically low, the entropy production rate, which quantifies the irreversibility of the chemical processes, appeared to be the lowest ever observed in any contractile system.
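    The two quantities used in this analysis are simple to compute from data: the entropy production rate in the de Donder form (A/T) × v, and the slope of the assumed linear law A/T = L·v, fitted through the origin as a check of the linear regime. The sketch below is illustrative, not the authors' analysis code; variable names and the through-origin fit are assumptions.

```python
def entropy_production_rate(affinity, temperature, flow):
    """De Donder form: sigma = (A/T) * v for a single chemical step."""
    return (affinity / temperature) * flow

def linear_regime_slope(forces, flows):
    """Least-squares slope L in the assumed linear law A/T = L * v,
    fitted through the origin: L = sum(f*v) / sum(v*v)."""
    num = sum(f * v for f, v in zip(forces, flows))
    den = sum(v * v for v in flows)
    return num / den
```

    In a genuine linear regime the fitted slope is constant across stimulation conditions; systematic deviations of (force, flow) pairs from the fitted line would signal departure from the near-equilibrium assumption.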

  12. A Concise Introduction to the Statistical Physics of Complex Systems

    CERN Document Server

    Bertin, Eric

    2012-01-01

    This concise primer (based on lectures given at summer schools on complex systems and on a master's degree course in complex systems modeling) will provide graduate students and newcomers to the field with the basic knowledge of the concepts and methods of statistical physics and its potential for application to interdisciplinary topics. Indeed, in recent years, statistical physics has begun to attract the interest of a broad community of researchers in the field of complex system sciences, ranging from biology to the social sciences, economics and computer science. More generally, a growing number of graduate students and researchers feel the need to learn some basic concepts and questions originating in other disciplines without necessarily having to master all of the corresponding technicalities and jargon. Generally speaking, the goals of statistical physics may be summarized as follows: on the one hand to study systems composed of a large number of interacting ‘entities’, and on the other to predict...

  13. Statistical physics of complex systems a concise introduction

    CERN Document Server

    Bertin, Eric

    2016-01-01

    This course-tested primer provides graduate students and non-specialists with a basic understanding of the concepts and methods of statistical physics and demonstrates their wide range of applications to interdisciplinary topics in the field of complex system sciences, including selected aspects of theoretical modeling in biology and the social sciences. Generally speaking, the goals of statistical physics may be summarized as follows: on the one hand to study systems composed of a large number of interacting units, and on the other to predict the macroscopic, collective behavior of the system considered from the perspective of the microscopic laws governing the dynamics of the individual entities. These two goals are essentially also shared by what is now called 'complex systems science', and as such, systems studied in the framework of statistical physics may be considered to be among the simplest examples of complex systems – while also offering a rather well developed mathematical treatment. The second ...

  14. Heuristic versus statistical physics approach to optimization problems

    International Nuclear Information System (INIS)

    Jedrzejek, C.; Cieplinski, L.

    1995-01-01

    Optimization is a crucial ingredient of many calculation schemes in science and engineering. In this paper we assess several classes of methods: heuristic algorithms; methods directly relying on statistical physics, such as the mean-field method and simulated annealing; and Hopfield-type neural networks and genetic algorithms, which are partly related to statistical physics. We perform the analysis for three types of problems: (1) the Travelling Salesman Problem, (2) vector quantization, and (3) the traffic control problem in a multistage interconnection network. In general, heuristic algorithms (except for genetic algorithms) perform better and much faster, but they have to be tailored to each specific problem. The key to improving performance could be to include heuristic features in general-purpose statistical physics methods. (author)

  15. Far from Equilibrium Percolation, Stochastic and Shape Resonances in the Physics of Life

    Directory of Open Access Journals (Sweden)

    Antonio Bianconi

    2011-10-01

    Key physical concepts relevant for the cross-fertilization between condensed matter physics and the physics of life, seen as a collective phenomenon in a system out of equilibrium, are discussed. The onset of life can be driven by: (a) the critical fluctuations at the protonic percolation threshold in membrane transport; (b) the stochastic resonance in biological systems, a mechanism that can exploit external and self-generated noise in order to gain efficiency in signal processing; and (c) the shape resonance (or Fano resonance, or Feshbach resonance) in the association and dissociation processes of bio-molecules, a quantum mechanism that could play a key role in establishing macroscopic quantum coherence in the cell.

  16. A statistical physics perspective on criticality in financial markets

    International Nuclear Information System (INIS)

    Bury, Thomas

    2013-01-01

    Stock markets are complex systems exhibiting collective phenomena and particular features such as synchronization, fluctuations distributed as power laws, non-random structures and similarity to neural networks. Such specific properties suggest that markets operate at a very special point. Financial markets are believed to be critical by analogy with physical systems, but little statistically founded evidence has been given. Through a data-based methodology and comparison to simulations inspired by the statistical physics of complex systems, we show that the Dow Jones and index sets are not rigorously critical. However, financial systems are closer to criticality in the neighborhood of a crash. (paper)

  17. Local equilibrium

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1984-12-15

    From 3-6 September the First International Workshop on Local Equilibrium in Strong Interaction Physics took place in Bad-Honnef at the Physics Centre of the German Physical Society. A number of talks covered the experimental and theoretical investigation of the 'hotspots' effect, both in high energy particle physics and in intermediate energy nuclear physics.

  18. Statistical physics of hard combinatorial optimization: Vertex cover problem

    Science.gov (United States)

    Zhao, Jin-Hua; Zhou, Hai-Jun

    2014-07-01

    Typical-case computational complexity is a research topic at the boundary of computer science, applied mathematics, and statistical physics. In the last twenty years, the replica-symmetry-breaking mean field theory of spin glasses and the associated message-passing algorithms have greatly deepened our understanding of typical-case computational complexity. In this paper, we use the vertex cover problem, a basic nondeterministic-polynomial (NP)-complete combinatorial optimization problem of wide application, as an example to introduce the statistical physics methods and algorithms. We do not go into the technical details but emphasize mainly the intuitive physical meanings of the message-passing equations. An unfamiliar reader should be able to understand to a large extent the physics behind the mean field approaches and to adapt the mean field methods to solving other optimization problems.
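    As a concrete reminder of the problem itself (not of the message-passing machinery the review develops), here is a minimal greedy sketch of vertex cover, the classic algorithm with a factor-2 guarantee: repeatedly pick an uncovered edge and add both endpoints to the cover.

    ```python
    # Greedy 2-approximation for vertex cover: every edge must have at least
    # one endpoint in the returned set, and the set is at most twice as large
    # as the optimal cover.
    def greedy_vertex_cover(edges):
        """Return a vertex set covering every edge in `edges`."""
        cover = set()
        for u, v in edges:
            if u not in cover and v not in cover:
                cover.update((u, v))  # take both endpoints of an uncovered edge
        return cover

    # Small example graph: a path 0-1-2-3
    edges = [(0, 1), (1, 2), (2, 3)]
    cover = greedy_vertex_cover(edges)
    assert all(u in cover or v in cover for u, v in edges)
    ```
    
    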

  19. Cosolutes effects on aqueous two-phase systems equilibrium formation studied by physical approaches.

    Science.gov (United States)

    Bertoluzzo, M Guadalupe; Rigatuso, Rubén; Farruggia, Beatriz; Nerli, Bibiana; Picó, Guillermo

    2007-10-01

    The effect of urea and of the sodium salts of monovalent halides on aqueous polyethylene glycol solutions (polyethylene glycol of molecular mass 1500, 4000, 6000 and 8000) was studied using different physical approaches. The effect of these solutes on the binodal diagram for polyethylene glycol-potassium phosphate was also investigated. The cosolutes significantly affected the water structured around the ethylene chain of polyethylene glycol, inducing its loss. The equilibrium curves for the aqueous two-phase systems were fitted very well by a two-parameter sigmoidal function, the parameters of which are closely related to the cosolute's capacity for making or breaking the ordering of water.

  20. Statistical mechanics

    CERN Document Server

    Schwabl, Franz

    2006-01-01

    The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...

  1. "Statistical Techniques for Particle Physics" (2/4)

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...

  2. "Statistical Techniques for Particle Physics" (1/4)

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...

  3. "Statistical Techniques for Particle Physics" (4/4)

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...

  4. "Statistical Techniques for Particle Physics" (3/4)

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...

  5. Statistical Physics in the Era of Big Data

    Science.gov (United States)

    Wang, Dashun

    2013-01-01

    With the wealth of data provided by a wide range of high-throughput measurement tools and technologies, statistical physics of complex systems is entering a new phase, impacting in a meaningful fashion a wide range of fields, from cell biology to computer science to economics. In this dissertation, by applying tools and techniques developed in…

  6. Renormalization group in statistical physics - momentum and real spaces

    International Nuclear Information System (INIS)

    Yukalov, V.I.

    1988-01-01

    Two variants of the renormalization group approach in statistical physics are considered: the renormalization group in momentum space and the renormalization group in real space. Common properties of these methods and their differences are clarified. A simple model for investigating the crossover between different universality classes is suggested. 27 refs

  7. Academic Training Lecture: Statistical Methods for Particle Physics

    CERN Multimedia

    PH Department

    2012-01-01

    2, 3, 4 and 5 April 2012 Academic Training Lecture  Regular Programme from 11:00 to 12:00 -  Bldg. 222-R-001 - Filtration Plant Statistical Methods for Particle Physics by Glen Cowan (Royal Holloway) The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena.  Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties.  The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.

  8. Heat capacity of the neutron star inner crust within an extended nuclear statistical equilibrium model

    Science.gov (United States)

    Burrello, S.; Gulminelli, F.; Aymard, F.; Colonna, M.; Raduta, Ad. R.

    2015-11-01

    Background: Superfluidity in the crust is a key ingredient for the cooling properties of proto-neutron stars. Present theoretical calculations employ the quasiparticle mean-field Hartree-Fock-Bogoliubov theory with temperature-dependent occupation numbers for the quasiparticle states. Purpose: Finite temperature stellar matter is characterized by a whole distribution of different nuclear species. We want to assess the importance of this distribution on the calculation of the heat capacity in the inner crust. Method: Following a recent work, the Wigner-Seitz cell is mapped into a model with cluster degrees of freedom. The finite temperature distribution is then given by a statistical collection of Wigner-Seitz cells. We additionally introduce pairing correlations in the local density BCS approximation, both in the homogeneous unbound neutron component and in the interface region between clusters and neutrons. Results: The heat capacity is calculated in the different baryonic density conditions corresponding to the inner crust, and in a temperature range varying from 100 keV to 2 MeV. We show that accounting for the cluster distribution has a small effect at intermediate densities, but it considerably affects the heat capacity both close to the outer crust and close to the core. We additionally show that it is very important to consider the temperature evolution of the proton fraction for a quantitatively reliable estimation of the heat capacity. Conclusions: We present the first model of stellar matter containing at the same time a statistical distribution of clusters at finite temperature and pairing correlations in the unbound neutron component. The effect of the nuclear distribution on the superfluid properties can be easily added in future calculations of the neutron star cooling curves. A strong influence of resonance population on the heat capacity at high temperature is observed, which deserves further study within more microscopic calculations.

  9. Non-equilibrium physics and evolution—adaptation, extinction, and ecology: a Key Issues review

    International Nuclear Information System (INIS)

    Kussell, E; Vucelja, M

    2014-01-01

    Evolutionary dynamics in nature constitute an immensely complex non-equilibrium process. We review the application of physical models of evolution, by focusing on adaptation, extinction, and ecology. In each case, we examine key concepts by working through examples. Adaptation is discussed in the context of bacterial evolution, with a view toward the relationship between growth rates, mutation rates, selection strength, and environmental changes. Extinction dynamics for an isolated population are reviewed, with emphasis on the relation between timescales of extinction, population size, and temporally correlated noise. Ecological models are discussed by focusing on the effect of spatial interspecies interactions on diversity. Connections between physical processes—such as diffusion, turbulence, and localization—and evolutionary phenomena are highlighted. (key issues reviews)

  10. Statistical physics a prelude and fugue for engineers

    CERN Document Server

    Piazza, Roberto

    2017-01-01

    This book provides a general introduction to the ideas and methods of statistical mechanics with the principal aim of meeting the needs of Master's students in chemical, mechanical, and materials science engineering. Extensive introductory information is presented on many general physics topics in which students in engineering are inadequately trained, ranging from the Hamiltonian formulation of classical mechanics to basic quantum mechanics, electromagnetic fields in matter, intermolecular forces, and transport phenomena. Since engineers should be able to apply physical concepts, the book also focuses on the practical applications of statistical physics to material science and to cutting-edge technologies, with brief but informative sections on, for example, interfacial properties, disperse systems, nucleation, magnetic materials, superfluidity, and ultralow temperature technologies. The book adopts a graded approach to learning, the opening four basic-level chapters being followed by advanced “starred”...

  11. Gauge/gravity duality. From quantum phase transitions towards out-of-equilibrium physics

    International Nuclear Information System (INIS)

    Ngo Thanh, Hai

    2011-01-01

    In this dissertation we use gauge/gravity duality to investigate various phenomena of strongly coupled field theories. Of special interest are quantum phase transitions, quantum critical points, transport phenomena of charges and the thermalization process of strongly coupled medium. The systems studied in this thesis might be used as models for describing condensed matter physics in a superfluid phase near the quantum critical point and the physics of quark-gluon plasma (QGP), a deconfinement phase of QCD, which has been recently created at the Relativistic Heavy Ion Collider (RHIC). Moreover, we follow the line of considering different gravity setups whose dual field descriptions show interesting phenomena of systems in thermal equilibrium, slightly out-of-equilibrium and far-from-equilibrium. We first focus on systems in equilibrium and construct holographic superfluids at finite baryon and isospin charge densities. For that we use two different approaches, the bottom-up with an U(2) Einstein-Yang-Mills theory with back-reaction and the top-down approach with a D3/D7 brane setup with two coincident D7-brane probes. In both cases we observe phase transitions from a normal to a superfluid phase at finite and also at zero temperature. In our setup, the gravity duals of superfluids are Anti-de Sitter black holes which develop vector-hair. Studying the order of phase transitions at zero temperature, in the D3/D7 brane setup we always find a second order phase transition, while in the Einstein-Yang-Mills theory, depending on the strength of the back-reaction, we obtain a continuous or first order transition. We then move to systems which are slightly out-of-equilibrium. Using the D3/D7 brane setup with N_c coincident D3-branes and N_f coincident D7-brane probes, we compute transport coefficients associated with massive N=2 supersymmetric hypermultiplet fields propagating through an N=4 SU(N_c) super Yang-Mills plasma in the limit of N_f ≪ N_c. Introducing a baryon

  12. Gauge/gravity duality. From quantum phase transitions towards out-of-equilibrium physics

    Energy Technology Data Exchange (ETDEWEB)

    Ngo Thanh, Hai

    2011-05-02

    In this dissertation we use gauge/gravity duality to investigate various phenomena of strongly coupled field theories. Of special interest are quantum phase transitions, quantum critical points, transport phenomena of charges and the thermalization process of strongly coupled medium. The systems studied in this thesis might be used as models for describing condensed matter physics in a superfluid phase near the quantum critical point and the physics of quark-gluon plasma (QGP), a deconfinement phase of QCD, which has been recently created at the Relativistic Heavy Ion Collider (RHIC). Moreover, we follow the line of considering different gravity setups whose dual field descriptions show interesting phenomena of systems in thermal equilibrium, slightly out-of-equilibrium and far-from-equilibrium. We first focus on systems in equilibrium and construct holographic superfluids at finite baryon and isospin charge densities. For that we use two different approaches, the bottom-up with an U(2) Einstein-Yang-Mills theory with back-reaction and the top-down approach with a D3/D7 brane setup with two coincident D7-brane probes. In both cases we observe phase transitions from a normal to a superfluid phase at finite and also at zero temperature. In our setup, the gravity duals of superfluids are Anti-de Sitter black holes which develop vector-hair. Studying the order of phase transitions at zero temperature, in the D3/D7 brane setup we always find a second order phase transition, while in the Einstein-Yang-Mills theory, depending on the strength of the back-reaction, we obtain a continuous or first order transition. We then move to systems which are slightly out-of-equilibrium. Using the D3/D7 brane setup with N_c coincident D3-branes and N_f coincident D7-brane probes, we compute transport coefficients associated with massive N=2 supersymmetric hypermultiplet fields propagating through an N=4 SU(N_c) super Yang-Mills plasma in the limit of N_f <

  13. Statistical Methods for Particle Physics (4/4)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena. Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties. The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.

  14. Statistical Methods for Particle Physics (1/4)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena. Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties. The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.

  15. Statistical Methods for Particle Physics (2/4)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena. Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties. The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.

  16. Statistical Methods for Particle Physics (3/4)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The series of four lectures will introduce some of the important statistical methods used in Particle Physics, and should be particularly relevant to those involved in the analysis of LHC data. The lectures will include an introduction to statistical tests, parameter estimation, and the application of these tools to searches for new phenomena. Both frequentist and Bayesian methods will be described, with particular emphasis on treatment of systematic uncertainties. The lectures will also cover unfolding, that is, estimation of a distribution in binned form where the variable in question is subject to measurement errors.

  17. Monte Carlo Simulation in Statistical Physics An Introduction

    CERN Document Server

    Binder, Kurt

    2010-01-01

    Monte Carlo Simulation in Statistical Physics deals with the computer simulation of many-body systems in condensed-matter physics and related fields of physics, chemistry and beyond (traffic flows, stock market fluctuations, etc.). Using random numbers generated by a computer, probability distributions are calculated, allowing the estimation of the thermodynamic properties of various systems. This book describes the theoretical background to several variants of these Monte Carlo methods and gives a systematic presentation from which newcomers can learn to perform such simulations and to analyze their results. The fifth edition covers Classical as well as Quantum Monte Carlo methods. Furthermore a new chapter on the sampling of free-energy landscapes has been added. To help students in their work a special web server has been installed to host programs and discussion groups (http://wwwcp.tphys.uni-heidelberg.de). Prof. Binder was awarded the Berni J. Alder CECAM Award for Computational Physics 2001 as well ...
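    The flavor of the Monte Carlo methods the book introduces can be sketched with a minimal Metropolis simulation of a 1D Ising chain with periodic boundaries; the temperature, system size and step count below are arbitrary illustrative choices, not an example from the book.

    ```python
    # Metropolis Monte Carlo for a 1D Ising chain (J = 1, periodic boundaries):
    # propose single-spin flips and accept with probability min(1, exp(-dE/T)).
    import math
    import random

    def metropolis_ising_1d(n=50, T=2.0, steps=10000, seed=1):
        random.seed(seed)
        spins = [1] * n                      # start from the all-up configuration
        for _ in range(steps):
            i = random.randrange(n)
            # Energy change for flipping spin i (nearest-neighbour coupling)
            dE = 2 * spins[i] * (spins[(i - 1) % n] + spins[(i + 1) % n])
            if dE <= 0 or random.random() < math.exp(-dE / T):
                spins[i] = -spins[i]         # accept the flip
        return sum(spins) / n                # magnetization per spin, in [-1, 1]

    m = metropolis_ising_1d()
    print(m)
    ```
    
    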

  18. Thermodynamics, Gibbs Method and Statistical Physics of Electron Gases

    CERN Document Server

    Askerov, Bahram M

    2010-01-01

    This book deals with theoretical thermodynamics and the statistical physics of electron and particle gases. While treating the laws of thermodynamics from both classical and quantum theoretical viewpoints, it posits that the basis of the statistical theory of macroscopic properties of a system is the microcanonical distribution of isolated systems, from which all canonical distributions stem. To calculate the free energy, the Gibbs method is applied to ideal and non-ideal gases, and also to a crystalline solid. Considerable attention is paid to the Fermi-Dirac and Bose-Einstein quantum statistics and their application to different quantum gases, and the electron gas in both metals and semiconductors is considered in a nonequilibrium state. A separate chapter treats the statistical theory of thermodynamic properties of an electron gas in a quantizing magnetic field.

  19. On Dobrushin's way from probability theory to statistical physics

    CERN Document Server

    Minlos, R A; Suhov, Yu M; Suhov, Yu

    2000-01-01

    R. Dobrushin worked in several branches of mathematics (probability theory, information theory), but his deepest influence was on mathematical physics. He was one of the founders of the rigorous study of statistical physics. When Dobrushin began working in that direction in the early sixties, only a few people worldwide were thinking along the same lines. Now there is an army of researchers in the field. This collection is devoted to the memory of R. L. Dobrushin. The authors who contributed to this collection knew him quite well and were his colleagues. The title, "On Dobrushin's Way", is mea

  20. Foundations of Complex Systems Nonlinear Dynamics, Statistical Physics, and Prediction

    CERN Document Server

    Nicolis, Gregoire

    2007-01-01

    Complexity is emerging as a post-Newtonian paradigm for approaching a large body of phenomena of concern at the crossroads of physical, engineering, environmental, life and human sciences from a unifying point of view. This book outlines the foundations of modern complexity research as it arose from the cross-fertilization of ideas and tools from nonlinear science, statistical physics and numerical simulation. It is shown how these developments lead to an understanding, both qualitative and quantitative, of the complex systems encountered in nature and in everyday experience and, conversely, h

  1. STATISTICAL CHALLENGES FOR SEARCHES FOR NEW PHYSICS AT THE LHC.

    Energy Technology Data Exchange (ETDEWEB)

    CRANMER, K.

    2005-09-12

    Because the emphasis of the LHC is on 5σ discoveries and the LHC environment induces high systematic errors, many of the common statistical procedures used in High Energy Physics are not adequate. I review the basic ingredients of LHC searches, the sources of systematics, and the performance of several methods. Finally, I indicate the methods that seem most promising for the LHC and areas that are in need of further study.

  2. Representative volume size: A comparison of statistical continuum mechanics and statistical physics

    Energy Technology Data Exchange (ETDEWEB)

    AIDUN,JOHN B.; TRUCANO,TIMOTHY G.; LO,CHI S.; FYE,RICHARD M.

    1999-05-01

    In this combination background and position paper, the authors argue that careful work is needed to develop accurate methods for relating the results of fine-scale numerical simulations of material processes to meaningful values of macroscopic properties for use in constitutive models suitable for finite element solid mechanics simulations. To provide a definite context for this discussion, the problem is couched in terms of the lack of general objective criteria for identifying the size of the representative volume (RV) of a material. The objective of this report is to lay out at least the beginnings of an approach for applying results and methods from statistical physics to develop concepts and tools necessary for determining the RV size, as well as alternatives to RV volume-averaging for situations in which the RV is unmanageably large. The background necessary to understand the pertinent issues and statistical physics concepts is presented.

  3. A statistical mechanical interpretation of algorithmic information theory: Total statistical mechanical interpretation based on physical argument

    International Nuclear Information System (INIS)

    Tadaki, Kohtaro

    2010-01-01

    The statistical mechanical interpretation of algorithmic information theory (AIT, for short) was introduced and developed by our former works [K. Tadaki, Local Proceedings of CiE 2008, pp. 425-434, 2008] and [K. Tadaki, Proceedings of LFCS'09, Springer's LNCS, vol. 5407, pp. 422-440, 2009], where we introduced the notion of thermodynamic quantities, such as the partition function Z(T), free energy F(T), energy E(T), statistical mechanical entropy S(T), and specific heat C(T), into AIT. We then discovered that, in this interpretation, the temperature T equals the partial randomness of the values of all these thermodynamic quantities, where the notion of partial randomness is a stronger representation of the compression rate by means of program-size complexity. Furthermore, we showed that this situation holds for the temperature T itself, which is one of the most typical thermodynamic quantities. Namely, we showed that, for each of the thermodynamic quantities Z(T), F(T), E(T), and S(T) above, the computability of its value at temperature T gives a sufficient condition for T ∈ (0,1) to satisfy the condition that the partial randomness of T equals T. In this paper, based on a physical argument on the same level of mathematical rigor as normal statistical mechanics in physics, we develop a total statistical mechanical interpretation of AIT which realizes a perfect correspondence to normal statistical mechanics. We do this by identifying a microcanonical ensemble in the framework of AIT. As a result, we clarify the statistical mechanical meaning of the thermodynamic quantities of AIT.

  4. Advanced statistics to improve the physical interpretation of atomization processes

    International Nuclear Information System (INIS)

    Panão, Miguel R.O.; Radu, Lucian

    2013-01-01

    Highlights: ► Finite pdf mixtures improve the physical interpretation of sprays. ► A Bayesian approach using an MCMC algorithm is used to find the best finite mixture. ► The statistical method identifies multiple droplet clusters in a spray. ► Multiple drop clusters are eventually associated with multiple atomization mechanisms. ► The spray is described by its drop size distribution and not only its moments. -- Abstract: This paper reports an analysis of the physics of atomization processes using advanced statistical tools, namely finite mixtures of probability density functions, whose best fit is found using a Bayesian approach based on a Markov chain Monte Carlo (MCMC) algorithm. This approach takes into account possible multimodality and heterogeneities in drop size distributions. It therefore provides information about the complete probability density function of multimodal drop size distributions and allows the identification of subgroups in the heterogeneous data, which improves the physical interpretation of atomization processes. Moreover, it overcomes the limitations of analyzing spray droplet characteristics through moments alone, which can obscure the different natures of droplet formation. Finally, the method is applied to physically interpret a case study based on multijet atomization processes
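    As an illustrative sketch of the finite-mixture idea described above, the following fits a two-component Gaussian mixture to synthetic bimodal "drop size" data by expectation-maximization, a simpler point-estimate analogue of the Bayesian MCMC approach used in the paper; the data, the Gaussian component shapes, and all parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "drop size" data: two overlapping clusters (bimodal),
# standing in for a multimodal drop size distribution.
data = np.concatenate([rng.normal(30.0, 5.0, 600),
                       rng.normal(80.0, 10.0, 400)])

def em_two_gaussians(x, n_iter=200):
    """Fit a 2-component Gaussian mixture by EM (illustrative sketch)."""
    # Crude initialisation from the data quantiles.
    w = np.array([0.5, 0.5])
    mu = np.quantile(x, [0.25, 0.75])
    sig = np.array([x.std(), x.std()])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point.
        pdf = np.stack([w[k] / (sig[k] * np.sqrt(2 * np.pi))
                        * np.exp(-0.5 * ((x - mu[k]) / sig[k]) ** 2)
                        for k in range(2)])
        r = pdf / pdf.sum(axis=0)
        # M-step: re-estimate weights, means and widths.
        nk = r.sum(axis=1)
        w = nk / len(x)
        mu = (r * x).sum(axis=1) / nk
        sig = np.sqrt((r * (x - mu[:, None]) ** 2).sum(axis=1) / nk)
    return w, mu, sig

w, mu, sig = em_two_gaussians(data)
```

With well-separated modes, the recovered component means identify the two droplet subgroups directly, which is the kind of cluster identification the abstract refers to.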

  5. Statistical physics of networks, information and complex systems

    Energy Technology Data Exchange (ETDEWEB)

    Ecke, Robert E [Los Alamos National Laboratory

    2009-01-01

    In this project we explore the mathematical methods and concepts of statistical physics that are finding abundant applications across the scientific and technological spectrum from soft condensed matter systems and bio-informatics to economic and social systems. Our approach exploits the considerable similarity of concepts between statistical physics and computer science, allowing for a powerful multi-disciplinary approach that draws its strength from cross-fertilization and multiple interactions of researchers with different backgrounds. The work on this project takes advantage of the newly appreciated connection between computer science and statistics and addresses important problems in data storage, decoding, optimization, the information processing properties of the brain, the interface between quantum and classical information science, the verification of large software programs, modeling of complex systems including disease epidemiology, resource distribution issues, and the nature of highly fluctuating complex systems. Common themes that the project has been emphasizing are (i) neural computation, (ii) network theory and its applications, and (iii) a statistical physics approach to information theory. The project's efforts focus on the general problem of optimization and variational techniques, algorithm development and information theoretic approaches to quantum systems. These efforts are responsible for fruitful collaborations and the nucleation of science efforts that span multiple divisions such as EES, CCS, 0, T, ISR and P. This project supports the DOE mission in Energy Security and Nuclear Non-Proliferation by developing novel information science tools for communication, sensing, and interacting complex networks such as the internet or energy distribution system. The work also supports programs in Threat Reduction and Homeland Security.

  6. Literature in Focus: Statistical Methods in Experimental Physics

    CERN Multimedia

    2007-01-01

    Frederick James was a high-energy physicist who became the CERN "expert" on statistics and is now well-known around the world, in part for this famous text. The first edition of Statistical Methods in Experimental Physics was originally co-written with four other authors and was published in 1971 by North Holland (now an imprint of Elsevier). It became such an important text that demand for it has continued for more than 30 years. Fred has updated it and it was released in a second edition by World Scientific in 2006. It is still a top seller and there is no exaggeration in calling it «the» reference on the subject. A full review of the title appeared in the October CERN Courier. Come and meet the author to hear more about how this book has flourished during its 35-year lifetime. Frederick James Statistical Methods in Experimental Physics Monday, 26th of November, 4 p.m. Council Chamber (Bldg. 503-1-001) The author will be introduced...

  7. Lecture 2: Equilibrium statistical treatment of angular momenta associated with collective modes in fission and heavy-ion reactions

    International Nuclear Information System (INIS)

    Moretto, L.G.

    1979-01-01

    The angular momentum effects in deep inelastic processes and fission have been studied in the limit of statistical equilibrium. The model consists of two touching liquid drop spheres. Angular momentum fractionation has been found to occur along the mass asymmetry coordinate. If neutron competition is included (i.e., in compound nucleus formation and fission), the fractionation occurs only to a slight degree, while extensive fractionation is predicted if no neutron competition occurs (i.e., in fusion--fission without compound nucleus formation). Thermal fluctuations in the angular momentum are predicted to occur due to degrees of freedom which can bear angular momentum, like wriggling, tilting, bending, and twisting. The coupling of relative motion to one of the wriggling modes, leading to fluctuations between orbital and intrinsic angular momentum, is considered first. Next the effect of the excitation of all the collective modes on the fragment spin is treated. General expressions for the first and second moments of the fragment spins are derived as a function of total angular momentum and the limiting behavior at large and small total angular momentum is examined. Furthermore, the effect of collective mode excitation on the fragment spin alignment is explored and is discussed in light of recent experiments. The relevance of the present study to the measured first and second moments of the γ-ray multiplicities as well as to sequential fission angular distributions is illustrated by applying the results of the theory to a well studied heavy ion reaction

  8. Statistical methods for data analysis in particle physics

    CERN Document Server

    Lista, Luca

    2017-01-01

    This concise set of course-based notes provides the reader with the main concepts and tools needed to perform statistical analyses of experimental data, in particular in the field of high-energy physics (HEP). First, the book provides an introduction to probability theory and basic statistics, mainly intended as a refresher from readers’ advanced undergraduate studies, but also to help them clearly distinguish between the Frequentist and Bayesian approaches and interpretations in subsequent applications. More advanced concepts and applications are gradually introduced, culminating in the chapter on both discoveries and upper limits, as many applications in HEP concern hypothesis testing, where the main goal is often to provide better and better limits so as to eventually be able to distinguish between competing hypotheses, or to rule out some of them altogether. Many worked-out examples will help newcomers to the field and graduate students alike understand the pitfalls involved in applying theoretical co...

  9. Inverse statistical physics of protein sequences: a key issues review.

    Science.gov (United States)

    Cocco, Simona; Feinauer, Christoph; Figliuzzi, Matteo; Monasson, Rémi; Weigt, Martin

    2018-03-01

    In the course of evolution, proteins undergo important changes in their amino acid sequences, while their three-dimensional folded structure and their biological function remain remarkably conserved. Thanks to modern sequencing techniques, sequence data accumulate at unprecedented pace. This provides large sets of so-called homologous, i.e. evolutionarily related protein sequences, to which methods of inverse statistical physics can be applied. Using sequence data as the basis for the inference of Boltzmann distributions from samples of microscopic configurations or observables, it is possible to extract information about evolutionary constraints and thus protein function and structure. Here we give an overview over some biologically important questions, and how statistical-mechanics inspired modeling approaches can help to answer them. Finally, we discuss some open questions, which we expect to be addressed over the next years.
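    One concrete instance of the inverse approach described above is naive mean-field direct coupling analysis, in which couplings between sequence positions are estimated from the inverse of the correlation matrix of one-hot encoded sequences. The toy "alignment" below, with one artificially covarying pair of positions standing in for a structural contact, is entirely an illustrative assumption:

```python
import numpy as np

# Toy "multiple sequence alignment": 4 positions, alphabet {A,C,G,T}.
# Positions 0 and 3 are constructed to covary; everything here is an
# illustrative assumption, not real protein data.
rng = np.random.default_rng(1)
L, N = 4, 2000
msa = rng.integers(0, 4, size=(N, L))
msa[:, 3] = (msa[:, 0] + rng.integers(0, 2, N)) % 4  # couple sites 0 and 3

# One-hot encode, dropping the last state per column (a gauge choice).
q = 3
X = np.zeros((N, L * q))
for i in range(L):
    for a in range(q):
        X[:, i * q + a] = (msa[:, i] == a)

# Naive mean-field inversion: couplings ~ -(C + eps*I)^{-1},
# with a small ridge term for numerical stability.
C = np.cov(X, rowvar=False)
J = -np.linalg.inv(C + 1e-2 * np.eye(L * q))

# Score each pair of positions by the Frobenius norm of its coupling block.
scores = np.zeros((L, L))
for i in range(L):
    for j in range(L):
        if i != j:
            block = J[i * q:(i + 1) * q, j * q:(j + 1) * q]
            scores[i, j] = np.linalg.norm(block)

top_pair = np.unravel_index(np.argmax(scores), scores.shape)
```

The strongest coupling block singles out the covarying pair, mimicking how such methods predict residue contacts from homologous sequence data.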

  10. Statistical methods for data analysis in particle physics

    CERN Document Server

    AUTHOR|(CDS)2070643

    2015-01-01

    This concise set of course-based notes provides the reader with the main concepts and tools to perform statistical analysis of experimental data, in particular in the field of high-energy physics (HEP). First, an introduction to probability theory and basic statistics is given, mainly as a reminder from advanced undergraduate studies, but also with a view to clearly distinguishing the Frequentist and Bayesian approaches and interpretations in subsequent applications. More advanced concepts and applications are gradually introduced, culminating in the chapter on upper limits, as many applications in HEP concern hypothesis testing, where often the main goal is to provide better and better limits so as eventually to distinguish between competing hypotheses or to rule out some of them altogether. Many worked examples will help newcomers to the field and graduate students understand the pitfalls in applying theoretical concepts to actual data

  11. Topics in statistical data analysis for high-energy physics

    International Nuclear Information System (INIS)

    Cowan, G.

    2011-01-01

    These lectures concern two topics that are becoming increasingly important in the analysis of high-energy physics data: Bayesian statistics and multivariate methods. In the Bayesian approach, we extend the interpretation of probability not only to cover the frequency of repeatable outcomes but also to include a degree of belief. In this way we are able to associate probability with a hypothesis and thus to answer directly questions that cannot be addressed easily with traditional frequentist methods. In multivariate analysis, we try to exploit as much information as possible from the characteristics that we measure for each event to distinguish between event types. In particular we will look at a method that has gained popularity in high-energy physics in recent years: the boosted decision tree. Finally, we give a brief sketch of how multivariate methods may be applied in a search for a new signal process. (author)
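    The boosted decision tree mentioned above can be sketched in its simplest form: AdaBoost over one-variable threshold cuts ("decision stumps") on a toy two-variable signal/background sample. The data and all parameters are illustrative assumptions, not the lecture's example.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy event sample: "signal" (y=+1) and "background" (y=-1) drawn from
# shifted Gaussians, standing in for measured event characteristics.
n = 500
X = np.vstack([rng.normal(+1.0, 1.0, (n, 2)), rng.normal(-1.0, 1.0, (n, 2))])
y = np.concatenate([np.ones(n), -np.ones(n)])

def best_stump(X, y, w):
    """Weighted decision stump: best threshold cut on one variable."""
    best = (None, None, None, np.inf)  # (feature, threshold, sign, error)
    for f in range(X.shape[1]):
        for t in np.quantile(X[:, f], np.linspace(0.05, 0.95, 19)):
            for s in (+1, -1):
                pred = s * np.sign(X[:, f] - t)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (f, t, s, err)
    return best

def adaboost(X, y, rounds=20):
    w = np.full(len(y), 1.0 / len(y))
    ensemble = []
    for _ in range(rounds):
        f, t, s, err = best_stump(X, y, w)
        err = max(err, 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)   # weight of this stump
        pred = s * np.sign(X[:, f] - t)
        w *= np.exp(-alpha * y * pred)          # boost misclassified events
        w /= w.sum()
        ensemble.append((alpha, f, t, s))
    return ensemble

def score(ensemble, X):
    return sum(a * s * np.sign(X[:, f] - t) for a, f, t, s in ensemble)

ens = adaboost(X, y)
accuracy = np.mean(np.sign(score(ens, X)) == y)
```

The ensemble score plays the role of the multivariate discriminant: a cut on it separates signal-like from background-like events better than any single variable alone.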

  12. Introduction to statistical physics and to computer simulations

    CERN Document Server

    Casquilho, João Paulo

    2015-01-01

    Rigorous and comprehensive, this textbook introduces undergraduate students to simulation methods in statistical physics. The book covers a number of topics, including the thermodynamics of magnetic and electric systems; the quantum-mechanical basis of magnetism; ferrimagnetism, antiferromagnetism, spin waves and magnons; liquid crystals as a non-ideal system of technological relevance; and diffusion in an external potential. It also covers hot topics such as cosmic microwave background, magnetic cooling and Bose-Einstein condensation. The book provides an elementary introduction to simulation methods through algorithms in pseudocode for random walks, the 2D Ising model, and a model liquid crystal. Any formalism is kept simple and derivations are worked out in detail to ensure the material is accessible to students from subjects other than physics.

  13. Implementation of statistical analysis methods for medical physics data

    International Nuclear Information System (INIS)

    Teixeira, Marilia S.; Pinto, Nivia G.P.; Barroso, Regina C.; Oliveira, Luis F.

    2009-01-01

    The objective of biomedical research with different radiation natures is to contribute to the understanding of the basic physics and biochemistry of biological systems, the diagnosis of disease and the development of therapeutic techniques. The main benefits are: the cure of tumors through therapy, the early detection of diseases through diagnostics, the use of radiation as a prophylactic measure in blood transfusion, etc. A better understanding of the biological interactions occurring after exposure to radiation is therefore necessary for the optimization of therapeutic procedures and of strategies for the reduction of radioinduced effects. The applied physics group of the Physics Institute of UERJ has been working on the characterization of biological samples (human tissues, teeth, saliva, soil, plants, sediments, air, water, organic matrices, ceramics, fossil material, among others) using X-ray diffraction and X-ray fluorescence. The application of these techniques to the measurement, analysis and interpretation of the characteristics of biological tissues is attracting considerable interest in Medical and Environmental Physics. All quantitative data analysis must begin with the calculation of descriptive statistics (means and standard deviations) in order to obtain a preliminary notion of what the analysis will reveal. It is well known that the high standard deviations found in experimental measurements of biological samples can be attributed to biological factors, due to the specific characteristics of each individual (age, gender, environment, alimentary habits, etc.). The main objective of this work is the development of a program applying specific statistical methods to the optimization of experimental data analysis. Since the specialized programs for this analysis are proprietary, another objective of this work is the implementation of a code which is free and can be shared with other research groups. As the program developed since the...

  14. Adsorption of diclofenac and nimesulide on activated carbon: Statistical physics modeling and effect of adsorbate size

    Science.gov (United States)

    Sellaoui, Lotfi; Mechi, Nesrine; Lima, Éder Cláudio; Dotto, Guilherme Luiz; Ben Lamine, Abdelmottaleb

    2017-10-01

    Based on statistical physics elements, the equilibrium adsorption of diclofenac (DFC) and nimesulide (NM) on activated carbon was analyzed by a multilayer model with saturation. The paper aimed to describe the adsorption process experimentally and theoretically, and to study the effect of adsorbate size using the model parameters. From numerical simulation, the number of molecules per site showed that the adsorbate molecules (DFC and NM) were mostly anchored on both sides of the pore walls. The increase in receptor-site density suggested that additional sites appeared during the process and participated in DFC and NM adsorption. The description of the adsorption energy behavior indicated that the process was physisorption. Finally, from a correlation of the model parameters, the size effect of the adsorbate was deduced, indicating that molecule dimension has a negligible effect on DFC and NM adsorption.
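    Statistical-physics isotherm models of this kind yield parameters with direct physical meaning, such as the number of molecules per site n and the receptor-site density Nm. As a minimal sketch, the following fits a monolayer Hill-type isotherm Q(c) = n·Nm / (1 + (c_half/c)^n) to synthetic data by brute-force least squares; the multilayer model with saturation used in the paper has a more elaborate closed form, and all numerical values here are assumptions.

```python
import numpy as np

# Synthetic equilibrium isotherm data Q(c) generated from a Hill-type
# monolayer model Q = n*Nm / (1 + (c_half/c)^n). The saturation value
# is n*Nm; all parameter values below are illustrative assumptions.
rng = np.random.default_rng(3)
n_true, Nm_true, ch_true = 2.0, 50.0, 10.0
c = np.linspace(1.0, 100.0, 40)
Q_obs = (n_true * Nm_true / (1 + (ch_true / c) ** n_true)
         + rng.normal(0, 0.5, c.size))

def hill(c, n, Nm, ch):
    return n * Nm / (1 + (ch / c) ** n)

# Brute-force least squares over a parameter grid (no SciPy needed).
best, best_sse = None, np.inf
for n_ in np.linspace(0.5, 4.0, 36):
    for Nm_ in np.linspace(10, 100, 46):
        for ch_ in np.linspace(1, 30, 30):
            sse = np.sum((hill(c, n_, Nm_, ch_) - Q_obs) ** 2)
            if sse < best_sse:
                best, best_sse = (n_, Nm_, ch_), sse

n_fit, Nm_fit, ch_fit = best
```

A fitted n greater than 1 would be read, as in the abstract, as several molecules anchoring per receptor site.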

  15. A New Approach to Monte Carlo Simulations in Statistical Physics

    Science.gov (United States)

    Landau, David P.

    2002-08-01

    Monte Carlo simulations [1] have become a powerful tool for the study of diverse problems in statistical/condensed matter physics. Standard methods sample the probability distribution for the states of the system, most often in the canonical ensemble, and over the past several decades enormous improvements have been made in performance. Nonetheless, difficulties arise near phase transitions, due to critical slowing down near 2nd order transitions and to metastability near 1st order transitions, and these complications limit the applicability of the method. We shall describe a new Monte Carlo approach [2] that uses a random walk in energy space to determine the density of states directly. Once the density of states is known, all thermodynamic properties can be calculated. This approach can be extended to multi-dimensional parameter spaces and should be effective for systems with complex energy landscapes, e.g., spin glasses, protein folding models, etc. Generalizations should produce a broadly applicable optimization tool. 1. A Guide to Monte Carlo Simulations in Statistical Physics, D. P. Landau and K. Binder (Cambridge U. Press, Cambridge, 2000). 2. Fugao Wang and D. P. Landau, Phys. Rev. Lett. 86, 2050 (2001); Phys. Rev. E64, 056101-1 (2001).

  16. Utilization of a Microcomputer for the Study of an Iodine Oxidation and Equilibrium Reaction: A Physical Chemistry Experiment.

    Science.gov (United States)

    Julien, L. M.

    1984-01-01

    Describes a physical chemistry experiment which incorporates the use of a microcomputer to enhance understanding of combined kinetic and equilibrium phenomena, to increase experimental capabilities when working with large numbers of students and limited equipment, and for the student to develop a better understanding of experimental design. (JN)

  17. Statistical physics approach to earthquake occurrence and forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Arcangelis, Lucilla de [Department of Industrial and Information Engineering, Second University of Naples, Aversa (CE) (Italy); Godano, Cataldo [Department of Mathematics and Physics, Second University of Naples, Caserta (Italy); Grasso, Jean Robert [ISTerre, IRD-CNRS-OSUG, University of Grenoble, Saint Martin d’Héres (France); Lippiello, Eugenio, E-mail: eugenio.lippiello@unina2.it [Department of Mathematics and Physics, Second University of Naples, Caserta (Italy)

    2016-04-25

    There is striking evidence that the dynamics of the Earth crust is controlled by a wide variety of mutually dependent mechanisms acting at different spatial and temporal scales. The interplay of these mechanisms produces instabilities in the stress field, leading to abrupt energy releases, i.e., earthquakes. As a consequence, the evolution towards instability before a single event is very difficult to monitor. On the other hand, collective behavior in stress transfer and relaxation within the Earth crust leads to emergent properties described by stable phenomenological laws for a population of many earthquakes in size, time and space domains. This observation has stimulated a statistical mechanics approach to earthquake occurrence, applying ideas and methods as scaling laws, universality, fractal dimension, renormalization group, to characterize the physics of earthquakes. In this review we first present a description of the phenomenological laws of earthquake occurrence which represent the frame of reference for a variety of statistical mechanical models, ranging from the spring-block to more complex fault models. Next, we discuss the problem of seismic forecasting in the general framework of stochastic processes, where seismic occurrence can be described as a branching process implementing space–time-energy correlations between earthquakes. In this context we show how correlations originate from dynamical scaling relations between time and energy, able to account for universality and provide a unifying description for the phenomenological power laws. Then we discuss how branching models can be implemented to forecast the temporal evolution of the earthquake occurrence probability and allow to discriminate among different physical mechanisms responsible for earthquake triggering. In particular, the forecasting problem will be presented in a rigorous mathematical framework, discussing the relevance of the processes acting at different temporal scales for...

  18. Physics-based preconditioning and the Newton-Krylov method for non-equilibrium radiation diffusion

    International Nuclear Information System (INIS)

    Mousseau, V.A.; Knoll, D.A.; Rider, W.J.

    2000-01-01

    An algorithm is presented for the solution of the time dependent reaction-diffusion systems which arise in non-equilibrium radiation diffusion applications. This system of nonlinear equations is solved by coupling three numerical methods, Jacobian-free Newton-Krylov, operator splitting, and multigrid linear solvers. An inexact Newton's method is used to solve the system of nonlinear equations. Since building the Jacobian matrix for problems of interest can be challenging, the authors employ a Jacobian-free implementation of Newton's method, where the action of the Jacobian matrix on a vector is approximated by a first order Taylor series expansion. Preconditioned generalized minimal residual (PGMRES) is the Krylov method used to solve the linear systems that come from the iterations of Newton's method. The preconditioner in this solution method is constructed using a physics-based divide and conquer approach, often referred to as operator splitting. This solution procedure inverts the scalar elliptic systems that make up the preconditioner using simple multigrid methods. The preconditioner also addresses the strong coupling between equations with local 2 x 2 block solves. The intra-cell coupling is applied after the inter-cell coupling has already been addressed by the elliptic solves. Results are presented using this solution procedure that demonstrate its efficiency while incurring minimal memory requirements
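    The core of the Jacobian-free approach described above is the approximation J(u)v ≈ (F(u+εv) − F(u))/ε, which lets a Krylov solver run without ever forming the Jacobian. A minimal sketch on a two-equation nonlinear system follows; the system, the unpreconditioned full-memory GMRES, and all tolerances are illustrative assumptions (the paper's solver adds the operator-split multigrid preconditioning).

```python
import numpy as np

def F(u):
    # simple nonlinear system with a root at (1, 2); an assumed example
    return np.array([u[0] ** 2 + u[1] - 3.0,
                     u[0] + u[1] ** 2 - 5.0])

def jv(u, v, eps=1e-7):
    # Jacobian-free matrix-vector product via first-order Taylor expansion
    return (F(u + eps * v) - F(u)) / eps

def gmres(matvec, b, tol=1e-6):
    # minimal full-memory GMRES (Arnoldi + least squares), no restarts
    n = b.size
    m = n
    Q = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    beta = np.linalg.norm(b)
    Q[:, 0] = b / beta
    k_used = m
    for k in range(m):
        v = matvec(Q[:, k])
        for i in range(k + 1):          # modified Gram-Schmidt
            H[i, k] = Q[:, i] @ v
            v = v - H[i, k] * Q[:, i]
        H[k + 1, k] = np.linalg.norm(v)
        if H[k + 1, k] < tol:           # happy breakdown: subspace exhausted
            k_used = k + 1
            break
        Q[:, k + 1] = v / H[k + 1, k]
    e1 = np.zeros(k_used + 1)
    e1[0] = beta
    y, *_ = np.linalg.lstsq(H[:k_used + 1, :k_used], e1, rcond=None)
    return Q[:, :k_used] @ y

# Inexact (Jacobian-free) Newton iteration.
u = np.array([1.5, 2.5])
for _ in range(12):
    r = F(u)
    if np.linalg.norm(r) < 1e-10:
        break
    du = gmres(lambda v: jv(u, v), -r)  # solve J du = -F without J
    u = u + du
```

The matvec closure is the only place the physics enters the linear solver, which is what makes physics-based preconditioning a separable design choice.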

  19. Punctuated Equilibrium in Statistical Models of Generalized Coevolutionary Resilience: How Sudden Ecosystem Transitions Can Entrain Both Phenotype Expression and Darwinian Selection

    Science.gov (United States)

    Wallace, Rodrick; Wallace, Deborah

    We argue that mesoscale ecosystem resilience shifts akin to sudden phase transitions in physical systems can entrain similarly punctuated events of gene expression on more rapid time scales, and, in part through such means, slower changes induced by selection pressure, triggering punctuated equilibrium Darwinian evolutionary transitions on geologic time scales. The approach reduces ecosystem, gene expression, and Darwinian genetic dynamics to a least common denominator of information sources interacting by crosstalk at markedly differing rates. Pettini's 'topological hypothesis', via a homology between information source uncertainty and free energy density, generates a regression-like class of statistical models of sudden coevolutionary phase transition based on the Rate Distortion and Shannon-McMillan Theorems of information theory which links all three levels. A mathematical treatment of Holling's extended keystone hypothesis regarding the particular role of mesoscale phenomena in entraining both slower and faster dynamical structures produces the result. A main theme is the necessity of a cognitive paradigm for gene expression, mirroring I. Cohen's cognitive approach to immune function. Invocation of the necessary conditions imposed by the asymptotic limit theorems of communication theory enables us to penetrate one layer more deeply before needing to impose an empirically-derived phenomenological system of 'Onsager relation' recursive coevolutionary stochastic differential equations. Extending the development to second order via a large deviations argument permits modeling the influence of human cultural structures on ecosystems as 'farming'.

  20. New Hybrid Monte Carlo methods for efficient sampling. From physics to biology and statistics

    International Nuclear Information System (INIS)

    Akhmatskaya, Elena; Reich, Sebastian

    2011-01-01

    We introduce a class of novel hybrid methods for detailed simulations of large complex systems in physics, biology, materials science and statistics. These generalized shadow Hybrid Monte Carlo (GSHMC) methods combine the advantages of stochastic and deterministic simulation techniques. They utilize a partial momentum update to retain some of the dynamical information, employ modified Hamiltonians to overcome exponential performance degradation with the system’s size and make use of multi-scale nature of complex systems. Variants of GSHMCs were developed for atomistic simulation, particle simulation and statistics: GSHMC (thermodynamically consistent implementation of constant-temperature molecular dynamics), MTS-GSHMC (multiple-time-stepping GSHMC), meso-GSHMC (Metropolis corrected dissipative particle dynamics (DPD) method), and a generalized shadow Hamiltonian Monte Carlo, GSHmMC (a GSHMC for statistical simulations). All of these are compatible with other enhanced sampling techniques and suitable for massively parallel computing allowing for a range of multi-level parallel strategies. A brief description of the GSHMC approach, examples of its application on high performance computers and comparison with other existing techniques are given. Our approach is shown to resolve such problems as resonance instabilities of the MTS methods and non-preservation of thermodynamic equilibrium properties in DPD, and to outperform known methods in sampling efficiency by an order of magnitude. (author)
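    The partial momentum update at the heart of these methods can be illustrated on a one-dimensional toy target. The sketch below implements a generalized Hybrid Monte Carlo step (partial momentum refreshment, leapfrog proposal, Metropolis test with momentum flip on rejection) for a standard normal target; the shadow-Hamiltonian machinery of GSHMC is omitted and all parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)

# Generalized HMC for the target U(q) = q^2/2 (standard normal).
# The partial momentum update p <- phi*p + noise retains some of the
# dynamical information between steps, as described in the abstract.
def leapfrog(q, p, h, n):
    p = p - 0.5 * h * q           # half kick (dU/dq = q)
    for _ in range(n - 1):
        q = q + h * p
        p = p - h * q
    q = q + h * p
    p = p - 0.5 * h * q           # final half kick
    return q, p

phi, h, n_leap = 0.7, 0.2, 5      # assumed mixing angle and step sizes
q, p = 0.0, rng.normal()
samples = []
for _ in range(20000):
    # Partial momentum refreshment: mixes old momentum with fresh noise,
    # leaving the Gaussian momentum distribution invariant.
    p = phi * p + np.sqrt(1 - phi ** 2) * rng.normal()
    H0 = 0.5 * (q * q + p * p)
    q_new, p_new = leapfrog(q, p, h, n_leap)
    H1 = 0.5 * (q_new * q_new + p_new * p_new)
    if rng.random() < np.exp(H0 - H1):
        q, p = q_new, p_new
    else:
        p = -p                    # momentum flip keeps detailed balance
    samples.append(q)
samples = np.array(samples)
```

Setting phi = 0 recovers standard HMC with full momentum resampling; phi close to 1 keeps more dynamical memory, which is the regime the abstract's methods exploit.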

  1. GPU-computing in econophysics and statistical physics

    Science.gov (United States)

    Preis, T.

    2011-03-01

    A recent trend in computer science and related fields is general purpose computing on graphics processing units (GPUs), which can yield impressive performance. With multiple cores connected by high memory bandwidth, today's GPUs offer resources for non-graphics parallel processing. This article provides a brief introduction into the field of GPU computing and includes examples. In particular computationally expensive analyses employed in financial market context are coded on a graphics card architecture which leads to a significant reduction of computing time. In order to demonstrate the wide range of possible applications, a standard model in statistical physics - the Ising model - is ported to a graphics card architecture as well, resulting in large speedup values.
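    GPU ports of the Ising model typically rely on a checkerboard decomposition: sites of one colour interact only with sites of the other colour, so an entire sublattice can be updated in parallel. The NumPy sketch below uses array vectorization as a stand-in for the GPU threads; the lattice size, temperature and sweep count are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Checkerboard Metropolis for the 2D Ising model, periodic boundaries.
# Same-colour sites do not interact, so each colour updates in parallel.
L, beta = 32, 0.6                   # lattice size, inverse temperature
spins = np.ones((L, L), dtype=int)  # start ordered, deep in the cold phase
yy, xx = np.indices((L, L))
colour = (yy + xx) % 2              # checkerboard colouring

def sweep(spins):
    for c in (0, 1):
        nb = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0)
              + np.roll(spins, 1, 1) + np.roll(spins, -1, 1))
        dE = 2 * spins * nb         # energy cost of flipping each spin
        # Metropolis acceptance min(1, exp(-beta*dE)), one colour at a time;
        # clipping dE at 0 caps the probability at 1 for downhill moves.
        p = np.exp(-beta * np.clip(dE, 0, None))
        accept = (colour == c) & (rng.random((L, L)) < p)
        spins = np.where(accept, -spins, spins)
    return spins

for _ in range(300):
    spins = sweep(spins)
magnetization = abs(spins.mean())
```

On a GPU the two nested colour updates map onto kernel launches over half the lattice each, which is where the speedup values quoted above come from.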

  2. Graphene growth process modeling: a physical-statistical approach

    Science.gov (United States)

    Wu, Jian; Huang, Qiang

    2014-09-01

    As a zero-bandgap semiconductor, graphene is an attractive material for a wide variety of applications such as optoelectronics. Among various techniques developed for graphene synthesis, chemical vapor deposition on copper foils shows high potential for producing few-layer and large-area graphene. Since fabrication of high-quality graphene sheets requires the understanding of growth mechanisms, and methods of characterization and control of grain size of graphene flakes, analytical modeling of the graphene growth process is therefore essential for controlled fabrication. The graphene growth process starts with randomly nucleated islands that gradually develop into complex shapes, grow in size, and eventually connect together to cover the copper foil. To model this complex process, we develop a physical-statistical approach under the assumption of self-similarity during graphene growth. The growth kinetics is uncovered by separating island shapes from the area growth rate. We propose to characterize the area growth velocity using a confined exponential model, which not only has a clear physical explanation, but also fits the real data well. For the shape modeling, we develop a parametric shape model which can be well explained by the angular-dependent growth rate. This work can provide useful information for the control and optimization of the graphene growth process on Cu foil.

  3. Statistical classification techniques in high energy physics (SDDT algorithm)

    International Nuclear Information System (INIS)

    Bouř, Petr; Kůs, Václav; Franc, Jiří

    2016-01-01

    We present our proposal of the supervised binary divergence decision tree with nested separation method based on the generalized linear models. A key insight we provide is the clustering driven only by a few selected physical variables. The proper selection consists of the variables achieving the maximal divergence measure between two different classes. Further, we apply our method to Monte Carlo simulations of physics processes corresponding to a data sample of top quark-antiquark pair candidate events in the lepton+jets decay channel. The data sample is produced in pp̅ collisions at √s = 1.96 TeV. It corresponds to an integrated luminosity of 9.7 fb⁻¹ recorded with the D0 detector during Run II of the Fermilab Tevatron Collider. The efficiency of our algorithm achieves 90% AUC in separating signal from background. We also briefly deal with the modification of statistical tests applicable to weighted data sets in order to test homogeneity of the Monte Carlo simulations and measured data. The justification of these modified tests is proposed through the divergence tests. (paper)
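    The variable-selection idea above, picking the variables with maximal divergence between the two classes, can be sketched with a symmetrized Kullback-Leibler divergence between binned signal and background distributions. The specific divergence, the binning, and the toy data are assumptions; the full SDDT algorithm adds the decision-tree machinery on top.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy event sample: 3 variables, of which only variable 0 separates
# "signal" from "background" (an illustrative assumption).
n = 4000
signal = rng.normal([1.5, 0.0, 0.0], 1.0, (n, 3))
background = rng.normal([0.0, 0.0, 0.0], 1.0, (n, 3))

def sym_kl(a, b, bins=30):
    """Symmetrized KL divergence between two binned samples."""
    lo, hi = min(a.min(), b.min()), max(a.max(), b.max())
    p, _ = np.histogram(a, bins=bins, range=(lo, hi))
    q, _ = np.histogram(b, bins=bins, range=(lo, hi))
    p = p + 1e-9
    q = q + 1e-9                       # avoid log(0) in empty bins
    p = p / p.sum()
    q = q / q.sum()
    return np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p))

divergences = [sym_kl(signal[:, f], background[:, f]) for f in range(3)]
best_variable = int(np.argmax(divergences))
```

Ranking variables this way concentrates the subsequent clustering or tree-building on the few variables that actually discriminate the classes.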

  4. The interaction of physical properties of seawater via statistical approach

    Science.gov (United States)

    Hamzah, Firdaus Mohamad; Jaafar, Othman; Sabri, Samsul Rijal Mohd; Ismail, Mohd Tahir; Jaafar, Khamisah; Arbin, Norazman

    2015-09-01

    It is of importance to determine the relationships between physical parameters in marine ecology. Models and expert opinion are needed to explore the form of the relationship between two parameters, owing to the complexity of the ecosystems, and these need justification against data observed over particular periods. Novel statistical techniques such as nonparametric regression are presented to investigate these ecological relationships. This is achieved by demonstrating the features of pH, salinity and conductivity in the Straits of Johor. The monthly measurements from 2004 until 2013 at a chosen sampling location are examined. Testing for no-effect followed by linearity testing is carried out for the relationships between salinity and pH, conductivity and pH, and conductivity and salinity, with the ecological objective of investigating the evidence of changes in each of the above physical parameters. The findings reveal the appropriateness of a smooth function to explain the variation of pH in response to changes in salinity, whilst the changes in conductivity with regard to different concentrations of salinity can be modelled parametrically. The analysis highlights the importance of both parametric and nonparametric models for assessing the ecological response to environmental change in seawater.

  5. Application of statistical physics approaches to complex organizations

    Science.gov (United States)

    Matia, Kaushik

    The first part of this thesis studies two different kinds of financial markets, namely, the stock market and the commodity market. Stock price fluctuations display certain scale-free statistical features that are not unlike those found in strongly-interacting physical systems. The possibility that new insights can be gained using concepts and methods developed to understand scale-free physical phenomena has stimulated considerable research activity in the physics community. In the first part of this thesis a comparative study of stocks and commodities is performed in terms of probability density function and correlations of stock price fluctuations. It is found that the probability density of the stock price fluctuation has a power law functional form with an exponent 3, which is similar across different markets around the world. We present an autoregressive model to explain the origin of the power law functional form of the probability density function of the price fluctuation. The first part also presents the discovery of unique features of the Indian economy, which we find displays a scale-dependent probability density function. In the second part of this thesis we quantify the statistical properties of fluctuations of complex systems like business firms and world scientific publications. We analyze class size of these systems mentioned above where units agglomerate to form classes. We find that the width of the probability density function of growth rate decays with the class size as a power law with an exponent beta which is universal in the sense that beta is independent of the system studied. We also identify two other scaling exponents, gamma connecting the unit size to the class size and gamma connecting the number of units to the class size, where products are units and firms are classes. Finally we propose a generalized preferential attachment model to describe the class size distribution. This model is successful in explaining the growth rate and class

  6. Applications of statistical physics to the social and economic sciences

    Science.gov (United States)

    Petersen, Alexander M.

    2011-12-01

    This thesis applies statistical physics concepts and methods to quantitatively analyze socioeconomic systems. For each system we combine theoretical models and empirical data analysis in order to better understand the real-world system in relation to the complex interactions between the underlying human agents. This thesis is separated into three parts: (i) response dynamics in financial markets, (ii) dynamics of career trajectories, and (iii) a stochastic opinion model with quenched disorder. In Part I we quantify the response of U.S. markets to financial shocks, which perturb markets and trigger "herding behavior" among traders. We use concepts from earthquake physics to quantify the decay of volatility shocks after the "main shock." We also find, surprisingly, that we can make quantitative statements even before the main shock. In order to analyze market behavior before as well as after "anticipated news" we use Federal Reserve interest-rate announcements, which are regular events that are also scheduled in advance. In Part II we analyze the statistical physics of career longevity. We construct a stochastic model for career progress which has two main ingredients: (a) random forward progress in the career and (b) random termination of the career. We incorporate the rich-get-richer (Matthew) effect into ingredient (a), meaning that it is easier to move forward in the career the farther along one is in the career. We verify the model predictions by analyzing data on 400,000 scientific careers and 20,000 professional sports careers. Our model highlights the importance of early career development, showing that many careers are stunted by the relative disadvantage associated with inexperience. In Part III we analyze a stochastic two-state spin model which represents a system of voters embedded on a network. We investigate the role in consensus formation of "zealots", which are agents with time-independent opinion. Our main result is the unexpected finding that it is the
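
    The two-ingredient career model of Part II can be sketched as a toy simulation. The constant termination hazard and the specific progress rule below are simplifying assumptions for illustration; in the thesis the two ingredients combine to produce the observed longevity distributions:

    ```python
    import random

    def simulate_career(hazard=0.05, alpha=0.5, steps=10_000, rng=None):
        """Simulate one career: random forward progress that grows with the
        current position (rich-get-richer / Matthew effect), plus a random
        termination hazard. Returns the career length in steps."""
        rng = rng or random.Random(0)
        position = 1.0
        for t in range(1, steps + 1):
            if rng.random() < hazard:          # ingredient (b): random termination
                return t
            position += rng.random() * position ** alpha  # ingredient (a)
        return steps

    # Distribution of career lengths over many simulated careers.
    lengths = [simulate_career(rng=random.Random(seed)) for seed in range(2000)]
    mean_length = sum(lengths) / len(lengths)
    ```

    With a constant 5% per-step hazard the mean career length is about 20 steps, and early termination dominates: most simulated careers end before the Matthew effect can compound, mirroring the "stunted careers" observation above.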

  7. Younger Dryas Boundary (YDB) impact : physical and statistical impossibility.

    Energy Technology Data Exchange (ETDEWEB)

    Boslough, Mark Bruce Elrick

    2010-08-01

    The YDB impact hypothesis of Firestone et al. (2007) is so extremely improbable it can be considered statistically impossible in addition to being physically impossible. Comets make up only about 1% of the population of Earth-crossing objects. Broken comets are a vanishingly small fraction, and only exist as Earth-sized clusters for a very short period of time. Only a small fraction of impacts occur at angles as shallow as proposed by the YDB impact authors. Events that are exceptionally unlikely to take place in the age of the Universe are 'statistically impossible'. The size distribution of Earth-crossing asteroids is well-constrained by astronomical observations, DoD satellite bolide frequencies, and the cratering record. This distribution can be transformed to a probability density function (PDF) for the largest expected impact of the past 20,000 years. The largest impact of any kind expected over the period of interest is 250 m. Anything larger than 2 km is exceptionally unlikely (probability less than 1%). The impact hypothesis does not rely on any sound physical model. A 4-km diameter comet, even if it fragmented upon entry, would not disperse or explode in the atmosphere. It would generate a crater about 50 km in diameter with a transient cavity as deep as 10 km. There is no evidence for such a large, young crater associated with the YDB. There is no model to suggest that a comet impact of this size is capable of generating continental-wide fires or blast damage, and there is no physical mechanism that could cause a 4-km comet to explode at the optimum height of 500 km. The highest possible altitude for a cometary optimum height is about 15 km, for a 120-m diameter comet. To maximize blast and thermal damage, a 4-km comet would have to break into tens of thousands of fragments of this size and spread out over the entire continent, but that would require lateral forces that greatly exceed the drag force, and would not conserve energy. Airbursts are
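
    The statistical argument, converting a power-law impactor size distribution into the chance of seeing an impact above a given size within 20,000 years, can be sketched as follows. The rate constant and slope are placeholders, not the values fitted from the observational constraints cited above:

    ```python
    import math

    def prob_larger_impact(d_km, years, rate_1km_per_yr=1e-6, slope=2.0):
        """Probability of at least one impact by an object larger than d_km
        within `years`, assuming a Poisson process with a power-law cumulative
        rate N(>d) = rate_1km_per_yr * d**(-slope)."""
        annual_rate = rate_1km_per_yr * d_km ** (-slope)
        return 1.0 - math.exp(-annual_rate * years)

    p_250m = prob_larger_impact(0.25, 20_000)   # a "largest expected" class of event
    p_4km = prob_larger_impact(4.0, 20_000)     # a YDB-sized comet
    ```

    Even with these illustrative numbers, the probability of a 4-km impactor over the period is orders of magnitude below that of a 250-m one, which is the sense in which the hypothesis is called statistically impossible.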

  8. Experimental benchmark of non-local-thermodynamic-equilibrium plasma atomic physics codes

    International Nuclear Information System (INIS)

    Nagels-Silvert, V.

    2004-09-01

    The main purpose of this thesis is to obtain experimental data for the testing and validation of atomic physics codes dealing with non-local-thermodynamic-equilibrium plasmas. The first part is dedicated to the spectroscopic study of xenon and krypton plasmas produced by a nanosecond laser pulse interacting with a gas jet. A Thomson scattering diagnostic has allowed us to measure independently plasma parameters such as the electron temperature, the electron density and the average ionisation state. We have obtained time-integrated spectra in the range between 5 and 10 angstroms. We have identified about one hundred xenon lines between 8.6 and 9.6 angstroms via the use of the Relac code, and have discovered previously unknown krypton lines between 5.2 and 7.5 angstroms. In a second experiment we have extended the wavelength range to the XUV domain. The Averroes/Transpec code has been tested in the ranges from 9 to 15 angstroms and from 10 to 130 angstroms; the first range has been well reproduced, while the second requires a more complex data analysis. The second part is dedicated to the spectroscopic study of aluminium, selenium and samarium plasmas in the femtosecond regime. We have designed a frequency-domain interferometry diagnostic that has allowed us to measure the expansion velocity of the target's rear side. Via an adequate isothermal model, this parameter has allowed us to determine the plasma electron temperature. Spectra and emission times of various lines from the aluminium and selenium plasmas have been computed satisfactorily with the Averroes/Transpec code coupled with the Film and Multif hydrodynamic codes. (A.C.)

  9. A Quantal Response Statistical Equilibrium Model of Induced Technical Change in an Interactive Factor Market: Firm-Level Evidence in the EU Economies

    Directory of Open Access Journals (Sweden)

    Jangho Yang

    2018-02-01

    This paper studies the pattern of technical change at the firm level by applying and extending the Quantal Response Statistical Equilibrium model (QRSE). The model assumes that a large number of cost-minimizing firms decide whether to adopt a new technology based on the potential rate of cost reduction. A firm in the model is assumed to have a limited capacity to process market signals, so there is a positive degree of uncertainty in adopting a new technology. The adoption decision by the firm, in turn, makes an impact on the whole market through changes in the factor-price ratio. The equilibrium distribution of the model is a unimodal probability distribution with four parameters, which is qualitatively different from the Walrasian notion of equilibrium insofar as the state of equilibrium is not a single state but a probability distribution over multiple states. This paper applies Bayesian inference to estimate the unknown parameters of the model using firm-level data for seven advanced OECD countries over eight years and shows that the equilibrium distribution of the model satisfactorily recovers the observed pattern of technical change.

  10. Technical note: Evaluation of the simultaneous measurements of mesospheric OH, HO2, and O3 under a photochemical equilibrium assumption - a statistical approach

    Science.gov (United States)

    Kulikov, Mikhail Y.; Nechaev, Anton A.; Belikovich, Mikhail V.; Ermakova, Tatiana S.; Feigin, Alexander M.

    2018-05-01

    This Technical Note presents a statistical approach to evaluating simultaneous measurements of several atmospheric components under the assumption of photochemical equilibrium. We consider simultaneous measurements of OH, HO2, and O3 at mesospheric altitudes as a specific example, with their daytime photochemical equilibrium as the evaluating relationship. A simplified algebraic equation relating the local concentrations of these components in the 50-100 km altitude range has been derived. The parameters of the equation are temperature, neutral density, local zenith angle, and the rates of eight reactions. We have performed a one-year simulation of the mesosphere and lower thermosphere using a 3-D chemical-transport model. The simulation shows that the discrepancy between the calculated evolution of the components and the equilibrium value given by the equation does not exceed 3-4 % over the full range of altitudes, independent of season or latitude. We have developed a statistical Bayesian technique for evaluating simultaneous measurements of OH, HO2, and O3 based on the equilibrium equation, taking the measurement error into account. The first results of applying the technique to MLS/Aura (Microwave Limb Sounder) data are presented in this Technical Note. It has been found that the satellite data regularly place the mesospheric maximum of HO2 at lower altitudes. This has also been confirmed by modeled HO2 distributions and by comparison with offline retrieval of HO2 from daily zonal mean MLS radiances.

  11. Statistical Physics: Third Tohwa University International Conference. AIP Conference Proceedings No. 519

    International Nuclear Information System (INIS)

    Tokuyama, M.; Stanley, H.E.

    2000-01-01

    The main purpose of the Tohwa University International Conference on Statistical Physics is to provide an opportunity for an international group of experimentalists, theoreticians, and computational scientists working in various fields of statistical physics to gather and discuss their recent advances. The conference covered six topics: complex systems, general methods of statistical physics, biological physics, cross-disciplinary physics, information science, and econophysics

  12. Worldwide seismicity in view of non-extensive statistical physics

    Science.gov (United States)

    Chochlaki, Kaliopi; Vallianatos, Filippos; Michas, George

    2014-05-01

    In the present work we study the distribution of worldwide shallow seismic events that occurred from 1981 to 2011, extracted from the CMT catalog, with magnitude equal to or greater than Mw 5.0. Our analysis is based on the subdivision of the Earth's surface into seismic zones that are homogeneous with regard to seismic activity and the orientation of the predominant stress field. To this end we use the Flinn-Engdahl regionalization (Flinn and Engdahl, 1965), which consists of 50 seismic zones, as modified by Lombardi and Marzocchi (2007), who grouped the 50 FE zones into larger tectonically homogeneous ones using the cumulative moment tensor method. Lombardi and Marzocchi (2007) thereby reduce the initial 50 regions to 39, in which we apply the non-extensive statistical physics approach. Non-extensive statistical physics seems to be the most adequate and promising methodological tool for analyzing complex systems, such as the Earth's interior. In this frame, we introduce the q-exponential formulation as the expression of the probability distribution function that maximizes the Sq entropy as defined by Tsallis (1988). We analyze the interevent time distribution between successive earthquakes with a q-exponential function in each of the seismic zones defined by Lombardi and Marzocchi (2007), confirming the importance of long-range interactions and the existence of a power-law approximation in the distribution of the interevent times. Our findings support the idea of universality within the Tsallis approach to describing Earth's seismicity and present strong evidence of temporal clustering of seismic activity in each of the tectonic zones analyzed. Our analysis as applied to worldwide seismicity with magnitude equal to or greater than Mw 5.5 and 6.0 is also presented, and the dependence of our results on the cut-off magnitude is discussed. This research has been funded by the European Union (European Social Fund) and Greek national resources under the
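
    A sketch of the q-exponential used in such interevent-time analyses. The q and tau0 values below are illustrative, not the fitted zone parameters:

    ```python
    import math

    def q_exponential(x, q):
        """Tsallis q-exponential exp_q(x); reduces to exp(x) as q -> 1."""
        if abs(q - 1.0) < 1e-12:
            return math.exp(x)
        base = 1.0 + (1.0 - q) * x
        return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

    # Interevent-time survival model P(>tau) ~ exp_q(-tau / tau0): for q > 1
    # the tail decays as a power law rather than exponentially.
    q, tau0 = 1.3, 10.0
    survival = [q_exponential(-t / tau0, q) for t in (0.0, 10.0, 100.0)]
    ```

    For q > 1 the survival function falls off much more slowly at large tau than an ordinary exponential, which is the signature of long-range interactions and temporal clustering discussed above.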

  13. Is poker a skill game? New insights from statistical physics

    Science.gov (United States)

    Javarone, Marco Alberto

    2015-06-01

    In recent years poker has gained considerable prestige in several countries and, besides being one of the most famous card games, it represents a modern challenge for scientists belonging to different communities, spanning from artificial intelligence to physics and from psychology to mathematics. Unlike games such as chess, the task of classifying the nature of poker (i.e., as "skill game" or gambling) seems really hard, and it constitutes an open problem whose solution has several implications. In general, gambling offers equal winning probabilities to rational players (i.e., those that use a strategy) and to irrational ones (i.e., those without a strategy). Therefore, in order to uncover the nature of poker, a viable way is to compare the performances of rational vs. irrational players during a series of challenges. Recently, a work on this topic revealed that rationality is a fundamental ingredient for succeeding in poker tournaments. In this study we analyze a simple model of poker challenges by a statistical physics approach, with the aim of uncovering the nature of this game. As the main result we find that, under particular conditions, a few irrational players can turn poker into gambling. Therefore, although rationality is a key ingredient for success in poker, the format of the challenges also plays an important role in these dynamics, as it can strongly influence the underlying nature of the game. The importance of our results lies in the related implications, for instance in identifying the limits within which poker can be considered a "skill game" and, as a consequence, which kind of format must be chosen to devise algorithms able to face humans.
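
    A toy version of the rational-vs-irrational comparison: assuming hand strength is a uniform random number and that folding costs a small fixed blind, even a crude folding strategy outperforms playing every hand. All rules here are invented for illustration and are not the paper's model:

    ```python
    import random

    def play_hand(rng, rational):
        """One heads-up hand against a random opponent: hand strengths are
        uniform in [0, 1); the rational player folds weak hands for a small
        fixed loss, the irrational player plays every hand."""
        hand, opp = rng.random(), rng.random()
        if rational and hand < 0.4:
            return -0.1                      # fold: forfeit the blind
        return 1.0 if hand > opp else -1.0

    def mean_payoff(rational, n=20_000, seed=1):
        """Average payoff per hand over n simulated hands."""
        rng = random.Random(seed)
        return sum(play_hand(rng, rational) for _ in range(n)) / n
    ```

    In this toy setting the rational player's mean payoff is clearly positive while the irrational player's is near zero; the paper's point is that the tournament format can erode exactly this kind of edge.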

  14. Statistical physics of medical diagnostics: Study of a probabilistic model.

    Science.gov (United States)

    Mashaghi, Alireza; Ramezanpour, Abolfazl

    2018-03-01

    We study a diagnostic strategy which is based on the anticipation of the diagnostic process by simulation of the dynamical process starting from the initial findings. We show that such a strategy could result in more accurate diagnoses compared to a strategy that is solely based on the direct implications of the initial observations. We demonstrate this by employing the mean-field approximation of statistical physics to compute the posterior disease probabilities for a given subset of observed signs (symptoms) in a probabilistic model of signs and diseases. A Monte Carlo optimization algorithm is then used to maximize an objective function of the sequence of observations, which favors the more decisive observations resulting in more polarized disease probabilities. We see how the observed signs change the nature of the macroscopic (Gibbs) states of the sign and disease probability distributions. The structure of these macroscopic states in the configuration space of the variables affects the quality of any approximate inference algorithm (so the diagnostic performance) which tries to estimate the sign-disease marginal probabilities. In particular, we find that the simulation (or extrapolation) of the diagnostic process is helpful when the disease landscape is not trivial and the system undergoes a phase transition to an ordered phase.

  16. Statistical Physics of Neural Systems with Nonadditive Dendritic Coupling

    Directory of Open Access Journals (Sweden)

    David Breuer

    2014-03-01

    How neurons process their inputs crucially determines the dynamics of biological and artificial neural networks. In such neural and neural-like systems, synaptic input is typically considered to be merely transmitted linearly or sublinearly by the dendritic compartments. Yet, single-neuron experiments report pronounced supralinear dendritic summation of sufficiently synchronous and spatially close-by inputs. Here, we provide a statistical physics approach to study the impact of such nonadditive dendritic processing on single-neuron responses and the performance of associative-memory tasks in artificial neural networks. First, we compute the effect of random input to a neuron incorporating nonlinear dendrites. This approach is independent of the details of the neuronal dynamics. Second, we use those results to study the impact of dendritic nonlinearities on the network dynamics in a paradigmatic model for associative memory, both numerically and analytically. We find that dendritic nonlinearities maintain network convergence and increase the robustness of memory performance against noise. Interestingly, an intermediate number of dendritic branches is optimal for memory functionality.

  17. Determination of equilibrium composition of thermally ionized monoatomic gas under different physical conditions

    Science.gov (United States)

    Romanova, M. S.; Rydalevskaya, M. A.

    2017-05-01

    Perfect gas mixtures that result from the thermal ionization of spatially and chemically homogeneous monoatomic gases are considered. Equilibrium concentrations of the components of such mixtures are determined by integrating over momentum space and summing over the energy levels of the distribution functions that maximize the entropy of the system under the condition that the total numbers of nuclei and electrons are constant. It is demonstrated that this method significantly simplifies the calculation of the equilibrium composition of ionized mixtures at different temperatures and makes it possible to study the degree of ionization of the gas as a function of gas density and of the element's position in the periodic table.
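
    For a single ionization stage, this kind of entropy-maximization calculation reduces to a Saha-type relation between neutral, ion and electron densities. The sketch below uses the standard Saha equation for hydrogen-like atoms as a stand-in for the paper's own derivation:

    ```python
    import math

    def ionization_degree(T, n_tot, chi_eV, g_ratio=1.0):
        """Degree of ionization alpha from the Saha relation
        n_e*n_i/n_a = S(T), with charge neutrality n_e = n_i and nucleus
        conservation n_tot = n_a + n_i; alpha solves alpha^2/(1-alpha) = S/n_tot."""
        k_eV = 8.617333262e-5        # Boltzmann constant, eV/K
        me = 9.1093837015e-31        # electron mass, kg
        kB = 1.380649e-23            # Boltzmann constant, J/K
        h = 6.62607015e-34           # Planck constant, J s
        S = (2.0 * g_ratio * (2.0 * math.pi * me * kB * T / h**2) ** 1.5
             * math.exp(-chi_eV / (k_eV * T)))
        r = S / n_tot                # dimensionless; quadratic in alpha
        return (-r + math.sqrt(r * r + 4.0 * r)) / 2.0

    # Hydrogen (chi = 13.6 eV) at two temperatures, n_tot = 1e20 m^-3.
    alpha_cool = ionization_degree(5_000, 1e20, 13.6)
    alpha_hot = ionization_degree(20_000, 1e20, 13.6)
    ```

    The quadratic has exactly one root in (0, 1), so the equilibrium composition follows directly once temperature, density and the ionization energy of the element are fixed, which is the simplification the abstract highlights.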

  18. Control of Chemical Equilibrium by Solvent: A Basis for Teaching Physical Chemistry of Solutions

    Science.gov (United States)

    Prezhdo, Oleg V.; Craig, Colleen F.; Fialkov, Yuriy; Prezhdo, Victor V.

    2007-01-01

    The study demonstrates that the solvent present in a system can strongly alter and control its chemical equilibrium. The results show that the dipole moment and polarizability of the system can be varied over a wide range by using different mixed solvents.

  19. The new physics of non-equilibrium condensates: insights from classical dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Eastham, P R [Theory of Condensed Matter, Cavendish Laboratory, Cambridge CB3 0HE (United Kingdom)

    2007-07-25

    We discuss the dynamics of classical Dicke-type models, aiming to clarify the mechanisms by which coherent states could develop in potentially non-equilibrium systems such as semiconductor microcavities. We present simulations of an undamped model which show spontaneous coherent states with persistent oscillations in the magnitude of the order parameter. These states are generalizations of superradiant ringing to the case of inhomogeneous broadening. They correspond to the persistent gap oscillations proposed in fermionic atomic condensates, and arise from a variety of initial conditions. We show that introducing randomness into the couplings can suppress the oscillations, leading to a limiting dynamics with a time-independent order parameter. This demonstrates that non-equilibrium generalizations of polariton condensates can be created even without dissipation. We explain the dynamical origins of the coherence in terms of instabilities of the normal state, and consider how it can additionally develop through scattering and dissipation.

  20. The new physics of non-equilibrium condensates: insights from classical dynamics

    International Nuclear Information System (INIS)

    Eastham, P R

    2007-01-01

    We discuss the dynamics of classical Dicke-type models, aiming to clarify the mechanisms by which coherent states could develop in potentially non-equilibrium systems such as semiconductor microcavities. We present simulations of an undamped model which show spontaneous coherent states with persistent oscillations in the magnitude of the order parameter. These states are generalizations of superradiant ringing to the case of inhomogeneous broadening. They correspond to the persistent gap oscillations proposed in fermionic atomic condensates, and arise from a variety of initial conditions. We show that introducing randomness into the couplings can suppress the oscillations, leading to a limiting dynamics with a time-independent order parameter. This demonstrates that non-equilibrium generalizations of polariton condensates can be created even without dissipation. We explain the dynamical origins of the coherence in terms of instabilities of the normal state, and consider how it can additionally develop through scattering and dissipation

  1. Parallelization of the Physical-Space Statistical Analysis System (PSAS)

    Science.gov (United States)

    Larson, J. W.; Guo, J.; Lyster, P. M.

    1999-01-01

    Atmospheric data assimilation is a method of combining observations with model forecasts to produce a more accurate description of the atmosphere than the observations or forecast alone can provide. Data assimilation plays an increasingly important role in the study of climate and atmospheric chemistry. The NASA Data Assimilation Office (DAO) has developed the Goddard Earth Observing System Data Assimilation System (GEOS DAS) to create assimilated datasets. The core computational components of the GEOS DAS include the GEOS General Circulation Model (GCM) and the Physical-space Statistical Analysis System (PSAS). The need for timely validation of scientific enhancements to the data assimilation system poses computational demands that are best met by distributed parallel software. PSAS is implemented in Fortran 90 using object-based design principles. The analysis portions of the code solve two equations. The first of these is the "innovation" equation, which is solved on the unstructured observation grid using a preconditioned conjugate gradient (CG) method. The "analysis" equation is a transformation from the observation grid back to a structured grid, and is solved by a direct matrix-vector multiplication. Use of a factored-operator formulation reduces the computational complexity of both the CG solver and the matrix-vector multiplication, rendering the matrix-vector multiplications as a successive product of operators on a vector. Sparsity is introduced to these operators by partitioning the observations using an icosahedral decomposition scheme. PSAS builds a large (approx. 128MB) run-time database of parameters used in the calculation of these operators. Implementing a message passing parallel computing paradigm into an existing yet developing computational system as complex as PSAS is nontrivial. One of the technical challenges is balancing the requirements for computational reproducibility with the need for high performance. The problem of computational
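
    The innovation-equation solver described above can be sketched as a generic preconditioned conjugate gradient that touches the operator only through matrix-vector products, which is what makes the factored-operator formulation possible. The toy 2x2 system below merely stands in for the innovation equation (HBH^T + R) w = d:

    ```python
    def conjugate_gradient(matvec, b, precond=lambda r: r, tol=1e-10, max_iter=200):
        """Preconditioned conjugate gradients for a symmetric positive-definite
        operator supplied only as a matrix-vector product `matvec`."""
        n = len(b)
        x = [0.0] * n
        r = b[:]                      # residual b - A x, with x = 0
        z = precond(r)
        p = z[:]
        rz = sum(ri * zi for ri, zi in zip(r, z))
        for _ in range(max_iter):
            Ap = matvec(p)
            alpha = rz / sum(pi * api for pi, api in zip(p, Ap))
            x = [xi + alpha * pi for xi, pi in zip(x, p)]
            r = [ri - alpha * api for ri, api in zip(r, Ap)]
            if sum(ri * ri for ri in r) ** 0.5 < tol:
                break
            z = precond(r)
            rz_new = sum(ri * zi for ri, zi in zip(r, z))
            beta = rz_new / rz
            rz = rz_new
            p = [zi + beta * pi for zi, pi in zip(z, p)]
        return x

    # Toy SPD system standing in for the innovation equation.
    A = [[4.0, 1.0], [1.0, 3.0]]
    matvec = lambda v: [sum(aij * vj for aij, vj in zip(row, v)) for row in A]
    w = conjugate_gradient(matvec, [1.0, 2.0])
    ```

    Because the solver only calls `matvec`, a factored operator can be applied as a successive product of sparse factors on a vector, which is how PSAS avoids forming the full covariance matrices.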

  2. Physical mechanism for biopolymers to aggregate and maintain in non-equilibrium states.

    Science.gov (United States)

    Ma, Wen-Jong; Hu, Chin-Kun

    2017-06-08

    Many human and animal diseases are related to the aggregation of proteins, and a viable biological organism must maintain itself in non-equilibrium states. How proteins aggregate and why biological organisms can maintain non-equilibrium states are not well understood. As a first step toward understanding such complex-systems problems, we consider simple model systems containing polymer chains and solvent particles. The strength of the spring connecting two neighboring monomers in a polymer chain is controlled by a parameter s, with s → ∞ corresponding to a rigid bond. The strengths of the bending- and torsion-angle-dependent interactions are controlled by a parameter s_A, with s_A → -∞ corresponding to no bending or torsion interactions. We find that for very small s_A, polymer chains tend to aggregate spontaneously, and the trend is independent of the spring strength. For strong springs, the speed distributions of monomers in the parallel direction (along the spring connecting two neighboring monomers) and the perpendicular direction have different effective temperatures, and such systems are in non-equilibrium states.

  3. Introduction to the basic concepts of modern physics special relativity, quantum and statistical physics

    CERN Document Server

    Becchi, Carlo Maria

    2016-01-01

    This is the third edition of a well-received textbook on modern physics theory. This book provides an elementary but rigorous and self-contained presentation of the simplest theoretical framework that will meet the needs of undergraduate students. In addition, a number of examples of relevant applications and an appropriate list of solved problems are provided. Apart from a substantial extension of the proposed problems, the new edition provides more detailed discussion on Lorentz transformations and their group properties, a deeper treatment of quantum mechanics in a central potential, and a closer comparison of statistical mechanics in classical and in quantum physics. The first part of the book is devoted to special relativity, with a particular focus on space-time relativity and relativistic kinematics. The second part deals with Schrödinger's formulation of quantum mechanics. The presentation concerns mainly one-dimensional problems, but some three-dimensional examples are discussed in detail. The third...

  4. Theoretical and computational studies of non-equilibrium and non-statistical dynamics in the gas phase, in the condensed phase and at interfaces.

    Science.gov (United States)

    Spezia, Riccardo; Martínez-Nuñez, Emilio; Vazquez, Saulo; Hase, William L

    2017-04-28

    In this Introduction, we show the basic problems of non-statistical and non-equilibrium phenomena related to the papers collected in this themed issue. Over the past few years, significant advances in both computing power and development of theories have allowed the study of larger systems, increasing the time length of simulations and improving the quality of potential energy surfaces. In particular, the possibility of using quantum chemistry to calculate energies and forces 'on the fly' has paved the way to directly study chemical reactions. This has provided a valuable tool to explore molecular mechanisms at given temperatures and energies and to see whether these reactive trajectories follow statistical laws and/or minimum energy pathways. This themed issue collects different aspects of the problem and gives an overview of recent works and developments in different contexts, from the gas phase to the condensed phase to excited states.This article is part of the themed issue 'Theoretical and computational studies of non-equilibrium and non-statistical dynamics in the gas phase, in the condensed phase and at interfaces'. © 2017 The Author(s).

  5. Physical Research Program: research contracts and statistical summary

    International Nuclear Information System (INIS)

    1975-01-01

    The physical research program consists of fundamental theoretical and experimental investigations designed to support the objectives of ERDA. The program is directed toward discovery of natural laws and new knowledge, and to improved understanding of the physical sciences as related to the development, use, and control of energy. The ultimate goal is to develop a scientific underlay for the overall ERDA effort and the fundamental principles of natural phenomena so that these phenomena may be understood and new principles, formulated. The physical research program is organized into four functional subprograms, high-energy physics, nuclear sciences, materials sciences, and molecular sciences. Approximately four-fifths of the total physical research program costs are associated with research conducted in ERDA-owned, contractor-operated federally funded research and development centers. A little less than one-fifth of the costs are associated with the support of research conducted in other laboratories

  6. Determination of the Equilibrium Constants of a Weak Acid: An Experiment for Analytical or Physical Chemistry

    Science.gov (United States)

    Bonham, Russell A.

    1998-05-01

    A simple experiment, utilizing readily available equipment and chemicals, is described. It allows students to explore the concepts of chemical equilibria, nonideal behavior of aqueous solutions, least squares with adjustment of nonlinear model parameters, and errors. The relationship between the pH of a solution of known initial concentration and volume of a weak acid as it is titrated by known volumes of a monohydroxy strong base is developed rigorously assuming ideal behavior. A distinctive feature of this work is a method that avoids dealing with the problems presented by equations with multiple roots. The volume of base added is calculated in terms of a known value of the pH and the equilibrium constants. The algebraic effort involved is nearly the same as the alternative of deriving a master equation for solving for the hydrogen ion concentration or activity, and it results in a more efficient computational algorithm. This approach offers two advantages over the use of computer software to solve directly for the hydrogen ion concentration. First, it avoids a potentially lengthy iterative procedure encountered when the polynomial exceeds third order in the hydrogen ion concentration; and second, it provides a means of obtaining results with a hand calculator that can prove useful in checking computer code. The approach is limited to dilute solutions to avoid dealing with molalities and to ensure that the Debye-Hückel limiting law is applicable. The nonlinear least squares algorithm Nonlinear Fit, found in the computational mathematics library Mathematica, is utilized to fit the measured volume of added base to the calculated value as a function of the measured pH, subject to variation of all the equilibrium constants as parameters (including Kw). The experiment emphasizes both data collection and data analysis aspects of the problem. Data for the titration of phosphorous acid, H3PO3, by NaOH are used to illustrate the approach. Fits of the data without corrections
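The closed-form "volume of base as a function of pH" idea described in this abstract can be sketched in a few lines. The following is a hedged illustration (the function name and the equilibrium-constant values are ours, chosen merely to be of the same order as those of a diprotic acid like H3PO3, not taken from the paper). The charge balance is linear in the base volume, so no polynomial root-finding is needed:

```python
import math

def base_volume(pH, Ca, Va, Cb, Ka1, Ka2, Kw=1e-14):
    """Volume of strong monohydroxy base needed to reach a given pH
    when titrating Va mL of a diprotic weak acid (concentration Ca)
    with base of concentration Cb. Derived from the charge balance
        [Na+] + [H+] = [OH-] + [HA-] + 2[A2-],
    which is linear in the base volume Vb."""
    h = 10.0 ** (-pH)
    d = h * h + Ka1 * h + Ka1 * Ka2          # speciation denominator
    alpha1 = Ka1 * h / d                     # fraction present as HA-
    alpha2 = Ka1 * Ka2 / d                   # fraction present as A2-
    w = Kw / h - h                           # [OH-] - [H+]
    return Va * (w + Ca * (alpha1 + 2.0 * alpha2)) / (Cb - w)

# Illustrative constants, roughly of the order of phosphorous acid's
Ka1, Ka2 = 5e-2, 2e-7
vb1 = base_volume(4.0, Ca=0.05, Va=25.0, Cb=0.1, Ka1=Ka1, Ka2=Ka2)
vb2 = base_volume(9.0, Ca=0.05, Va=25.0, Cb=0.1, Ka1=Ka1, Ka2=Ka2)
```

With these illustrative numbers, pH 4 lands near the first equivalence point and pH 9 near the second (about twice the base volume), which is the qualitative behavior the experiment exploits.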

  7. Interdisciplinary applications of statistical physics to complex systems: Seismic physics, econophysics, and sociophysics

    Science.gov (United States)

    Tenenbaum, Joel

    This thesis applies statistical physics concepts and methods to quantitatively analyze complex systems. This thesis is separated into four parts: (i) characteristics of earthquake systems, (ii) memory and volatility in data time series, (iii) the application of part (ii) to world financial markets, and (iv) statistical observations on the evolution of word usage. In Part I, we observe statistical patterns in the occurrence of earthquakes. We select a 14-year earthquake catalog covering the archipelago of Japan. We find that regions traditionally thought of as being too distant from one another for causal contact display remarkably high correlations, and the networks that result have a tendency to link highly connected areas with other highly connected areas. In Part II, we introduce and apply the concept of "volatility asymmetry", the primary use of which is in financial data. We explain the relation between memory and "volatility asymmetry" in terms of an asymmetry parameter lambda. We define a litmus test for determining whether lambda is statistically significant, propose a stochastic model based on this parameter, and use the model to further explain empirical data. In Part III, we expand on volatility asymmetry. Importing the concepts of time dependence and universality from physics, we explore the aspects of emerging (or "transition") economies in Eastern Europe as they relate to asymmetry. We find that these emerging markets in some instances behave like developed markets and in other instances do not, and that the distinction is a matter both of country and of time period, crisis periods showing different asymmetry characteristics than "healthy" periods. In Part IV, we take note of a series of findings in econophysics, showing statistical growth similarities between a variety of different areas that all have in common the fact of taking place in areas that are both (i) competing and (ii) dynamic. We show that this same growth distribution can be

  8. Fundamental properties of fracture and seismicity in a non-extensive statistical physics framework.

    Science.gov (United States)

    Vallianatos, Filippos

    2010-05-01

    A fundamental challenge in many scientific disciplines concerns upscaling, that is, determining the regularities and laws of evolution at some large scale from those known at a lower scale. Earthquake physics is no exception, with the challenge of understanding the transition from the laboratory scale to the scale of fault networks and large earthquakes. In this context, statistical physics has a remarkably successful track record in addressing the upscaling problem in physics. It is natural then to consider that the physics of many earthquakes has to be studied with a different approach than the physics of one earthquake, and in this sense we can consider the use of statistical physics not only appropriate but necessary to understand the collective properties of earthquakes [see Corral 2004, 2005a,b,c]. A significant attempt is given in a series of works [Main 1996; Rundle et al., 1997; Main et al., 2000; Main and Al-Kindy, 2002; Rundle et al., 2003; Vallianatos and Triantis, 2008a] that use classical statistical physics to describe seismicity. Then a natural question arises: what type of statistical physics is appropriate to describe effects from the fracture level to the seismicity scale? The application of non-extensive statistical physics offers a consistent theoretical framework, based on a generalization of entropy, to analyze the behavior of natural systems with fractal or multi-fractal distribution of their elements. Such natural systems, where long-range interactions or intermittency are important, lead to power-law behavior. We note that this contrasts with a classical thermodynamic approach to natural systems that rapidly attain equilibrium, leading to exponential-law behavior. In the frame of the non-extensive statistical physics approach, the probability function p(X) is calculated using the maximum entropy formulation of Tsallis entropy, which involves the introduction of at least two constraints (Tsallis et al., 1998). The first one is the
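As a hedged illustration of the power-law behavior mentioned above, the q-exponential that results from maximizing Tsallis entropy can be sketched as follows (the parameter values are arbitrary, chosen only to display the two regimes):

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential e_q(x) = [1 + (1 - q) x]_+^{1/(1-q)},
    which reduces to the ordinary exponential as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

# For q > 1 the decay e_q(-x) has a power-law tail ~ x^(-1/(q-1)),
# in contrast with the exponential tail of equilibrium (q = 1) statistics.
tail_q = q_exp(-100.0, q=1.5)   # power-law tail: still non-negligible
tail_1 = q_exp(-100.0, q=1.0)   # exponential tail: essentially zero
```

For q > 1 rare events thus remain far more probable than under exponential (Boltzmann-Gibbs) statistics, which is why the framework suits systems with long-range interactions or intermittency.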

  9. Introduction to the basic concepts of modern physics special relativity, quantum and statistical physics

    CERN Document Server

    Becchi, Carlo Maria

    2007-01-01

    These notes are designed as a textbook for an undergraduate course on modern physics theory. The purpose is to provide a rigorous and self-contained presentation of the simplest theoretical framework using elementary mathematical tools. A number of examples of relevant applications and an appropriate list of exercises and answered questions are also given. The first part is devoted to special relativity, concerning in particular space-time relativity and relativistic kinematics. The second part deals with Schroedinger's formulation of quantum mechanics. The presentation concerns mainly one-dimensional problems, in particular the tunnel effect, discrete energy levels and band spectra. The third part concerns the application of Gibbs statistical methods to quantum systems, in particular to Bose and Fermi gases.

  10. Statistical physics of black holes as quantum-mechanical systems

    OpenAIRE

    Giddings, Steven B.

    2013-01-01

    Some basic features of black-hole statistical mechanics are investigated, assuming that black holes respect the principles of quantum mechanics. Care is needed in defining an entropy S_bh corresponding to the number of microstates of a black hole, given that the black hole interacts with its surroundings. An open question is then the relationship between this entropy and the Bekenstein-Hawking entropy S_BH. For a wide class of models with interactions needed to ensure unitary quantum evolutio...

  11. Statistical Analysis of Questionnaire on Physical Rehabilitation in Multiple Sclerosis

    Czech Academy of Sciences Publication Activity Database

    Martinková, Patrícia; Řasová, K.

    -, č. 3 (2010), S340 ISSN 1210-7859. [Obnovené neuroimunologické a likvorologické dny. 21.05.2010-22.05.2010, Praha] R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords: questionnaire * physical rehabilitation * multiple sclerosis Subject RIV: IN - Informatics, Computer Science

  12. Some applications of multivariate statistics to physical anthropology

    NARCIS (Netherlands)

    van Vark, GN

    This paper presents some of the results of the cooperation between the author, a physical anthropologist, and Willem Schaafsma. The subjects of study to be discussed in this paper all refer to human evolution, in particular to the process of hominisation. It is described how the interest of the

  13. From Random Walks to Brownian Motion, from Diffusion to Entropy: Statistical Principles in Introductory Physics

    Science.gov (United States)

    Reeves, Mark

    2014-03-01

    Entropy changes underlie the physics that dominates biological interactions. Indeed, introductory biology courses often begin with an exploration of the qualities of water that are important to living systems. However, one idea that is not explicitly addressed in most introductory physics or biology textbooks is the dominant contribution of entropy in driving important biological processes towards equilibrium. From diffusion to cell-membrane formation, to electrostatic binding in protein folding, to the functioning of nerve cells, entropic effects often act to counterbalance deterministic forces such as electrostatic attraction and, in so doing, allow for effective molecular signaling. A small group of biology, biophysics and computer science faculty have worked together for the past five years to develop curricular modules (based on SCALE-UP pedagogy) that enable students to create models of stochastic and deterministic processes. Our students are first-year engineering and science students in the calculus-based physics course, and they are not expected to know biology beyond the high-school level. In our class, they learn to reduce seemingly complex biological processes and structures to tractable models that include deterministic processes and simple probabilistic inference. The students test these models in simulations and in laboratory experiments that are biologically relevant. The students are challenged to bridge the gap between statistical parameterization of their data (mean and standard deviation) and simple model-building by inference. This allows the students to quantitatively describe realistic cellular processes such as diffusion, ionic transport, and ligand-receptor binding. Moreover, the students confront ``random'' forces and traditional forces in problems, simulations, and in laboratory exploration throughout the year-long course as they move from traditional kinematics through thermodynamics to electrostatic interactions. This talk
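The random-walk-to-diffusion progression named in this record's title can be made concrete with a short simulation (a sketch of the idea, not the course's actual module; sizes and seed are arbitrary): the mean-square displacement of an unbiased one-dimensional walk grows linearly with the number of steps, which is the statistical signature of diffusion.

```python
import random

def mean_square_displacement(n_steps, n_walkers, rng):
    """Average squared end-to-end distance of unbiased 1-D random walks."""
    total = 0.0
    for _ in range(n_walkers):
        x = 0
        for _ in range(n_steps):
            x += rng.choice((-1, 1))   # unit step left or right
        total += x * x
    return total / n_walkers

rng = random.Random(0)
msd_100 = mean_square_displacement(100, 2000, rng)   # expect ~100
msd_400 = mean_square_displacement(400, 2000, rng)   # expect ~400
```

The linear growth of the mean-square displacement with step count is exactly the Einstein relation <x^2> = 2Dt that bridges microscopic randomness and macroscopic diffusion.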

  14. BOOK REVIEW: New Directions in Statistical Physics: Econophysics, Bioinformatics, and Pattern Recognition

    Science.gov (United States)

    Grassberger, P.

    2004-10-01

    This book contains 18 contributions from different authors. Its subtitle `Econophysics, Bioinformatics, and Pattern Recognition' says more precisely what it is about: not so much about central problems of conventional statistical physics like equilibrium phase transitions and critical phenomena, but about its interdisciplinary applications. After a long period of specialization, physicists have, over the last few decades, found more and more satisfaction in breaking out of the limitations set by the traditional classification of sciences. Indeed, this classification had never been strict, and physicists in particular had always ventured into other fields. Helmholtz, in the middle of the 19th century, had considered himself a physicist when working on physiology, stressing that the physics of animate nature is as much a legitimate field of activity as the physics of inanimate nature. Later, Max Delbrück and Francis Crick did for experimental biology what Schrödinger did for its theoretical foundation. And many of the experimental techniques used in chemistry, biology, and medicine were developed by a steady stream of talented physicists who left their proper discipline to venture out into the wider world of science. The development we have witnessed over the last thirty years or so is different. It started with neural networks, where methods could be applied which had been developed for spin glasses, but today's list includes vehicular traffic (driven lattice gases), geology (self-organized criticality), economics (fractal stochastic processes and large-scale simulations), engineering (dynamical chaos), and many others. By staying in the physics departments, these activities have transformed the physics curriculum and the view physicists have of themselves. In many departments there are now courses on econophysics or on biological physics, and some universities offer degrees in the physics of traffic or in econophysics. In order to document this change of attitude

  15. A new formalism for non-extensive physical systems: Tsallis thermostatistics

    International Nuclear Information System (INIS)

    Tirnakli, U.; Bueyuekkilic, F.; Demirhan, D.

    1999-01-01

    Although Boltzmann-Gibbs (BG) statistics provides a suitable tool which enables us to handle a large number of physical systems satisfactorily, it has some basic restrictions. Recently a non-extensive thermostatistics has been proposed by C. Tsallis to handle non-extensive physical systems and, up to now, besides the generalization of some of the conventional concepts, the formalism has been successful in a number of physical applications. In this study, our effort is to introduce Tsallis thermostatistics in some detail and to emphasize its achievements on physical systems by noting the recent developments along this line

  16. Statistical Physics of Nanoparticles in the Gas Phase

    CERN Document Server

    Hansen, Klavs

    2013-01-01

    Thermal processes are ubiquitous and an understanding of thermal phenomena is essential for a complete description of the physics of nanoparticles, both for the purpose of modeling the dynamics of the particles and for the correct interpretation of experimental data. This book has the twofold aim of presenting coherently the relevant results coming from the recent scientific literature and of guiding the readers through the process of deriving results, enabling them to explore the limits of the mathematical approximations and test the power of the method. The book is focused on the fundamental properties of nanosystems in the gas phase. For this reason there is a strong emphasis on microcanonical physics. Each chapter is enriched with exercises, and three appendices provide additional useful material.

  17. Towards the unified non-classical physics: account for quantum fluctuations in equilibrium thermodynamics via the effective temperature

    Directory of Open Access Journals (Sweden)

    Yu.G.Rudoy

    2005-01-01

    The concept of effective temperature T*(T0, T) is used in order to approximately "quantize" the thermodynamic functions of a dynamical object in thermal equilibrium with a thermal bath held at constant temperature T; here T0 = E0/kB, where E0 is the ground-state energy and kB the Boltzmann constant, is the characteristic ``quantum'' temperature of the system itself. On these grounds an extensive comparative investigation is carried out for the ``standard model'' of statistical mechanics: the one-dimensional harmonic oscillator (HO). Three well-known approaches are considered and their thermodynamic consequences thoroughly studied. These are: the exact quantum, or non-classical, Planck-Einstein approach; the intermediate, or semiclassical, Bloch-Wigner approach; and, finally, the purely classical, or Maxwell-Boltzmann, approach.
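The three approaches compared in the paper yield standard expressions for the mean energy of the harmonic oscillator. The following sketch (our own, with hbar*omega = kB = 1, not the paper's notation) shows how the semiclassical Bloch-Wigner correction interpolates between the classical and quantum results:

```python
import math

HBAR_OMEGA = 1.0  # one oscillator quantum, in energy units

def u_quantum(kT):
    """Planck-Einstein mean energy, including the zero-point energy:
    U = (hbar*omega/2) * coth(hbar*omega / (2 kT))."""
    return 0.5 * HBAR_OMEGA / math.tanh(0.5 * HBAR_OMEGA / kT)

def u_semiclassical(kT):
    """Bloch-Wigner expansion to the first quantum correction."""
    return kT + HBAR_OMEGA ** 2 / (12.0 * kT)

def u_classical(kT):
    """Maxwell-Boltzmann equipartition result."""
    return kT
```

At high temperature all three agree (equipartition); at low temperature the quantum energy saturates at the zero-point value E0 = hbar*omega/2, which is the regime the effective temperature T* is designed to capture.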

  18. Statistical physics of learning from examples: a brief introduction

    International Nuclear Information System (INIS)

    Broeck, C. van den

    1994-01-01

    The problem of how one can learn from examples is illustrated in the case of a student perceptron trained by the Hebb rule on examples generated by a teacher perceptron. Two basic quantities are calculated: the training error and the generalization error. The obtained results are found to be typical. Other training rules are discussed. For the case of an Ising student with an Ising teacher, the existence of a first-order phase transition is shown. Special effects such as dilution, queries, rejection, etc. are discussed and some results for multilayer networks are reviewed. In particular, the properties of a self-similar committee machine are derived. Finally, we discuss the statistics of generalization, with a review of the Hoeffding inequality, the Dvoretzky-Kiefer-Wolfowitz theorem and the Vapnik-Chervonenkis theorem. (author). 29 refs, 6 figs
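A minimal teacher-student simulation of the Hebb-rule setup described above (our own sketch; the dimensions, sample sizes and seed are arbitrary): the generalization error, estimated as the fraction of fresh examples on which student and teacher disagree, decreases as training examples accumulate.

```python
import random

def hebb_generalization_error(n, p, n_test, rng):
    """Train a student perceptron on p examples from a random teacher
    via the Hebb rule w = sum_mu y_mu x_mu, then estimate the
    generalization error on n_test fresh random inputs."""
    teacher = [rng.choice((-1, 1)) for _ in range(n)]
    student = [0.0] * n
    for _ in range(p):
        x = [rng.choice((-1, 1)) for _ in range(n)]
        y = 1 if sum(t * xi for t, xi in zip(teacher, x)) >= 0 else -1
        for i in range(n):
            student[i] += y * x[i]          # Hebbian update
    errors = 0
    for _ in range(n_test):
        x = [rng.choice((-1, 1)) for _ in range(n)]
        y_t = 1 if sum(t * xi for t, xi in zip(teacher, x)) >= 0 else -1
        y_s = 1 if sum(s * xi for s, xi in zip(student, x)) >= 0 else -1
        errors += y_t != y_s
    return errors / n_test

rng = random.Random(42)
err_few = hebb_generalization_error(n=50, p=10, n_test=2000, rng=rng)
err_many = hebb_generalization_error(n=50, p=500, n_test=2000, rng=rng)
```

In the statistical-physics treatment this decay is controlled by the load alpha = p/n, with the error depending only on the teacher-student overlap, which the simulation reproduces qualitatively.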

  19. Monte Carlo simulation in statistical physics an introduction

    CERN Document Server

    Binder, Kurt

    1992-01-01

    The Monte Carlo method is a computer simulation method which uses random numbers to simulate statistical fluctuations. The method is used to model complex systems with many degrees of freedom. Probability distributions for these systems are generated numerically, and the method then yields numerically exact information on the models. Such simulations may be used to see how well a model system approximates a real one, or to see how valid the assumptions are in an analytical theory. A short and systematic theoretical introduction to the method forms the first part of this book. The second part is a practical guide with plenty of examples and exercises for the student. Problems treated by simple sampling (random and self-avoiding walks, percolation clusters, etc.) are included, along with such topics as finite-size effects and guidelines for the analysis of Monte Carlo simulations. The two parts together provide an excellent introduction to the theory and practice of Monte Carlo simulations
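A minimal sketch in the spirit of this introduction (our own; the lattice size, temperatures, and sweep count are arbitrary choices, not taken from the book): Metropolis sampling of the two-dimensional Ising model, where random numbers drive both the site selection and the acceptance of thermal fluctuations.

```python
import math
import random

def ising_magnetization(L, T, sweeps, rng):
    """Metropolis sampling of the 2-D Ising model (J = kB = 1) on an
    L x L periodic lattice; returns |magnetization| per spin after the
    given number of sweeps, starting from the fully ordered state."""
    spin = [[1] * L for _ in range(L)]
    for _ in range(sweeps):
        for _ in range(L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            nn = (spin[(i + 1) % L][j] + spin[(i - 1) % L][j]
                  + spin[i][(j + 1) % L] + spin[i][(j - 1) % L])
            dE = 2.0 * spin[i][j] * nn      # energy cost of flipping (i, j)
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                spin[i][j] = -spin[i][j]    # Metropolis acceptance
    m = sum(sum(row) for row in spin) / (L * L)
    return abs(m)

rng = random.Random(1)
m_cold = ising_magnetization(L=10, T=1.5, sweeps=500, rng=rng)  # ordered phase
m_hot = ising_magnetization(L=10, T=5.0, sweeps=500, rng=rng)   # disordered
```

Below the critical temperature (Tc ~ 2.27 in these units) the sampled magnetization stays close to 1, while well above it the lattice disorders; comparing such simulation output with analytical theory is precisely the use case the book describes.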

  20. Statistical and physical study of one-sided planetary nebulae.

    Science.gov (United States)

    Ali, A.; El-Nawawy, M. S.; Pfleiderer, J.

    The authors have investigated the spatial orientation of one-sided planetary nebulae. Most of them, if not all, are interacting with the interstellar medium. Seventy percent of the nebulae in the sample have inclination angles larger than 45° to the Galactic plane and 30% of the inclination angles are less than 45°. Most of the selected objects are old, evolved planetary nebulae with large dimensions, and not far away from the Galactic plane. Seventy-five percent of the objects are within 160 pc of the Galactic plane. The enhanced concavity arc can be explained physically as a result of the 'planetary nebulae-interstellar matter' interaction. The authors discuss the possible effect of the interstellar magnetic field in the concavity regions.

  1. Assessing the hydrogeochemical processes affecting groundwater pollution in arid areas using an integration of geochemical equilibrium and multivariate statistical techniques

    International Nuclear Information System (INIS)

    El Alfy, Mohamed; Lashin, Aref; Abdalla, Fathy; Al-Bassam, Abdulaziz

    2017-01-01

    Rapid economic expansion poses serious problems for groundwater resources in arid areas, which typically have high rates of groundwater depletion. In this study, an integration of hydrochemical investigations involving chemical and statistical analyses is conducted to assess the factors controlling hydrochemistry and potential pollution in an arid region. Fifty-four groundwater samples were collected from the Dhurma aquifer in Saudi Arabia, and twenty-one physicochemical variables were examined for each sample. Spatial patterns of salinity and nitrate were mapped using fitted variograms. The nitrate spatial distribution shows that nitrate pollution is a persistent problem affecting a wide area of the aquifer. The hydrochemical investigations and cluster analysis reveal four significant clusters of groundwater zones. Five main factors were extracted, which explain >77% of the total data variance. These factors indicated that the chemical characteristics of the groundwater were influenced by rock–water interactions and anthropogenic factors. The identified clusters and factors were validated with hydrochemical investigations. The geogenic factors include the dissolution of various minerals (calcite, aragonite, gypsum, anhydrite, halite and fluorite) and ion exchange processes. The anthropogenic factors include the impact of irrigation return flows and the application of potassium, nitrate, and phosphate fertilizers. Over time, these anthropogenic factors will most likely contribute to further declines in groundwater quality. - Highlights: • Hydrochemical investigations were carried out in Dhurma aquifer in Saudi Arabia. • The factors controlling potential groundwater pollution in an arid region were studied. • Chemical and statistical analyses are integrated to assess these factors. • Five main factors were extracted, which explain >77% of the total data variance. • The chemical characteristics of the groundwater were influenced by rock–water interactions

  2. Excel 2013 for physical sciences statistics a guide to solving practical problems

    CERN Document Server

    Quirk, Thomas J; Horton, Howard F

    2016-01-01

    This book shows the capabilities of Microsoft Excel in teaching physical sciences statistics effectively. Similar to the previously published Excel 2010 for Physical Sciences Statistics, this book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical science problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in science courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2013 for Physical Sciences Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their ...

  3. Statistical Uncertainty Quantification of Physical Models during Reflood of LBLOCA

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Deog Yeon; Seul, Kwang Won; Woo, Sweng Woong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    The use of best-estimate (BE) computer codes in safety analysis for loss-of-coolant accidents (LOCA) is the major trend in many countries to reduce significant conservatism. A key feature of this BE evaluation requires the licensee to quantify the uncertainty of the calculations. It is therefore very important how the uncertainty distribution is determined before conducting the uncertainty evaluation. Uncertainty includes that of physical models and correlations, plant operational parameters, and so forth. The quantification process is often performed mainly by subjective expert judgment or obtained from reference documents of the computer code. In this respect, more mathematical methods are needed to reasonably determine the uncertainty ranges. The first uncertainty quantification is performed with various increments for two influential uncertainty parameters to obtain the calculated responses and their derivatives. Different data sets with two influential uncertainty parameters for the FEBA tests are chosen, applying stricter criteria for selecting responses and their derivatives, which may be considered as the user's effect in the CIRCÉ applications. Finally, three influential uncertainty parameters are considered to study the effect of the number of uncertainty parameters, given the limitations of the CIRCÉ method. With the determined uncertainty ranges, uncertainty evaluations for the FEBA tests are performed to check whether the experimental responses, such as the cladding temperature or pressure drop, are inside the limits of the calculated uncertainty bounds. A confirmation step will be performed to evaluate the quality of the information in the case of the different reflooding PERICLES experiments. The uncertainty ranges of the physical models in the MARS-KS thermal-hydraulic code during reflooding were quantified by the CIRCÉ method using FEBA experiment tests, instead of expert judgment. Also, through the uncertainty evaluation for the FEBA and PERICLES tests, it was confirmed

  4. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  5. Steam generators clogging diagnosis through physical and statistical modelling

    International Nuclear Information System (INIS)

    Girard, S.

    2012-01-01

    Steam generators are massive heat exchangers feeding the turbines of pressurised water nuclear power plants. Internal parts of steam generators foul up with iron oxides which gradually close some holes intended for the passage of the fluid. This phenomenon, called clogging, causes safety issues, and means to assess it are needed to optimise the maintenance strategy. The approach investigated in this thesis is the analysis of steam generators' dynamic behaviour during power transients with a one-dimensional physical model. Two improvements to the model have been implemented. One was taking into account flows orthogonal to the modelling axis; the other was introducing a slip between phases accounting for the velocity difference between liquid water and steam. These two elements increased the model's degrees of freedom and improved the agreement of the simulation with plant data. A new calibration and validation methodology has been proposed to assess the robustness of the model. The initial inverse problem was ill-posed: different clogging spatial configurations can produce identical responses. The relative importance of clogging, depending on its localisation, has been estimated by sensitivity analysis with the Sobol' method. The dimension of the model's functional output had previously been reduced by principal component analysis. Finally, the input dimension has been reduced by a technique called sliced inverse regression. Based on this new framework, a new diagnosis methodology, more robust and better understood than the existing one, has been proposed. (author)

  6. Assessing the hydrogeochemical processes affecting groundwater pollution in arid areas using an integration of geochemical equilibrium and multivariate statistical techniques.

    Science.gov (United States)

    El Alfy, Mohamed; Lashin, Aref; Abdalla, Fathy; Al-Bassam, Abdulaziz

    2017-10-01

    Rapid economic expansion poses serious problems for groundwater resources in arid areas, which typically have high rates of groundwater depletion. In this study, an integration of hydrochemical investigations involving chemical and statistical analyses is conducted to assess the factors controlling hydrochemistry and potential pollution in an arid region. Fifty-four groundwater samples were collected from the Dhurma aquifer in Saudi Arabia, and twenty-one physicochemical variables were examined for each sample. Spatial patterns of salinity and nitrate were mapped using fitted variograms. The nitrate spatial distribution shows that nitrate pollution is a persistent problem affecting a wide area of the aquifer. The hydrochemical investigations and cluster analysis reveal four significant clusters of groundwater zones. Five main factors were extracted, which explain >77% of the total data variance. These factors indicated that the chemical characteristics of the groundwater were influenced by rock-water interactions and anthropogenic factors. The identified clusters and factors were validated with hydrochemical investigations. The geogenic factors include the dissolution of various minerals (calcite, aragonite, gypsum, anhydrite, halite and fluorite) and ion exchange processes. The anthropogenic factors include the impact of irrigation return flows and the application of potassium, nitrate, and phosphate fertilizers. Over time, these anthropogenic factors will most likely contribute to further declines in groundwater quality. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Lectures on statistical mechanics

    CERN Document Server

    Bowler, M G

    1982-01-01

    Anyone dissatisfied with the almost ritual dullness of many 'standard' texts in statistical mechanics will be grateful for the lucid explanation and generally reassuring tone. Aimed at securing firm foundations for equilibrium statistical mechanics, topics of great subtlety are presented transparently and enthusiastically. Very little mathematical preparation is required beyond elementary calculus and prerequisites in physics are limited to some elementary classical thermodynamics. Suitable as a basis for a first course in statistical mechanics, the book is an ideal supplement to more convent

  8. Thermal transport in low dimensions from statistical physics to nanoscale heat transfer

    CERN Document Server

    2016-01-01

    Understanding non-equilibrium properties of classical and quantum many-particle systems is one of the goals of contemporary statistical mechanics. Besides its own interest for the theoretical foundations of irreversible thermodynamics (e.g. Fourier's law of heat conduction), this topic is also relevant for developing innovative ideas for nanoscale thermal management, with possible future applications to nanotechnologies and effective energetic resources. The first part of the volume (Chapters 1-6) describes the basic models, the phenomenology and the various theoretical approaches to understanding heat transport in low-dimensional lattices (1D and 2D). The methods described include equilibrium and nonequilibrium molecular dynamics simulations, hydrodynamic and kinetic approaches and the solution of stochastic models. The second part (Chapters 7-10) deals with applications to nano- and microscale heat transfer, as for instance phononic transport in carbon-based nanomaterials, including the prominent case of na...

  9. Excel 2016 for physical sciences statistics a guide to solving practical problems

    CERN Document Server

    Quirk, Thomas J; Horton, Howard F

    2016-01-01

    This book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical physical science problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel is an effective learning tool for quantitative analyses in physical science courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Physical Sciences Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel 2016 to statistical techniques necessary in their courses and work. Each chapter explains statistical formulas and directs the reader to use Excel commands to solve specific, easy-to-understand physical science problems. Practice problems are provided at the end of each chapter with their s...

  10. On estimating perturbative coefficients in quantum field theory and statistical physics

    International Nuclear Information System (INIS)

    Samuel, M.A.; Stanford Univ., CA

    1994-05-01

    The authors present a method for estimating perturbative coefficients in quantum field theory and statistical physics. They are able to obtain reliable error bars for each estimate. The results, in all cases, are excellent.

  11. Statistical physics of non-thermal phase transitions from foundations to applications

    CERN Document Server

    Abaimov, Sergey G

    2015-01-01

    Statistical physics can be used to better understand non-thermal complex systems—phenomena such as stock-market crashes, revolutions in society and in science, fractures in engineered materials and in the Earth’s crust, catastrophes, traffic jams, petroleum clusters, polymerization, self-organized criticality and many others exhibit behaviors resembling those of thermodynamic systems. In particular, many of these systems possess phase transitions identical to critical or spinodal phenomena in statistical physics. The application of the well-developed formalism of statistical physics to non-thermal complex systems may help to predict and prevent such catastrophes as earthquakes, snow-avalanches and landslides, failure of engineering structures, or economic crises. This book addresses the issue step-by-step, from phenomenological analogies between complex systems and statistical physics to more complex aspects, such as correlations, fluctuation-dissipation theorem, susceptibility, the concept of free ener...

  12. [Mental and physical equilibrium for better quality of care: experience of the Ravenna CNAI group].

    Science.gov (United States)

    Burrai, Francesco; Suprani, Riccarda

    2010-01-01

    The current orientation of health services is a progressive reduction of resources and a constant increase of efficiency and efficacy: this implies greater demands on health personnel, with the risk of stress and demotivation. In this context, well-balanced mental and physical conditions are essential, and health workers should be given all the support they need to obtain and maintain such conditions, also to avoid repercussions on patients. For this purpose a satellite group of the CNAI nursing association organized two training events based on guided imagery and mindfulness to increase self-awareness, aimed not only at nurses but also at rehabilitation and other health care workers. Results were evaluated using a questionnaire and demonstrated better awareness, fewer psychosomatic stress-related problems, and improved quality of life and well-being.

  13. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  14. Equilibrium thermodynamics

    CERN Document Server

    de Oliveira, Mário J

    2017-01-01

    This textbook provides an exposition of equilibrium thermodynamics and its applications to several areas of physics with particular attention to phase transitions and critical phenomena. The applications include several areas of condensed matter physics and include also a chapter on thermochemistry. Phase transitions and critical phenomena are treated according to the modern development of the field, based on the ideas of universality and on the Widom scaling theory. For each topic, a mean-field or Landau theory is presented to describe qualitatively the phase transitions. These theories include the van der Waals theory of the liquid-vapor transition, the Hildebrand-Heitler theory of regular mixtures, the Griffiths-Landau theory for multicritical points in multicomponent systems, the Bragg-Williams theory of order-disorder in alloys, the Weiss theory of ferromagnetism, the Néel theory of antiferromagnetism, the Devonshire theory for ferroelectrics and Landau-de Gennes theory of liquid crystals. This new edit...

  15. Functional and physical molecular size of the chicken hepatic lectin determined by radiation inactivation and sedimentation equilibrium analysis

    International Nuclear Information System (INIS)

    Steer, C.J.; Osborne, J.C. Jr.; Kempner, E.S.

    1990-01-01

    Radiation inactivation and sedimentation equilibrium analysis were used to determine the functional and physical size of the chicken hepatic membrane receptor that binds N-acetylglucosamine-terminated glycoproteins. Purified plasma membranes from chicken liver were irradiated with high energy electrons and assayed for 125I-agalactoorosomucoid binding. Increasing the dose of ionizing radiation resulted in a monoexponential decay in binding activity due to a progressive loss of binding sites. The molecular mass of the chicken lectin, determined in situ by target analysis, was 69,000 +/- 9,000 Da. When the same irradiated membranes were solubilized in Brij 58 and assayed, the binding protein exhibited a target size of 62,000 +/- 4,000 Da; in Triton X-100, the functional size of the receptor was 85,000 +/- 10,000 Da. Sedimentation equilibrium measurements of the purified binding protein yielded a lower limit molecular weight of 79,000 +/- 7,000. However, the solubilized lectin was detected as a heterogeneous population of oligomers with molecular weights as high as 450,000. Addition of calcium or calcium plus N-acetylglucosamine decreased the higher molecular weight species, but the lower limit molecular weights remained invariant. Similar results were determined when the chicken lectin was solubilized in Brij 58, C12E9, or 3-[(3-cholamidopropyl)dimethylammonio]-1-propane-sulfonic acid (CHAPS). Results from the present study suggest that in the plasma membrane, the functional species of the chicken hepatic lectin exists as a trimer. However, in detergent solution, the purified receptor forms a heterogeneous population of irreversible oligomers that exhibit binding activity proportional to size

  16. Statistical issues in searches for new phenomena in High Energy Physics

    Science.gov (United States)

    Lyons, Louis; Wardle, Nicholas

    2018-03-01

    Many analyses of data in High Energy Physics are concerned with searches for New Physics. We review the statistical issues that arise in such searches, and then illustrate these using the specific example of the recent successful search for the Higgs boson, produced in collisions between high energy protons at CERN’s Large Hadron Collider.

  17. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  18. SERC School on Computational Statistical Physics held at the Indian Institute of Technology

    CERN Document Server

    Ray, Purusattam

    2011-01-01

    The present book is an outcome of the SERC school on Computational Statistical Physics held at the Indian Institute of Technology, Guwahati, in December 2008. Numerical experimentation has played an extremely important role in statistical physics in recent years. Lectures given at the School covered a large number of topics of current and continuing interest. Based on lectures by active researchers in the field (Bikas Chakrabarti, S Chaplot, Deepak Dhar, Sanjay Kumar, Prabal Maiti, Sanjay Puri, Purusattam Ray, Sitangshu Santra and Subir Sarkar), the nine chapters comprising the book deal with topics that range from the fundamentals of the field, to problems and questions that are at the very forefront of current research. This book aims to expose the graduate student to the basic as well as advanced techniques in computational statistical physics. Following a general introduction to statistical mechanics and critical phenomena, the various chapters cover Monte Carlo and molecular dynamics simulation methodolog...

  19. Physics Teachers and Students: A Statistical and Historical Analysis of Women

    Science.gov (United States)

    Gregory, Amanda

    2009-10-01

    Historically, women have been denied an education comparable to that available to men. Since women have been allowed into institutions of higher learning, they have been studying and earning physics degrees. The aim of this poster is to discuss the statistical relationship between the number of women enrolled in university physics programs and the number of female physics faculty members. Special care has been given to examining the statistical data in the context of the social climate at the time that these women were teaching or pursuing their education.

  20. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  1. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  2. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  3. Physics-based statistical model and simulation method of RF propagation in urban environments

    Science.gov (United States)

    Pao, Hsueh-Yuan; Dvorak, Steven L.

    2010-09-14

    A physics-based statistical model and simulation/modeling method and system of electromagnetic wave propagation (wireless communication) in urban environments. In particular, the model is a computationally efficient closed-form parametric model of RF propagation in an urban environment which is extracted from a physics-based statistical wireless channel simulation method and system. The simulation divides the complex urban environment into a network of interconnected urban canyon waveguides which can be analyzed individually; calculates spectral coefficients of modal fields in the waveguides excited by the propagation using a database of statistical impedance boundary conditions which incorporates the complexity of building walls in the propagation model; determines statistical parameters of the calculated modal fields; and determines a parametric propagation model based on the statistical parameters of the calculated modal fields from which predictions of communications capability may be made.

  4. The Practicality of Statistical Physics Handout Based on KKNI and the Constructivist Approach

    Science.gov (United States)

    Sari, S. Y.; Afrizon, R.

    2018-04-01

    Observation of statistical physics lectures shows that: 1) the performance of lecturers, the social climate, students’ competence and the soft skills needed at work are in the sufficient category; 2) students find the statistical physics lectures difficult to follow because the material is abstract; 3) 40.72% of students need additional support in the form of repetition, practice questions and structured tasks; and 4) the depth of the statistical physics material needs to be improved gradually and in a structured way. This indicates that learning materials in accordance with The Indonesian National Qualification Framework, or Kerangka Kualifikasi Nasional Indonesia (KKNI), together with an appropriate learning approach, are needed to help lecturers and students in lectures. The authors have designed statistical physics handouts that meet the very valid criterion (90.89%) according to expert judgment. In addition, the practicality of the handouts also needs to be considered, so that they are easy to use, interesting and efficient in lectures. The purpose of this research is to determine the practicality of a statistical physics handout based on KKNI and a constructivist approach. This research is part of a research-and-development effort following the 4-D model developed by Thiagarajan, and has reached the development testing of the Develop stage. Data were collected using a questionnaire distributed to lecturers and students and analyzed descriptively in the form of percentages. The analysis of the questionnaire shows that the statistical physics handout meets the very practical criterion. The conclusion of this study is that statistical physics handouts based on KKNI and a constructivist approach are practical for use in lectures.

  5. Multivariate statistical methods and data mining in particle physics (4/4)

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    The lectures will cover multivariate statistical methods and their applications in High Energy Physics. The methods will be viewed in the framework of a statistical test, as used e.g. to discriminate between signal and background events. Topics will include an introduction to the relevant statistical formalism, linear test variables, neural networks, probability density estimation (PDE) methods, kernel-based PDE, decision trees and support vector machines. The methods will be evaluated with respect to criteria relevant to HEP analyses such as statistical power, ease of computation and sensitivity to systematic effects. Simple computer examples that can be extended to more complex analyses will be presented.
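    As a toy illustration of the simplest of the test variables listed above, a Fisher linear discriminant can be built to separate "signal" from "background" events in a few lines of NumPy. The two Gaussian event classes, the features and the cut placement below are illustrative assumptions, not material from the lectures:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data: "signal" and "background" events with two features.
signal = rng.normal(loc=[1.0, 1.0], scale=0.5, size=(500, 2))
background = rng.normal(loc=[-1.0, -1.0], scale=0.5, size=(500, 2))

# Fisher discriminant: w proportional to S_W^{-1} (m_s - m_b), where S_W is
# the within-class covariance and m_s, m_b are the class means.
m_s, m_b = signal.mean(axis=0), background.mean(axis=0)
S_W = np.cov(signal, rowvar=False) + np.cov(background, rowvar=False)
w = np.linalg.solve(S_W, m_s - m_b)

# Test statistic t(x) = w . x, cut at the midpoint of the projected class means.
cut = 0.5 * (signal @ w).mean() + 0.5 * (background @ w).mean()
efficiency = ((signal @ w) > cut).mean()      # fraction of signal kept
rejection = ((background @ w) <= cut).mean()  # fraction of background removed

print(round(efficiency, 2), round(rejection, 2))
```

    For linearly separable classes like these, the single projected variable already gives high signal efficiency and background rejection; the neural networks and decision trees covered in the lectures generalize this to nonlinear decision boundaries.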

  6. Multivariate statistical methods and data mining in particle physics (2/4)

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    The lectures will cover multivariate statistical methods and their applications in High Energy Physics. The methods will be viewed in the framework of a statistical test, as used e.g. to discriminate between signal and background events. Topics will include an introduction to the relevant statistical formalism, linear test variables, neural networks, probability density estimation (PDE) methods, kernel-based PDE, decision trees and support vector machines. The methods will be evaluated with respect to criteria relevant to HEP analyses such as statistical power, ease of computation and sensitivity to systematic effects. Simple computer examples that can be extended to more complex analyses will be presented.

  7. Multivariate statistical methods and data mining in particle physics (1/4)

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    The lectures will cover multivariate statistical methods and their applications in High Energy Physics. The methods will be viewed in the framework of a statistical test, as used e.g. to discriminate between signal and background events. Topics will include an introduction to the relevant statistical formalism, linear test variables, neural networks, probability density estimation (PDE) methods, kernel-based PDE, decision trees and support vector machines. The methods will be evaluated with respect to criteria relevant to HEP analyses such as statistical power, ease of computation and sensitivity to systematic effects. Simple computer examples that can be extended to more complex analyses will be presented.

  8. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

    For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products

  9. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

    For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees

  10. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g., Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  11. Statistical state dynamics-based analysis of the physical mechanisms sustaining and regulating turbulence in Couette flow

    Science.gov (United States)

    Farrell, Brian F.; Ioannou, Petros J.

    2017-08-01

    This paper describes a study of the self-sustaining process in wall turbulence. The study is based on a second order statistical state dynamics model of Couette flow in which the state variables are the streamwise mean flow (first cumulant) and perturbation covariance (second cumulant). This statistical state dynamics model is closed by either setting the third cumulant to zero or by replacing it with a stochastic parametrization. Statistical state dynamics models with this form are referred to as S3T models. S3T models have been shown to self-sustain turbulence with a mean flow and second order perturbation structure similar to that obtained by direct numerical simulation of the equations of motion. The use of a statistical state dynamics model to study the physical mechanisms underlying turbulence has important advantages over the traditional approach of studying the dynamics of individual realizations of turbulence. One advantage is that the analytical structure of S3T statistical state dynamics models isolates the interaction between the mean flow and the perturbation components of the turbulence. Isolation of the interaction between these components reveals how this interaction underlies both the maintenance of the turbulence variance by transfer of energy from the externally driven flow to the perturbation components as well as the enforcement of the observed statistical mean turbulent state by feedback regulation between the mean and perturbation fields. Another advantage of studying turbulence using statistical state dynamics models of S3T form is that the analytical structure of S3T turbulence can be completely characterized. For example, the perturbation component of turbulence in the S3T system is demonstrably maintained by a parametric perturbation growth mechanism in which fluctuation of the mean flow maintains the perturbation field which in turn maintains the mean flow fluctuations in a synergistic interaction. Furthermore, the equilibrium

  12. Hunting Solomonoff's Swans: Exploring the Boundary Between Physics and Statistics in Hydrological Modeling

    Science.gov (United States)

    Nearing, G. S.

    2014-12-01

    Statistical models consistently out-perform conceptual models in the short term, however to account for a nonstationary future (or an unobserved past) scientists prefer to base predictions on unchanging and commutable properties of the universe - i.e., physics. The problem with physically-based hydrology models is, of course, that they aren't really based on physics - they are based on statistical approximations of physical interactions, and we almost uniformly lack an understanding of the entropy associated with these approximations. Thermodynamics is successful precisely because entropy statistics are computable for homogeneous (well-mixed) systems, and ergodic arguments explain the success of Newton's laws in describing systems that are fundamentally quantum in nature. Unfortunately, similar arguments do not hold for systems like watersheds that are heterogeneous at a wide range of scales. Ray Solomonoff formalized the situation in 1968 by showing that given infinite evidence, simultaneously minimizing model complexity and entropy in predictions always leads to the best possible model. The open question in hydrology is about what happens when we don't have infinite evidence - for example, when the future will not look like the past, or when one watershed does not behave like another. How do we isolate stationary and commutable components of watershed behavior? I propose that one possible answer to this dilemma lies in a formal combination of physics and statistics. In this talk I outline my recent analogue of Solomonoff's idea (his theorem was digital) that allows us to quantify the complexity/entropy tradeoff in a way that is intuitive to physical scientists. I show how to formally combine "physical" and statistical methods for model development in a way that allows us to derive the theoretically best possible model given any given physics approximation(s) and available observations. Finally, I apply an analogue of Solomonoff's theorem to evaluate the
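    The complexity/entropy tradeoff described here is the same idea that underlies two-part minimum-description-length (MDL) model selection: the best model minimizes the bits needed to encode its parameters plus the bits needed to encode the residual surprise of the data. The following sketch is not the talk's analogue of Solomonoff's theorem; it is a generic MDL illustration with assumed ingredients (synthetic quadratic data, a polynomial model class, and a BIC-style parameter penalty):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic record: a quadratic "physical law" observed through noise.
x = np.linspace(0.0, 1.0, 200)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(scale=0.1, size=x.size)

def description_length(degree):
    """Two-part score: code length for residuals plus for parameters."""
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    n = x.size
    # Gaussian code length for the data given the model (up to constants).
    data_cost = 0.5 * n * np.log2(resid.var())
    # BIC-style penalty: half a log2(n) per fitted parameter.
    model_cost = 0.5 * (degree + 1) * np.log2(n)
    return data_cost + model_cost

# Too few parameters leave structure in the residuals (high data cost);
# too many buy a tiny residual reduction at a growing model cost.
best = min(range(6), key=description_length)
print(best)  # the low-degree polynomial matching the generating law
```

    The "infinite evidence" caveat in the abstract shows up here as the penalty term: with finite data the choice of penalty (and of model class) is itself an approximation.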

  13. Generalized statistical criterion for distinguishing random optical groupings from physical multiple systems

    International Nuclear Information System (INIS)

    Anosova, Z.P.

    1988-01-01

    A statistical criterion is proposed for distinguishing between random and physical groupings of stars and galaxies. The criterion is applied to nearby wide multiple stars, triplets of galaxies in the list of Karachentsev, Karachentseva, and Shcherbanovskii, and double galaxies in the list of Dahari, in which the principal components are Seyfert galaxies. Systems that are almost certainly physical, probably physical, probably optical, and almost certainly optical are identified. The limiting difference between the radial velocities of the components of physical multiple galaxies is estimated

  14. Safety by statistics? A critical view on statistical methods applied in health physics; Sicherheit durch Statistik? Ein kritischer Blick auf die Anwendung statistischer Methoden im Strahlenschutz

    Energy Technology Data Exchange (ETDEWEB)

    Kraut, W. [Duale Hochschule Baden-Wuerttemberg (DHBW), Karlsruhe (Germany). Studiengang Sicherheitswesen

    2016-07-01

    The only proper way to describe uncertainties in health physics is by statistical means. But statistics can never replace your personal evaluation of effects, nor can it transmute randomness into certainty like an ''uncertainty laundry''. The paper discusses these problems as they arise in routine practical work.

  15. Classical Methods of Statistics With Applications in Fusion-Oriented Plasma Physics

    CERN Document Server

    Kardaun, Otto J W F

    2005-01-01

    Classical Methods of Statistics is a blend of theory and practical statistical methods written for graduate students and researchers interested in applications to plasma physics and its experimental aspects. It can also fruitfully be used by students majoring in probability theory and statistics. In the first part, the mathematical framework and some of the history of the subject are described. Many exercises help readers to understand the underlying concepts. In the second part, two case studies are presented, exemplifying discriminant analysis and multivariate profile analysis. The introductions of these case studies outline the context of magnetic plasma fusion research. In the third part, an overview of statistical software is given and, in particular, SAS and S-PLUS are discussed. In the last chapter, several datasets with guided exercises, predominantly from the ASDEX Upgrade tokamak, are included and their physical background is concisely described. The book concludes with a list of essential keyword transl...

  16. Identification of AE Bursts by Classification of Physical and Statistical Parameters

    International Nuclear Information System (INIS)

    Mieza, J.I.; Oliveto, M.E.; Lopez Pumarega, M.I.; Armeite, M.; Ruzzante, J.E.; Piotrkowski, R.

    2005-01-01

    Physical and statistical parameters, extracted with the Principal Components method from Acoustic Emission bursts recorded during triaxial deformation tests, were analyzed. The samples came from seamless steel tubes used in the petroleum industry, and some of them were provided with a protective coating. The purpose of our work was to distinguish bursts originating in the breakage of the coating from those originating in damage mechanisms in the bulk steel matrix. The analysis was performed by statistical distributions, fractal analysis and clustering methods.

  17. Statistical Plasma Physics in a Strong Magnetic Field: Paradigms and Problems

    Energy Technology Data Exchange (ETDEWEB)

    J.A. Krommes

    2004-03-19

    An overview is given of certain aspects of fundamental statistical theories as applied to strongly magnetized plasmas. Emphasis is given to the gyrokinetic formalism, the historical development of realizable Markovian closures, and recent results in the statistical theory of turbulent generation of long-wavelength flows that generalize and provide further physical insight to classic calculations of eddy viscosity. A Hamiltonian formulation of turbulent flow generation is described and argued to be very useful.

  18. A new universality class in corpus of texts; A statistical physics study

    Science.gov (United States)

    Najafi, Elham; Darooneh, Amir H.

    2018-05-01

    Text can be regarded as a complex system, and several methods from statistical physics can be used to study it. In this work, by means of statistical physics methods, we reveal new universal behaviors of texts associated with the fractality values of words in a text. The fractality measure indicates the importance of words in a text by considering the distribution pattern of words throughout the text. We observed a power-law relation between the fractality of a text and its vocabulary size, for both texts and corpora. We also observed this behavior in studying biological data.

  19. Information Theory - The Bridge Connecting Bounded Rational Game Theory and Statistical Physics

    Science.gov (United States)

    Wolpert, David H.

    2005-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality of all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. This paper shows that the same information theoretic mathematical structure, known as Product Distribution (PD) theory, addresses both issues. In this way, PD theory not only provides a principled formulation of bounded rationality and a set of new types of mean field theory in statistical physics; it also shows that those topics are fundamentally one and the same.

  20. Statistical physics inspired energy-efficient coded-modulation for optical communications.

    Science.gov (United States)

    Djordjevic, Ivan B; Xu, Lei; Wang, Ting

    2012-04-15

    Because Shannon's entropy can be obtained by Stirling's approximation of thermodynamics entropy, the statistical physics energy minimization methods are directly applicable to the signal constellation design. We demonstrate that statistical physics inspired energy-efficient (EE) signal constellation designs, in combination with large-girth low-density parity-check (LDPC) codes, significantly outperform conventional LDPC-coded polarization-division multiplexed quadrature amplitude modulation schemes. We also describe an EE signal constellation design algorithm. Finally, we propose the discrete-time implementation of D-dimensional transceiver and corresponding EE polarization-division multiplexed system. © 2012 Optical Society of America
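    The entropy connection invoked in this abstract is the standard textbook one. As a hedged sketch (not the authors' constellation-design derivation), Stirling's approximation takes the Boltzmann entropy of a multiplicity W into Shannon's form:

    ```latex
    S = k_B \ln W, \qquad W = \frac{N!}{\prod_i n_i!}
    % Stirling's approximation: \ln n! \approx n \ln n - n
    \ln W \approx N \ln N - N - \sum_i \left( n_i \ln n_i - n_i \right)
          = -N \sum_i p_i \ln p_i, \qquad p_i = \frac{n_i}{N}
    ```

    so that $S/(k_B N) \approx -\sum_i p_i \ln p_i$, which is Shannon's entropy of the distribution $\{p_i\}$ (using $\sum_i n_i = N$ to cancel the linear terms).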

  1. Non-equilibrium physics of neural networks for learning, memory and decision making: landscape and flux perspectives

    Science.gov (United States)

    Wang, Jin

    Cognitive behaviors are determined by underlying neural networks. Many brain functions, such as learning and memory, can be described by attractor dynamics. We developed a theoretical framework for global dynamics by quantifying the landscape associated with the steady state probability distributions and steady state curl flux, measuring the degree of non-equilibrium through detailed balance breaking. We found the dynamics and oscillations in human brains responsible for cognitive processes and physiological rhythm regulations are determined not only by the landscape gradient but also by the flux. We found that the flux is closely related to the degrees of the asymmetric connections in neural networks and is the origin of the neural oscillations. The neural oscillation landscape shows a closed-ring attractor topology. The landscape gradient attracts the network down to the ring. The flux is responsible for coherent oscillations on the ring. We suggest the flux may provide the driving force for associations among memories. Both landscape and flux determine the kinetic paths and speed of decision making. The kinetics and global stability of decision making are explored by quantifying the landscape topography through the barrier heights and the mean first passage time. The theoretical predictions are in agreement with experimental observations: more errors occur under time pressure. We quantitatively explored two mechanisms of the speed-accuracy tradeoff with speed emphasis and further uncovered the tradeoffs among speed, accuracy, and energy cost. Our results show an optimal balance among speed, accuracy, and the energy cost in decision making. We uncovered possible mechanisms of changes of mind and how mind changes improve performance in decision processes. Our landscape approach can help facilitate an understanding of the underlying physical mechanisms of cognitive processes and identify the key elements in neural networks.

  2. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters, covering: the basic concepts and meaning of statistical thermodynamics, Maxwell-Boltzmann statistics, ensembles, thermodynamic functions and fluctuations, statistical dynamics of independent-particle systems, ideal molecular systems, chemical equilibrium and chemical reaction rates in ideal gas mixtures, classical statistical thermodynamics, the ideal lattice model, lattice statistics and non-ideal lattice models, imperfect-gas theory of liquids, the theory of solutions, the statistical thermodynamics of interfaces, the statistical thermodynamics of macromolecular systems, and quantum statistics

  3. Using the Science Writing Heuristic To Move toward an Inquiry-Based Laboratory Curriculum: An Example from Physical Equilibrium.

    Science.gov (United States)

    Rudd, James A., II; Greenbowe, Thomas J.; Hand, Brian M.; Legg, Margaret J.

    2001-01-01

    Investigates the effects of the Science Writing Heuristic (SWH) format on students' achievement, thinking abilities and motivation. Focuses on distribution equilibrium and assesses student understanding by studying metacognitive and practical factors. (Contains 17 references.) (Author/YDS)

  4. Data Analysis and Graphing in an Introductory Physics Laboratory: Spreadsheet versus Statistics Suite

    Science.gov (United States)

    Peterlin, Primoz

    2010-01-01

    Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared. (Contains 7…

  5. Data analysis and graphing in an introductory physics laboratory: spreadsheet versus statistics suite

    International Nuclear Information System (INIS)

    Peterlin, Primoz

    2010-01-01

    Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared.

  6. Efficient pan-European river flood hazard modelling through a combination of statistical and physical models

    NARCIS (Netherlands)

    Paprotny, D.; Morales Napoles, O.; Jonkman, S.N.

    2017-01-01

    Flood hazard is currently being researched on continental and global scales, using models of increasing complexity. In this paper we investigate a different, simplified approach, which combines statistical and physical models in place of conventional rainfall-run-off models to carry out flood

  7. Competitive agents in a market: Statistical physics of the minority game

    Science.gov (United States)

    Sherrington, David

    2007-10-01

    A brief review is presented of the minority game, a simple frustrated many-body system stimulated by considerations of a market of competitive speculative agents. Its cooperative behaviour exhibits phase transitions and both ergodic and non-ergodic regimes. It provides novel challenges to statistical physics, reminiscent of those of mean-field spin glasses.

  8. Methods and applications of statistics in engineering, quality control, and the physical sciences

    CERN Document Server

    Balakrishnan, N

    2011-01-01

    Inspired by the Encyclopedia of Statistical Sciences, Second Edition (ESS2e), this volume presents a concise, well-rounded focus on the statistical concepts and applications that are essential for understanding gathered data in the fields of engineering, quality control, and the physical sciences. The book successfully upholds the goals of ESS2e by combining both previously-published and newly developed contributions written by over 100 leading academics, researchers, and practitioners in a comprehensive, approachable format. The result is a succinct reference that unveils modern, cutting-edge approaches to acquiring and analyzing data across diverse subject areas within these three disciplines, including operations research, chemistry, physics, the earth sciences, electrical engineering, and quality assurance. In addition, techniques related to survey methodology, computational statistics, and operations research are discussed, where applicable. Topics of coverage include: optimal and stochastic control, arti...

  9. Combination of statistical and physically based methods to assess shallow slide susceptibility at the basin scale

    Science.gov (United States)

    Oliveira, Sérgio C.; Zêzere, José L.; Lajas, Sara; Melo, Raquel

    2017-07-01

    Approaches used to assess shallow slide susceptibility at the basin scale are conceptually different depending on the use of statistical or physically based methods. The former are based on the assumption that the same causes are more likely to produce the same effects, whereas the latter are based on the comparison between forces which tend to promote movement along the slope and the counteracting forces that are resistant to motion. Within this general framework, this work tests two hypotheses: (i) although conceptually and methodologically distinct, the statistical and deterministic methods generate similar shallow slide susceptibility results regarding the model's predictive capacity and spatial agreement; and (ii) the combination of shallow slide susceptibility maps obtained with statistical and physically based methods, for the same study area, generate a more reliable susceptibility model for shallow slide occurrence. These hypotheses were tested at a small test site (13.9 km2) located north of Lisbon (Portugal), using a statistical method (the information value method, IV) and a physically based method (the infinite slope method, IS). The landslide susceptibility maps produced with the statistical and deterministic methods were combined into a new landslide susceptibility map. The latter was based on a set of integration rules defined by the cross tabulation of the susceptibility classes of both maps and analysis of the corresponding contingency tables. The results demonstrate a higher predictive capacity of the new shallow slide susceptibility map, which combines the independent results obtained with statistical and physically based models. Moreover, the combination of the two models allowed the identification of areas where the results of the information value and the infinite slope methods are contradictory. Thus, these areas were classified as uncertain and deserve additional investigation at a more detailed scale.
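    The cross-tabulation step described above can be sketched in miniature. The class labels and integration rules below are hypothetical illustrations of the idea (agreement keeps the class, mild disagreement is resolved conservatively, strong contradiction is flagged), not the actual rules defined by Oliveira et al.:

    ```python
    # Illustrative combination of two landslide-susceptibility maps by cross
    # tabulation. Classes: 1 = low, 2 = moderate, 3 = high susceptibility.
    # The integration rules are hypothetical, for demonstration only.
    def combine(stat_class, phys_class):
        """Integrate one cell's statistical (IV) and physical (IS) classes."""
        if stat_class == phys_class:
            return stat_class                    # models agree: keep common class
        if abs(stat_class - phys_class) == 1:
            return max(stat_class, phys_class)   # mild disagreement: be conservative
        return "uncertain"                       # contradiction: flag for field work

    # Two tiny example maps, cell by cell.
    iv_map = [1, 2, 3, 3, 1]
    is_map = [1, 3, 3, 1, 2]
    combined = [combine(a, b) for a, b in zip(iv_map, is_map)]
    # combined -> [1, 3, 3, 'uncertain', 2]
    ```

    The "uncertain" cells are exactly those the authors single out for additional, more detailed investigation.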

  10. Statistical mechanics of driven diffusive systems

    CERN Document Server

    Schmittmann, B

    1995-01-01

    Far-from-equilibrium phenomena, while abundant in nature, are not nearly as well understood as their equilibrium counterparts. On the theoretical side, progress is slowed by the lack of a simple framework, such as the Boltzmann-Gibbs paradigm in the case of equilibrium thermodynamics. On the experimental side, the enormous structural complexity of real systems poses serious obstacles to comprehension. Similar difficulties have been overcome in equilibrium statistical mechanics by focusing on model systems. Even if they seem too simplistic for known physical systems, models give us considerable insight, provided they capture the essential physics. They serve as important theoretical testing grounds where the relationship between the generic physical behavior and the key ingredients of a successful theory can be identified and understood in detail. Within the vast realm of non-equilibrium physics, driven diffusive systems form a subset with particularly interesting properties. As a prototype model for these syst...

  11. Reflections on Gibbs: From Statistical Physics to the Amistad V3.0

    Science.gov (United States)

    Kadanoff, Leo P.

    2014-07-01

    This note is based upon a talk given at an APS meeting in celebration of the achievements of J. Willard Gibbs. J. Willard Gibbs, the younger, was the first American physical sciences theorist. He was one of the inventors of statistical physics. He introduced and developed the concepts of phase space, phase transitions, and thermodynamic surfaces in a remarkably correct and elegant manner. These three concepts form the basis of different areas of physics. The connection among these areas has been a subject of deep reflection from Gibbs' time to our own. This talk therefore celebrated Gibbs by describing modern ideas about how different parts of physics fit together. I finished with a more personal note. Our own J. Willard Gibbs had all his many achievements concentrated in science. His father, also J. Willard Gibbs, also a Professor at Yale, had one great non-academic achievement that remains unmatched in our day. I describe it.

  12. Statistical panorama of female physics graduate students for 2000-2010 in Peru

    Science.gov (United States)

    Cerón Loayza, María Luisa; Bravo Cabrejos, Jorge Aurelio

    2013-03-01

    We report the results of a statistical study on the number of women entering the undergraduate and master's programs of physics at Universidad Nacional Mayor de San Marcos in Peru. From 2006 through 2010, 13 female students entered the master's degree program but no females graduated with the degree. Considering that Peru is a developing country, a career in physics is not considered an attractive professional choice even for male students because it is thought that there are no work centers to practice this profession. We recommend that the causes preventing female physics students from completing their studies and research work be analyzed, and that strategies be planned to help women complete their academic work. We are considering getting help from the Peruvian Physics Society (SOPERFI) in order to draw more attention for our plan.

  13. Synthetic Earthquake Statistics From Physical Fault Models for the Lower Rhine Embayment

    Science.gov (United States)

    Brietzke, G. B.; Hainzl, S.; Zöller, G.

    2012-04-01

    As of today, seismic risk and hazard estimates mostly use pure empirical, stochastic models of earthquake fault systems tuned specifically to the vulnerable areas of interest. Although such models allow for reasonable risk estimates they fail to provide a link between the observed seismicity and the underlying physical processes. Solving a state-of-the-art fully dynamic description set of all relevant physical processes related to earthquake fault systems is likely not useful since it comes with a large number of degrees of freedom, poor constraints on its model parameters and a huge computational effort. Here, quasi-static and quasi-dynamic physical fault simulators provide a compromise between physical completeness and computational affordability and aim at providing a link between basic physical concepts and statistics of seismicity. Within the framework of quasi-static and quasi-dynamic earthquake simulators we investigate a model of the Lower Rhine Embayment (LRE) that is based upon seismological and geological data. We present and discuss statistics of the spatio-temporal behavior of generated synthetic earthquake catalogs with respect to simplification (e.g. simple two-fault cases) as well as to complication (e.g. hidden faults, geometric complexity, heterogeneities of constitutive parameters).

  14. Use of JANAF Tables in Equilibrium Calculations and Partition Function Calculations for an Undergraduate Physical Chemistry Course

    Science.gov (United States)

    Cleary, David A.

    2014-01-01

    The usefulness of the JANAF tables is demonstrated with specific equilibrium calculations. An emphasis is placed on the nature of standard chemical potential calculations. Also, the use of the JANAF tables for calculating partition functions is examined. In the partition function calculations, the importance of the zero of energy is highlighted.

  15. Micro-foundations for macroeconomics: New set-up based on statistical physics

    Science.gov (United States)

    Yoshikawa, Hiroshi

    2016-12-01

    Modern macroeconomics is built on "micro foundations": the optimization of micro agents such as consumers and firms is explicitly analyzed in the model. Toward this goal, the standard model presumes "the representative" consumer/firm and analyzes its behavior in detail. However, the macroeconomy consists of some 10^7 consumers and 10^6 firms. For the purpose of analyzing such a macro system, it is meaningless to pursue the micro behavior in detail. In this respect, there is no essential difference between economics and physics. The methods of statistical physics can be usefully applied to the macroeconomy, and provide Keynesian economics with correct micro-foundations.

  16. Perspectives and challenges in statistical physics and complex systems for the next decade

    CERN Document Server

    Raposo, Ernesto P; Gomes Eleutério da Luz, Marcos

    2014-01-01

    Statistical Physics (SP) has followed an unusual evolutionary path in science. Originally aiming to provide a fundamental basis for another important branch of Physics, namely Thermodynamics, SP gradually became an independent field of research in its own right. But despite more than a century of steady progress, there are still plenty of challenges and open questions in the SP realm. In fact, the area is still rapidly evolving, in contrast to other branches of science, which already have well defined scopes and borderlines of applicability. This difference is due to the steadily expanding num

  17. Examples of the Application of Nonparametric Information Geometry to Statistical Physics

    Directory of Open Access Journals (Sweden)

    Giovanni Pistone

    2013-09-01

    We review a nonparametric version of Amari's information geometry, in which the set of positive probability densities on a given sample space is endowed with an atlas of charts to form a differentiable manifold modeled on Orlicz Banach spaces. This nonparametric setting is used to discuss typical problems in machine learning and statistical physics, such as black-box optimization, the Kullback-Leibler divergence, Boltzmann-Gibbs entropy and the Boltzmann equation.

  18. New exponential, logarithm and q-probability in the non-extensive statistical physics

    OpenAIRE

    Chung, Won Sang

    2013-01-01

    In this paper, a new exponential and logarithm related to non-extensive statistical physics are proposed by using the q-sum and q-product, which satisfy distributivity. We also discuss the q-mapping from an ordinary probability to a q-probability. The q-entropy defined via this q-probability is shown to be q-additive.
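    For orientation, the baseline objects of non-extensive statistics are the standard Tsallis q-deformed logarithm and exponential, sketched below. (The paper proposes a *different* pair built from the q-sum and q-product; that construction is not reproduced here.)

    ```python
    import math

    # Standard Tsallis q-deformed functions (not the paper's new definitions).
    def ln_q(x, q):
        """q-logarithm: (x^(1-q) - 1)/(1-q); reduces to ln(x) as q -> 1."""
        if q == 1.0:
            return math.log(x)
        return (x ** (1.0 - q) - 1.0) / (1.0 - q)

    def exp_q(x, q):
        """q-exponential [1 + (1-q)x]^(1/(1-q)), the inverse of ln_q
        on its domain; 0 where the bracket is non-positive."""
        if q == 1.0:
            return math.exp(x)
        base = 1.0 + (1.0 - q) * x
        return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0
    ```

    A quick sanity check: `exp_q(ln_q(x, q), q)` returns `x` for any `x > 0` in the domain, and both functions collapse to the ordinary `exp`/`log` in the limit q -> 1.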

  19. Quantum Entropy and Its Applications to Quantum Communication and Statistical Physics

    Directory of Open Access Journals (Sweden)

    Masanori Ohya

    2010-05-01

    Quantum entropy is a fundamental concept of quantum information that has recently been developed in various directions. We review the mathematical aspects of quantum entropy (entropies) and discuss some applications to quantum communication and statistical physics. All topics taken up here are somehow related to the quantum entropy that the present authors have studied. Many other fields recently developed in quantum information theory, such as quantum algorithms, quantum teleportation, quantum cryptography, etc., are discussed thoroughly in the book (reference number 60).

  20. Demonstration of fundamental statistics by studying timing of electronics signals in a physics-based laboratory

    Science.gov (United States)

    Beach, Shaun E.; Semkow, Thomas M.; Remling, David J.; Bradt, Clayton J.

    2017-07-01

    We have developed accessible methods to demonstrate fundamental statistics in several phenomena, in the context of teaching electronic signal processing in a physics-based college-level curriculum. A relationship between the exponential time-interval distribution and Poisson counting distribution for a Markov process with constant rate is derived in a novel way and demonstrated using nuclear counting. Negative binomial statistics is demonstrated as a model for overdispersion and justified by the effect of electronic noise in nuclear counting. The statistics of digital packets on a computer network are shown to be compatible with the fractal-point stochastic process leading to a power-law as well as generalized inverse Gaussian density distributions of time intervals between packets.
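    The exponential/Poisson relationship described above can be checked numerically. The sketch below (an illustration, not the authors' nuclear-counting setup) simulates a constant-rate Markov process by drawing exponential inter-arrival times, and verifies the Poisson signature that the event count in a window of length T has mean and variance both close to rate*T:

    ```python
    import random

    def poisson_counts(rate, window, n_trials, rng):
        """Count events in [0, window] for a process whose inter-arrival
        times are exponential with the given rate (a Poisson process)."""
        counts = []
        for _ in range(n_trials):
            t, n = 0.0, 0
            while True:
                t += rng.expovariate(rate)   # exponential waiting time
                if t > window:
                    break
                n += 1
            counts.append(n)
        return counts

    rng = random.Random(42)                  # fixed seed for reproducibility
    counts = poisson_counts(rate=5.0, window=2.0, n_trials=20000, rng=rng)
    mean = sum(counts) / len(counts)
    var = sum((c - mean) ** 2 for c in counts) / len(counts)
    # For a Poisson process, mean and var should both be near rate*window = 10.
    ```

    Overdispersion of the kind the authors model with negative binomial statistics would show up here as a sample variance clearly exceeding the sample mean.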

  1. Comparisons between physics-based, engineering, and statistical learning models for outdoor sound propagation.

    Science.gov (United States)

    Hart, Carl R; Reznicek, Nathan J; Wilson, D Keith; Pettit, Chris L; Nykaza, Edward T

    2016-05-01

    Many outdoor sound propagation models exist, ranging from highly complex physics-based simulations to simplified engineering calculations, and more recently, highly flexible statistical learning methods. Several engineering and statistical learning models are evaluated by using a particular physics-based model, namely, a Crank-Nicolson parabolic equation (CNPE), as a benchmark. Narrowband transmission loss values predicted with the CNPE, based upon a simulated data set of meteorological, boundary, and source conditions, act as simulated observations. In the simulated data set sound propagation conditions span from downward refracting to upward refracting, for acoustically hard and soft boundaries, and low frequencies. Engineering models used in the comparisons include the ISO 9613-2 method, Harmonoise, and Nord2000 propagation models. Statistical learning methods used in the comparisons include bagged decision tree regression, random forest regression, boosting regression, and artificial neural network models. Computed skill scores are relative to sound propagation in a homogeneous atmosphere over a rigid ground. Overall skill scores for the engineering noise models are 0.6%, -7.1%, and 83.8% for the ISO 9613-2, Harmonoise, and Nord2000 models, respectively. Overall skill scores for the statistical learning models are 99.5%, 99.5%, 99.6%, and 99.6% for bagged decision tree, random forest, boosting, and artificial neural network regression models, respectively.

  2. Semiconductor statistics

    CERN Document Server

    Blakemore, J S

    1962-01-01

    Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co

  3. Non-Equilibrium Thermodynamics in Multiphase Flows

    CERN Document Server

    Mauri, Roberto

    2013-01-01

    Non-equilibrium thermodynamics is a general framework that allows the macroscopic description of irreversible processes. This book introduces non-equilibrium thermodynamics and its applications to the rheology of multiphase flows. The subject is relevant to graduate students in chemical and mechanical engineering, physics and material science. This book is divided into two parts. The first part presents the theory of non-equilibrium thermodynamics, reviewing its essential features and showing, when possible, some applications. The second part of this book deals with how the general theory can be applied to model multiphase flows and, in particular, how to determine their constitutive relations. Each chapter contains problems at the end, the solutions of which are given at the end of the book. No prior knowledge of statistical mechanics is required; the necessary prerequisites are elements of transport phenomena and of thermodynamics. “The style of the book is mathematical, but nonetheless it remains very re...

  4. Plenary lectures of the divisions semiconductor physics, thin films, dynamics and statistical physics, magnetism, metal physics, surface physics, low temperature physics

    International Nuclear Information System (INIS)

    Roessler, U.

    1992-01-01

    This volume contains a selection of plenary and invited lectures from the Solid State Division spring meeting of the DPG (Deutsche Physikalische Gesellschaft) 1992 in Regensburg. The contributions come mainly from five fields of condensed-matter physics: doped fullerenes and high-Tc superconductors, surfaces, time-resolved and nonlinear optics, polymer melts, and low-dimensional semiconductor systems. (orig.)

  5. The usefulness of descriptive statistics in the interpretation of data on occupational physical activity of Poles

    Directory of Open Access Journals (Sweden)

    Elżbieta Biernat

    2014-12-01

    Background: The aim of this paper is to assess whether basic descriptive statistics are sufficient to interpret data on the physical activity of Poles within the occupational domain of life. Material and Methods: The study group consisted of 964 randomly selected Polish working professionals. The long version of the International Physical Activity Questionnaire (IPAQ) was used. Descriptive statistics included characteristics of variables using: mean (M), median (Me), maximal and minimal values (max–min), standard deviation (SD) and percentile values. Statistical inference was based on the comparison of variables at the significance level of 0.05 (Kruskal-Wallis and Pearson's Chi2 tests). Results: Occupational physical activity (OPA) was declared by 46.4% of respondents (vigorous – 23.5%, moderate – 30.2%, walking – 39.5%). The total OPA amounted to 2751.1 MET-min/week (Metabolic Equivalent of Task), with a very high standard deviation (SD = 5302.8) and max = 35 511 MET-min/week. It concerned different types of activities. Approximately 10% of respondents (above the 90th percentile) overstated the average. However, there was no significant difference depending on the character of the profession or the type of activity. The average sitting time was 256 min/day. As many as 39% of the respondents met the World Health Organization standards through OPA alone (42.5% of white-collar workers, 38% of administrative and technical employees and only 37.9% of physical workers). Conclusions: In the data analysis it is necessary to report quantiles to provide a fuller picture of the distributions of OPA in MET-min/week. It is also crucial to update the guidelines for data processing and analysis of the long version of IPAQ. It seems that 16 h of activity/day is not a sufficient criterion for excluding results from further analysis. Med Pr 2014;65(6):743–753
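    The paper's central point, that a mean misleads for heavily right-skewed activity data and quantiles are needed, can be illustrated with Python's standard library. The values below are synthetic and hypothetical, not the IPAQ sample:

    ```python
    import statistics

    # Synthetic right-skewed "MET-min/week" values: most respondents report
    # little activity, while one extreme outlier inflates the mean
    # (hypothetical numbers chosen only to mimic the shape of such data).
    met = [0, 0, 120, 240, 360, 480, 600, 720, 960, 35000]

    mean = statistics.mean(met)                 # dragged up by the outlier
    median = statistics.median(met)             # robust central value
    p90 = statistics.quantiles(met, n=10)[-1]   # 90th percentile cut point
    ```

    Here the mean (3848) is roughly nine times the median (420), so reporting the mean alone would badly overstate a typical respondent's activity; the median plus percentile values convey the distribution's shape.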

  6. Local equilibrium in bird flocks

    Science.gov (United States)

    Mora, Thierry; Walczak, Aleksandra M.; Del Castello, Lorenzo; Ginelli, Francesco; Melillo, Stefania; Parisi, Leonardo; Viale, Massimiliano; Cavagna, Andrea; Giardina, Irene

    2016-12-01

    The correlated motion of flocks is an example of global order emerging from local interactions. An essential difference with respect to analogous ferromagnetic systems is that flocks are active: animals move relative to each other, dynamically rearranging their interaction network. This non-equilibrium characteristic has been studied theoretically, but its impact on actual animal groups remains to be fully explored experimentally. Here, we introduce a novel dynamical inference technique, based on the principle of maximum entropy, which accommodates network rearrangements and overcomes the problem of slow experimental sampling rates. We use this method to infer the strength and range of alignment forces from data of starling flocks. We find that local bird alignment occurs on a much faster timescale than neighbour rearrangement. Accordingly, equilibrium inference, which assumes a fixed interaction network, gives results consistent with dynamical inference. We conclude that bird orientations are in a state of local quasi-equilibrium over the interaction length scale, providing firm ground for the applicability of statistical physics in certain active systems.

  7. Introduction to quantum statistical mechanics

    International Nuclear Information System (INIS)

    Bogolyubov, N.N.; Bogolyubov, N.N.

    1980-01-01

    In a set of lectures delivered at the Physics Department of Moscow State University as a special course for students, some basic ideas of quantum statistical mechanics are presented. Considered are, in particular, the Liouville equations in classical and quantum mechanics, the canonical distribution and thermodynamic functions, two-time correlation functions, and Green's functions in the theory of thermal equilibrium

  8. On the establishment of thermal equilibrium in simplest mechanical systems

    International Nuclear Information System (INIS)

    Kotsinyan, Ar.M.

    1987-01-01

    The process of the establishment of thermal equilibrium of damped oscillators and of a ''free'' particle interacting with the blackbody radiation field is considered. Special attention is paid to the principal role of the non-closedness of real systems, as well as to the irreversibility of the microscopic equations of motion, in the question of the foundations of statistical physics

  9. PHYSICS OF NON-GAUSSIAN FIELDS AND THE COSMOLOGICAL GENUS STATISTIC

    International Nuclear Information System (INIS)

    James, J. Berian

    2012-01-01

    We report a technique to calculate the impact of distinct physical processes inducing non-Gaussianity on the cosmological density field. A natural decomposition of the cosmic genus statistic into an orthogonal polynomial sequence allows complete expression of the scale-dependent evolution of the topology of large-scale structure, in which effects including galaxy bias, nonlinear gravitational evolution, and primordial non-Gaussianity may be delineated. The relationship of this decomposition to previous methods for analyzing the genus statistic is briefly considered and the following applications are made: (1) the expression of certain systematics affecting topological measurements, (2) the quantification of broad deformations from Gaussianity that appear in the genus statistic as measured in the Horizon Run simulation, and (3) the study of the evolution of the genus curve for simulations with primordial non-Gaussianity. These advances improve the treatment of flux-limited galaxy catalogs for use with this measurement and further the use of the genus statistic as a tool for exploring non-Gaussianity.

  10. Nonlinear Fluctuation Behavior of Financial Time Series Model by Statistical Physics System

    Directory of Open Access Journals (Sweden)

    Wuyang Cheng

    2014-01-01

    We develop a random financial time series model of the stock market based on one of the systems of statistical physics, the stochastic contact interacting system. The contact process is a continuous-time Markov process; one interpretation of this model is as a model for the spread of an infection, where the epidemic spreading mimics the interplay of local infections and recovery of individuals. From this financial model, we study the statistical behaviors of return time series, and the corresponding behaviors of returns for the Shanghai Stock Exchange Composite Index (SSECI) and the Hang Seng Index (HSI) are also comparatively studied. Further, we investigate the Zipf distribution and multifractal phenomena of returns and price changes. Zipf analysis and MF-DFA analysis are applied to investigate the nature of the fluctuations of the stock market.
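    A rank-size (Zipf) analysis of the kind applied to the returns above can be sketched with synthetic heavy-tailed data standing in for the SSECI/HSI series (the Pareto tail exponent 2.0 is an arbitrary illustration):

```python
import math, random

def zipf_slope(values):
    """Rank-size analysis: sort |values| in decreasing order and regress
    log(size) on log(rank); the slope estimates the Zipf exponent."""
    sizes = sorted((abs(v) for v in values), reverse=True)
    xs = [math.log(r) for r in range(1, len(sizes) + 1)]
    ys = [math.log(s) for s in sizes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

random.seed(1)
# Synthetic "returns" with a Pareto tail of exponent alpha = 2.0;
# for such data the rank-size slope should be close to -1/alpha = -0.5.
returns = [random.paretovariate(2.0) for _ in range(5000)]
slope = zipf_slope(returns)
print(round(slope, 2))
```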

  11. The Intuitive Physics of the Equilibrium of the Lever and of the Hydraulic Pressures: Implications for the Teaching of Elementary Physics

    Science.gov (United States)

    Masin, Sergio Cesare; Crivellaro, Francesco; Varotto, Diego

    2014-01-01

    The research field of intuitive physics focuses on discrepancies between theoretical and intuitive physical knowledge. Consideration of these discrepancies can help in the teaching of elementary physics. However, evidence shows that theoretical and intuitive physical knowledge may also be congruent. Physics teaching could further benefit from…

  12. Data analysis in high energy physics. A practical guide to statistical methods

    International Nuclear Information System (INIS)

    Behnke, Olaf; Schoerner-Sadenius, Thomas; Kroeninger, Kevin; Schott, Gregory

    2013-01-01

    This practical guide covers the essential tasks in statistical data analysis encountered in high energy physics and provides comprehensive advice for typical questions and problems. The basic methods for inferring results from data are presented as well as tools for advanced tasks such as improving the signal-to-background ratio, correcting detector effects, determining systematics and many others. Concrete applications are discussed in analysis walkthroughs. Each chapter is supplemented by numerous examples and exercises and by a list of literature and relevant links. The book targets a broad readership at all career levels - from students to senior researchers.

  13. Computing physical properties with quantum Monte Carlo methods with statistical fluctuations independent of system size.

    Science.gov (United States)

    Assaraf, Roland

    2014-12-01

    We show that the recently proposed correlated sampling without reweighting procedure extends the locality (asymptotic independence of the system size) of a physical property to the statistical fluctuations of its estimator. This makes the approach potentially vastly more efficient for computing space-localized properties in large systems compared with standard correlated methods. A proof is given for a large collection of noninteracting fragments. Calculations on hydrogen chains suggest that this behavior holds not only for systems displaying short-range correlations, but also for systems with long-range correlations.

  14. Geant4 electromagnetic physics for high statistic simulation of LHC experiments

    CERN Document Server

    Allison, J; Bagulya, A; Champion, C; Elles, S; Garay, F; Grichine, V; Howard, A; Incerti, S; Ivanchenko, V; Jacquemier, J; Maire, M; Mantero, A; Nieminen, P; Pandola, L; Santin, G; Sawkey, D; Schalicke, A; Urban, L

    2012-01-01

    An overview of the current status of the electromagnetic (EM) physics of the Geant4 toolkit is presented. Recent improvements are focused on the performance of large-scale production for the LHC and on the precision of simulation results over a wide energy range. Significant efforts have been made to improve the accuracy without compromising CPU speed for EM particle transport. New biasing options have been introduced, which are applicable to any EM process. These include algorithms to enhance and suppress processes, force interactions, or split secondary particles. It is shown that the performance of the EM sub-package is improved. We report extensions of the testing suite allowing high-statistics validation of EM physics, including validation of multiple scattering, bremsstrahlung and other models. Cross-checks between standard and low-energy EM models have been performed using evaluated data libraries and reference benchmark results.

  15. Supermathematics and its applications in statistical physics Grassmann variables and the method of supersymmetry

    CERN Document Server

    Wegner, Franz

    2016-01-01

    This text presents the mathematical concepts of Grassmann variables and the method of supersymmetry to a broad audience of physicists interested in applying these tools to disordered and critical systems, as well as related topics in statistical physics. Based on many courses and seminars held by the author, one of the pioneers in this field, the book gives the reader a systematic and tutorial introduction to the subject matter. The algebra and analysis of Grassmann variables are presented in part I. The mathematics of these variables is applied to a random matrix model, path integrals for fermions, dimer models and the Ising model in two dimensions. Supermathematics - the use of commuting and anticommuting variables on an equal footing - is the subject of part II. The properties of supervectors and supermatrices, which contain both commuting and Grassmann components, are treated in great detail, including the derivation of integral theorems. In part III, supersymmetric physical models are considered. While supersym...

  16. Data analysis in high energy physics a practical guide to statistical methods

    CERN Document Server

    Behnke, Olaf; Kröninger, Kevin; Schott, Grégory; Schörner-Sadenius, Thomas

    2013-01-01

    This practical guide covers the most essential statistics-related tasks and problems encountered in high-energy physics data analyses. It addresses both advanced students entering the field of particle physics as well as researchers looking for a reliable source on optimal separation of signal and background, determining signals or estimating upper limits, correcting the data for detector effects and evaluating systematic uncertainties. Each chapter is dedicated to a single topic and supplemented by a substantial number of both paper and computer exercises related to real experiments, with the solutions provided at the end of the book along with references. A special feature of the book is the analysis walk-throughs used to illustrate the application of the methods discussed beforehand. The authors give examples of data analysis, referring to real problems in HEP, and display the different stages of data analysis in a descriptive manner. The accompanying website provides more algorithms as well as up-to-date...

  17. Statistical physics of community ecology: a cavity solution to MacArthur’s consumer resource model

    Science.gov (United States)

    Advani, Madhu; Bunin, Guy; Mehta, Pankaj

    2018-03-01

    A central question in ecology is to understand the ecological processes that shape community structure. Niche-based theories have emphasized the important role played by competition for maintaining species diversity. Many of these insights have been derived using MacArthur’s consumer resource model (MCRM) or its generalizations. Most theoretical work on the MCRM has focused on small ecosystems with a few species and resources. However, theoretical insights derived from small ecosystems may not scale up to large ecosystems with many resources and species, because large systems with many interacting components often display new emergent behaviors that cannot be understood or deduced from analyzing smaller systems. To address these shortcomings, we develop a statistical physics inspired cavity method to analyze the MCRM when both the number of species and the number of resources are large. Unlike previous work in this limit, our theory addresses resource dynamics and resource depletion and demonstrates that species generically and consistently perturb their environments and significantly modify available ecological niches. We show how our cavity approach naturally generalizes niche theory to large ecosystems by accounting for the effect of collective phenomena on species invasion and ecological stability. Our theory suggests that such phenomena are a generic feature of large, natural ecosystems and must be taken into account when analyzing and interpreting community structure. It also highlights the important role that statistical-physics inspired approaches can play in furthering our understanding of ecology.
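    For reference, MacArthur's consumer resource model analyzed above is commonly written (notation varies between papers; this is one standard form) as coupled dynamics for species abundances N_i and resource abundances R_α:

```latex
\begin{aligned}
\frac{dN_i}{dt} &= N_i\Big(\sum_\alpha c_{i\alpha}\, w_\alpha R_\alpha - m_i\Big),\\
\frac{dR_\alpha}{dt} &= R_\alpha\big(K_\alpha - R_\alpha\big) - \sum_i N_i\, c_{i\alpha} R_\alpha,
\end{aligned}
```

with consumption preferences c_{iα}, resource values w_α, maintenance costs m_i and carrying capacities K_α; the cavity method treats the limit where both the species index i and the resource index α run over many components.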

  18. Quantifying fluctuations in economic systems by adapting methods of statistical physics

    Science.gov (United States)

    Stanley, H. E.; Gopikrishnan, P.; Plerou, V.; Amaral, L. A. N.

    2000-12-01

    The emerging subfield of econophysics explores the degree to which certain concepts and methods from statistical physics can be appropriately modified and adapted to provide new insights into questions that have been the focus of interest in the economics community. Here we give a brief overview of two examples of research topics that are receiving recent attention. A first topic is the characterization of the dynamics of stock price fluctuations. For example, we investigate the relation between trading activity - measured by the number of transactions N_Δt - and the price change G_Δt for a given stock, over a time interval [t, t + Δt]. We relate the time-dependent standard deviation of price fluctuations - volatility - to two microscopic quantities: the number of transactions N_Δt in Δt and the variance W²_Δt of the price changes for all transactions in Δt. Our work indicates that while the pronounced tails in the distribution of price fluctuations arise from W_Δt, the long-range correlations found in |G_Δt| are largely due to N_Δt. We also investigate the relation between price fluctuations and the number of shares Q_Δt traded in Δt. We find that the distribution of Q_Δt is consistent with a stable Lévy distribution, suggesting a Lévy scaling relationship between Q_Δt and N_Δt, which would provide one explanation for volume-volatility co-movement. A second topic concerns cross-correlations between the price fluctuations of different stocks. We adapt a conceptual framework, random matrix theory (RMT), first used in physics to interpret statistical properties of nuclear energy spectra. RMT makes predictions for the statistical properties of matrices that are universal, that is, do not depend on the interactions between the elements comprising the system. In physical systems, deviations from the predictions of RMT provide clues regarding the mechanisms controlling the dynamics of a given system, so this framework can be of potential value if
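    The random-matrix comparison described above amounts to checking the eigenvalues of an empirical cross-correlation matrix against the Marchenko-Pastur band expected for uncorrelated data. A stdlib-only sketch (synthetic i.i.d. series stand in for real stock returns, and power iteration for a full eigensolver):

```python
import math, random

def corr_matrix(series):
    """Pearson correlation matrix of equal-length return series."""
    def standardize(x):
        n = len(x); m = sum(x) / n
        s = math.sqrt(sum((v - m) ** 2 for v in x) / n)
        return [(v - m) / s for v in x]
    z = [standardize(x) for x in series]
    T, N = len(series[0]), len(series)
    return [[sum(z[i][t] * z[j][t] for t in range(T)) / T
             for j in range(N)] for i in range(N)]

def largest_eigenvalue(C, iters=200):
    """Power iteration for the top eigenvalue of a symmetric PSD matrix."""
    n = len(C)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = math.sqrt(sum(x * x for x in w))
        v = [x / lam for x in w]
    return lam

random.seed(2)
N, T = 10, 500                       # 10 series, 500 observations each
series = [[random.gauss(0, 1) for _ in range(T)] for _ in range(N)]
C = corr_matrix(series)
mp_upper = (1 + math.sqrt(N / T)) ** 2   # Marchenko-Pastur upper edge
lam_max = largest_eigenvalue(C)
print(lam_max <= mp_upper * 1.2)     # i.i.d. data: top eigenvalue near the MP band
```

For purely random data the largest eigenvalue stays near the Marchenko-Pastur upper edge (1 + sqrt(N/T))²; for real returns, eigenvalues well above this band signal genuine cross-correlations such as the collective market mode.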

  19. The Physical Models and Statistical Procedures Used in the RACER Monte Carlo Code

    International Nuclear Information System (INIS)

    Sutton, T.M.; Brown, F.B.; Bischoff, F.G.; MacMillan, D.B.; Ellis, C.L.; Ward, J.T.; Ballinger, C.T.; Kelly, D.J.; Schindler, L.

    1999-01-01

    This report describes the MCV (Monte Carlo - Vectorized) Monte Carlo neutron transport code [Brown, 1982, 1983; Brown and Mendelson, 1984a]. MCV is a module in the RACER system of codes that is used for Monte Carlo reactor physics analysis. The MCV module contains all of the neutron transport and statistical analysis functions of the system, while other modules perform various input-related functions such as geometry description, material assignment, output edit specification, etc. MCV is very closely related to the 05R neutron Monte Carlo code [Irving et al., 1965] developed at Oak Ridge National Laboratory. 05R evolved into the 05RR module of the STEMB system, which was the forerunner of the RACER system. Much of the overall logic and physics treatment of 05RR has been retained and, indeed, the original verification of MCV was achieved through comparison with STEMB results. MCV has been designed to be very computationally efficient [Brown, 1981; Brown and Martin, 1984b; Brown, 1986]. It was originally programmed to make use of vector-computing architectures such as those of the CDC Cyber-205 and Cray X-MP. MCV was the first full-scale production Monte Carlo code to effectively utilize vector-processing capabilities. Subsequently, MCV was modified to utilize both distributed-memory [Sutton and Brown, 1994] and shared-memory parallelism. The code has been compiled and run on platforms ranging from 32-bit UNIX workstations to clusters of 64-bit vector-parallel supercomputers. The computational efficiency of the code allows the analyst to perform calculations using many more neutron histories than is practical with most other Monte Carlo codes, thereby yielding results with smaller statistical uncertainties. MCV also utilizes variance reduction techniques such as survival biasing, splitting, and rouletting to permit additional reduction in uncertainties. While a general-purpose neutron Monte Carlo code, MCV is optimized for reactor physics calculations. It has the
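    The splitting and rouletting mentioned above are weight-based variance-reduction moves. A generic sketch (a hypothetical weight-window scheme for illustration, not the actual RACER/MCV implementation; all thresholds are made up):

```python
import random

def roulette_and_split(weight, w_low=0.25, w_high=2.0, w_survive=1.0):
    """Weight-window variance reduction for one Monte Carlo history.
    Low-weight particles play Russian roulette (killed with probability
    1 - weight/w_survive, survivors bumped to w_survive); high-weight
    particles split into roughly unit-weight copies. Returns the list
    of (possibly zero) particle weights that continue the history."""
    if weight < w_low:                       # roulette
        if random.random() < weight / w_survive:
            return [w_survive]
        return []
    if weight > w_high:                      # splitting
        n = int(weight)
        return [weight / n] * n
    return [weight]

random.seed(3)
# The scheme is unbiased: the expected total weight is preserved.
trials = 100000
w0 = 0.1
mean_out = sum(sum(roulette_and_split(w0)) for _ in range(trials)) / trials
print(abs(mean_out - w0) < 0.02)
```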

  20. Statistical mechanics of the self-gravitating gas: II. Local physical magnitudes and fractal structures

    International Nuclear Information System (INIS)

    Vega, H.J. de; Sanchez, N.

    2002-01-01

    We complete our study of the self-gravitating gas by computing the fluctuations around the saddle point solution for the three statistical ensembles (grand canonical, canonical and microcanonical). Although the saddle point is the same for the three ensembles, the fluctuations change from one ensemble to the other. The zeroes of the small-fluctuations determinant determine the position of the critical points for each ensemble. This yields the domains of validity of the mean field approach. Only the S-wave determinant exhibits critical points. Closed formulae for the S- and P-wave determinants of fluctuations are derived. The local properties of the self-gravitating gas in thermodynamic equilibrium are studied in detail. The pressure, energy density, particle density and speed of sound are computed and analyzed as functions of the position. The equation of state turns out to be locally p(r⃗) = T ρ_V(r⃗), as for the ideal gas. Starting from the partition function of the self-gravitating gas, we prove in this microscopic calculation that the hydrostatic description yielding locally the ideal gas equation of state is exact in the N = ∞ limit. The dilute nature of the thermodynamic limit (N ∼ L → ∞ with N/L fixed) together with the long-range nature of the gravitational forces plays a crucial role in obtaining such an ideal gas equation. The self-gravitating gas being inhomogeneous, we have PV/[NT] = f(η) ≤ 1 for any finite volume V. The inhomogeneous particle distribution in the ground state suggests a fractal distribution with Hausdorff dimension D, slowly decreasing with increasing density, 1 < D < 3. The average distance between particles is computed in Monte Carlo simulations and analytically in the mean field approach. A dramatic drop at the phase transition is exhibited, clearly illustrating the properties of the collapse

  1. Equilibrium Dialysis

    African Journals Online (AJOL)

    context of antimicrobial therapy in malnutrition. Dialysis has in the past presented technical problems, being complicated and time-consuming. A new dialysis system based on the equilibrium technique has now become available, and it is the principles and practical application of this apparatus (Kontron Diapack; Kontron.

  2. Strategic Equilibrium

    NARCIS (Netherlands)

    van Damme, E.E.C.

    2000-01-01

    An outcome in a noncooperative game is said to be self-enforcing, or a strategic equilibrium, if, whenever it is recommended to the players, no player has an incentive to deviate from it.This paper gives an overview of the concepts that have been proposed as formalizations of this requirement and of

  3. Maximin equilibrium

    NARCIS (Netherlands)

    Ismail, M.S.

    2014-01-01

    We introduce a new concept which extends von Neumann and Morgenstern's maximin strategy solution by incorporating `individual rationality' of the players. Maximin equilibrium, extending Nash's value approach, is based on the evaluation of the strategic uncertainty of the whole game. We show that

  4. PREFACE: 3rd International Workshop on Statistical Physics and Mathematics for Complex Systems (SPMCS 2012)

    Science.gov (United States)

    Tayurskii, Dmitrii; Abe, Sumiyoshi; Alexandre Wang, Q.

    2012-11-01

    The 3rd International Workshop on Statistical Physics and Mathematics for Complex Systems (SPMCS2012) was held between 25-30 August at Kazan (Volga Region) Federal University, Kazan, Russian Federation. This workshop was jointly organized by Kazan Federal University and Institut Supérieur des Matériaux et Mécaniques Avancées (ISMANS), France. The series of SPMCS workshops was created in 2008 with the aim to be an interdisciplinary incubator for the worldwide exchange of innovative ideas and information about the latest results. The first workshop was held at ISMANS, Le Mans (France) in 2008, and the second at Huazhong Normal University, Wuhan (China) in 2010. At SPMCS2012, we wished to bring together a broad community of researchers from the different branches of the rapidly developing complexity science to discuss the fundamental theoretical challenges (geometry/topology, number theory, statistical physics, dynamical systems, etc) as well as experimental and applied aspects of many practical problems (condensed matter, disordered systems, financial markets, chemistry, biology, geoscience, etc). The program of SPMCS2012 was prepared based on three categories: (i) physical and mathematical studies (quantum mechanics, generalized nonequilibrium thermodynamics, nonlinear dynamics, condensed matter physics, nanoscience); (ii) natural complex systems (physical, geophysical, chemical and biological); (iii) social, economical, political agent systems and man-made complex systems. The conference attracted 64 participants from 10 countries. There were 10 invited lectures, 12 invited talks and 28 regular oral talks in the morning and afternoon sessions. The book of Abstracts is available from the conference website (http://www.ksu.ru/conf/spmcs2012/?id=3). A round table was also held, the topic of which was 'Recent and Anticipated Future Progress in Science of Complexity', discussing a variety of questions and opinions important for the understanding of the concept of

  5. Statistical Physics, Optimization, Inference, and Message-Passing Algorithms : Lecture Notes of the Les Houches School of Physics : Special Issue, October 2013

    CERN Document Server

    Ricci-Tersenghi, Federico; Zdeborova, Lenka; Zecchina, Riccardo; Tramel, Eric W; Cugliandolo, Leticia F

    2015-01-01

    This book contains a collection of the presentations that were given in October 2013 at the Les Houches Autumn School on statistical physics, optimization, inference, and message-passing algorithms. In the last decade, there has been increasing convergence of interest and methods between theoretical physics and fields as diverse as probability, machine learning, optimization, and inference problems. In particular, much theoretical and applied work in statistical physics and computer science has relied on the use of message-passing algorithms and their connection to the statistical physics of glasses and spin glasses. For example, both the replica and cavity methods have led to recent advances in compressed sensing, sparse estimation, and random constraint satisfaction, to name a few. This book’s detailed pedagogical lectures on statistical inference, computational complexity, the replica and cavity methods, and belief propagation are aimed particularly at PhD students, post-docs, and young researchers desir...

  6. Statistical physics as an approximate method of many-body quantum mechanics in the representation of occupation numbers

    International Nuclear Information System (INIS)

    Kushnirenko, A.N.

    1989-01-01

    An attempt was made to substantiate statistical physics from the viewpoint of many-body quantum mechanics in the representation of occupation numbers. This approach made it possible to develop a variational method for the solution of stationary and nonstationary nonequilibrium problems

  7. Statistical Physics on the Eve of the 21st Century: in Honour of J B McGuire on the Occasion of His 65th Birthday

    Science.gov (United States)

    Batchelor, Murray T.; Wille, Luc T.

    The Table of Contents for the book is as follows: * Preface * Modelling the Immune System - An Example of the Simulation of Complex Biological Systems * Brief Overview of Quantum Computation * Quantal Information in Statistical Physics * Modeling Economic Randomness: Statistical Mechanics of Market Phenomena * Essentially Singular Solutions of Feigenbaum-Type Functional Equations * Spatiotemporal Chaotic Dynamics in Coupled Map Lattices * Approach to Equilibrium of Chaotic Systems * From Level to Level in Brain and Behavior * Linear and Entropic Transformations of the Hydrophobic Free Energy Sequence Help Characterize a Novel Brain Polyprotein: CART's Protein * Dynamical Systems Response to Pulsed High-Frequency Fields * Bose-Einstein Condensates in the Light of Nonlinear Physics * Markov Superposition Expansion for the Entropy and Correlation Functions in Two and Three Dimensions * Calculation of Wave Center Deflection and Multifractal Analysis of Directed Waves Through the Study of su(1,1) Ferromagnets * Spectral Properties and Phases in Hierarchical Master Equations * Universality of the Distribution Functions of Random Matrix Theory * The Universal Chiral Partition Function for Exclusion Statistics * Continuous Space-Time Symmetries in a Lattice Field Theory * Quelques Cas Limites du Problème à N Corps Unidimensionnel * Integrable Models of Correlated Electrons * On the Riemann Surface of the Three-State Chiral Potts Model * Two Exactly Soluble Lattice Models in Three Dimensions * Competition of Ferromagnetic and Antiferromagnetic Order in the Spin-1/2 XXZ Chain at Finite Temperature * Extended Vertex Operator Algebras and Monomial Bases * Parity and Charge Conjugation Symmetries and S Matrix of the XXZ Chain * An Exactly Solvable Constrained XXZ Chain * Integrable Mixed Vertex Models From the Braid-Monoid Algebra * From Yang-Baxter Equations to Dynamical Zeta Functions for Birational Transformations * Hexagonal Lattice Directed Site Animals * Direction in

  8. Physics colloquium: Single-electron counting in quantum metrology and in statistical mechanics

    CERN Multimedia

    Geneva University

    2011-01-01

    GENEVA UNIVERSITY, Ecole de physique, Département de physique nucléaire et corpusculaire, 24, quai Ernest-Ansermet, 1211 Genève 4. Tél.: (022) 379 62 73, Fax: (022) 379 69 92. Monday 17 October 2011, 17:00 - Ecole de Physique, Auditoire Stueckelberg. PHYSICS COLLOQUIUM « Single-electron counting in quantum metrology and in statistical mechanics », Prof. Jukka Pekola, Low Temperature Laboratory, Aalto University, Helsinki, Finland. First I discuss the basics of single-electron tunneling and its potential applications in metrology. My main focus is on developing an accurate source of single-electron current for the realization of the unit ampere. I discuss the principle and the present status of the so-called single-electron turnstile. Investigation of errors in transporting electrons one by one has revealed a wealth of observations on fundamental phenomena in mesoscopic superconductivity, including individual Andreev...

  9. Asymptotics of elliptic and parabolic pdes and their applications in statistical physics, computational neuroscience, and biophysics

    CERN Document Server

    Holcman, David

    2018-01-01

    This is a monograph on the emerging branch of mathematical biophysics combining asymptotic analysis with numerical and stochastic methods to analyze partial differential equations arising in biological and physical sciences. In more detail, the book presents the analytic methods and tools for approximating solutions of mixed boundary value problems, with particular emphasis on the narrow escape problem. Informed throughout by real-world applications, the book includes topics such as the Fokker-Planck equation, boundary layer analysis, WKB approximation, applications of spectral theory, as well as recent results in narrow escape theory. Numerical and stochastic aspects, including mean first passage time and extreme statistics, are discussed in detail and relevant applications are presented in parallel with the theory. Including background on the classical asymptotic theory of differential equations, this book is written for scientists of various backgrounds interested in deriving solutions to real-world proble...

  10. Solving Large-Scale Computational Problems Using Insights from Statistical Physics

    Energy Technology Data Exchange (ETDEWEB)

    Selman, Bart [Cornell University

    2012-02-29

    Many challenging problems in computer science and related fields can be formulated as constraint satisfaction problems. Such problems consist of a set of discrete variables and a set of constraints between those variables, and represent a general class of so-called NP-complete problems. The goal is to find a value assignment to the variables that satisfies all constraints, generally requiring a search through an exponentially large space of variable-value assignments. Models for disordered systems, as studied in statistical physics, can provide important new insights into the nature of constraint satisfaction problems. Recently, work in this area has resulted in the discovery of a new method for solving such problems, called the survey propagation (SP) method. With SP, we can solve problems with millions of variables and constraints, an improvement of two orders of magnitude over previous methods.
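    Survey propagation itself is beyond a short sketch, but the stochastic local search it is often combined with can be illustrated with classic WalkSAT on a toy satisfiable instance (this is plain WalkSAT, not SP):

```python
import random

def walksat(clauses, n_vars, p=0.5, max_flips=10000):
    """WalkSAT local search: repeatedly pick an unsatisfied clause and flip
    one of its variables (randomly with probability p, else greedily, by
    choosing the flip that leaves the fewest clauses unsatisfied).
    clauses: lists of nonzero ints; literal v means variable |v|, with the
    sign giving the required polarity. Returns an assignment dict or None."""
    assign = {v: random.choice([True, False]) for v in range(1, n_vars + 1)}
    def sat(cl):
        return any((lit > 0) == assign[abs(lit)] for lit in cl)
    for _ in range(max_flips):
        unsat = [cl for cl in clauses if not sat(cl)]
        if not unsat:
            return assign
        cl = random.choice(unsat)
        if random.random() < p:
            var = abs(random.choice(cl))
        else:
            def broken(v):
                assign[v] = not assign[v]
                b = sum(1 for c in clauses if not sat(c))
                assign[v] = not assign[v]
                return b
            var = min((abs(l) for l in cl), key=broken)
        assign[var] = not assign[var]
    return None

random.seed(4)
# (x1 or x2) and (not x1 or x3) and (not x2 or not x3) -- satisfiable
clauses = [[1, 2], [-1, 3], [-2, -3]]
model = walksat(clauses, 3)
print(model is not None)
```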

  11. Sweatshop equilibrium

    OpenAIRE

    Chau, Nancy H.

    2009-01-01

    This paper presents a capability-augmented model of on the job search, in which sweatshop conditions stifle the capability of the working poor to search for a job while on the job. The augmented setting unveils a sweatshop equilibrium in an otherwise archetypal Burdett-Mortensen economy, and reconciles a number of oft noted yet perplexing features of sweatshop economies. We demonstrate existence of multiple rational expectation equilibria, graduation pathways out of sweatshops in complete abs...

  12. An ensemble Kalman filter for statistical estimation of physics constrained nonlinear regression models

    International Nuclear Information System (INIS)

    Harlim, John; Mahdi, Adam; Majda, Andrew J.

    2014-01-01

    A central issue in contemporary science is the development of nonlinear data-driven statistical-dynamical models for time series of noisy partial observations from nature or a complex model. It has been established recently that ad-hoc quadratic multi-level regression models can have finite-time blow-up of statistical solutions and/or pathological behavior of their invariant measure. Recently, a new class of physics-constrained nonlinear regression models was developed to ameliorate this pathological behavior. Here a new finite ensemble Kalman filtering algorithm is developed for estimating the state, the linear and nonlinear model coefficients, and the model and observation noise covariances from available partial noisy observations of the state. Several stringent tests and applications of the method are developed here. In the most complex application, the perfect model has 57 degrees of freedom involving a zonal (east-west) jet, two topographic Rossby waves, and 54 nonlinearly interacting Rossby waves; the perfect model has significant non-Gaussian statistics in the zonal jet with blocked and unblocked regimes and a non-Gaussian skewed distribution due to interaction with the other 56 modes. We only observe the zonal jet contaminated by noise and apply the ensemble filter algorithm for estimation. Numerically, we find that a three-dimensional nonlinear stochastic model with one level of memory mimics the statistical effect of the other 56 modes on the zonal jet in an accurate fashion, including the skewed non-Gaussian distribution and autocorrelation decay. On the other hand, a similar stochastic model with zero memory levels fails to capture the crucial non-Gaussian behavior of the zonal jet from the perfect 57-mode model
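    The ensemble Kalman filtering step at the core of the algorithm above can be illustrated for a directly observed scalar state (the stochastic perturbed-observation EnKF form; the 57-mode model is replaced by a trivial constant truth, and all noise levels here are made up):

```python
import random, statistics

def enkf_update(ensemble, obs, obs_noise_std):
    """Stochastic EnKF analysis step for a directly observed scalar state:
    each member is nudged toward its own perturbed copy of the observation
    by the Kalman gain K = P / (P + R)."""
    P = statistics.variance(ensemble)            # forecast error variance
    R = obs_noise_std ** 2
    K = P / (P + R)
    return [x + K * (obs + random.gauss(0, obs_noise_std) - x)
            for x in ensemble]

random.seed(5)
truth = 2.0
ensemble = [random.gauss(0.0, 1.0) for _ in range(200)]   # prior far from truth
for _ in range(10):                                       # repeated observations
    obs = truth + random.gauss(0, 0.3)
    ensemble = enkf_update(ensemble, obs, 0.3)
mean = sum(ensemble) / len(ensemble)
print(abs(mean - truth) < 0.5)    # ensemble mean pulled toward the truth
```

Repeated assimilation both shifts the ensemble mean toward the observed value and shrinks the ensemble spread, which is why the gain K decreases over successive updates.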

  13. Theoretical approaches to the steady-state statistical physics of interacting dissipative units

    Science.gov (United States)

    Bertin, Eric

    2017-02-01

    The aim of this review is to provide a concise overview of some of the generic approaches that have been developed to deal with the statistical description of large systems of interacting dissipative ‘units’. The latter notion includes, e.g. inelastic grains, active or self-propelled particles, bubbles in a foam, low-dimensional dynamical systems like driven oscillators, or even spatially extended modes like Fourier modes of the velocity field in a fluid. We first review methods based on the statistical properties of a single unit, starting with elementary mean-field approximations, either static or dynamic, that describe a unit embedded in a ‘self-consistent’ environment. We then discuss how this basic mean-field approach can be extended to account for spatial dependences, in the form of space-dependent mean-field Fokker-Planck equations, for example. We also briefly review the use of kinetic theory in the framework of the Boltzmann equation, which is an appropriate description for dilute systems. We then turn to descriptions in terms of the full N-body distribution, starting from exact solutions of one-dimensional models, using a matrix-product ansatz method when correlations are present. Since exactly solvable models are scarce, we also present some approximation methods which can be used to determine the N-body distribution in a large system of dissipative units. These methods include the Edwards approach for dense granular matter and the approximate treatment of multiparticle Langevin equations with colored noise, which models systems of self-propelled particles. Throughout this review, emphasis is put on methodological aspects of the statistical modeling and on formal similarities between different physical problems, rather than on the specific behavior of a given system.
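    The multiparticle Langevin equations with colored noise mentioned above reduce, in their simplest form, to a single overdamped degree of freedom driven by an Ornstein-Uhlenbeck process. A minimal Euler-Maruyama sketch (all parameter values are illustrative):

```python
import math, random

def simulate(steps=20000, dt=1e-3, tau=0.5, D=1.0, k=1.0):
    """Overdamped particle in a harmonic trap driven by OU ('colored')
    noise eta with correlation time tau:
        dx/dt = -k x + eta
        d(eta) = -(eta/tau) dt + (sqrt(2 D) / tau) dW
    Returns the trajectory of x."""
    x, eta = 0.0, 0.0
    xs = []
    s = math.sqrt(2 * D / tau**2 * dt)   # noise increment scale per step
    for _ in range(steps):
        x += (-k * x + eta) * dt
        eta += -eta / tau * dt + s * random.gauss(0, 1)
        xs.append(x)
    return xs

random.seed(6)
xs = simulate()
var = sum(v * v for v in xs) / len(xs)
print(round(var, 2))
```

For a linear system like this the stationary variance of x is known exactly, D/(k(1+kτ)) ≈ 0.67 for these parameters, though the short run here only gives a rough estimate.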

  14. Developing Statistical Physics Course Handout on Distribution Function Materials Based on Science, Technology, Engineering, and Mathematics

    Science.gov (United States)

    Riandry, M. A.; Ismet, I.; Akhsan, H.

    2017-09-01

This study aims to produce a valid and practical statistical physics course handout on distribution function materials based on STEM. The Rowntree development model was used to produce the handout; it consists of three stages: planning, development, and evaluation. In this study, the evaluation stage used Tessmer formative evaluation, which consists of five stages: self-evaluation, expert review, one-to-one evaluation, small group evaluation, and field test. However, the handout was tested only on the validity and practicality aspects, so the field test stage was not implemented. Data were collected through walkthroughs and questionnaires. The subjects of this study were students of the 6th and 8th semesters of the 2016/2017 academic year in the Physics Education Study Program of Sriwijaya University. The average result of the expert review was 87.31% (very valid category), of the one-to-one evaluation 89.42%, and of the small group evaluation 85.92%. Across the one-to-one and small group evaluation stages, the average student response to the handout was 87.67% (very practical category). Based on these results, it can be concluded that the handout is valid and practical.

  15. Equilibrium models and variational inequalities

    CERN Document Server

    Konnov, Igor

    2007-01-01

The concept of equilibrium plays a central role in various applied sciences, such as physics (especially, mechanics), economics, engineering, transportation, sociology, chemistry, biology and other fields. If one can formulate the equilibrium problem in the form of a mathematical model, solutions of the corresponding problem can be used for forecasting the future behavior of very complex systems and, also, for correcting the current state of the system under control. This book presents a unifying look on different equilibrium concepts in economics, including several models from related sciences.- Presents a unifying look on different equilibrium concepts and also the present state of investigations in this field- Describes static and dynamic input-output models, Walras, Cassel-Wald, spatial price, auction market, oligopolistic equilibrium models, transportation and migration equilibrium models- Covers the basics of theory and solution methods both for the complementarity and variational inequality probl...

  16. Deterpenation of eucalyptus essential oil by liquid + liquid extraction: Phase equilibrium and physical properties for model systems at T = 298.2 K

    International Nuclear Information System (INIS)

    Gonçalves, Daniel; Koshima, Cristina Chiyoda; Nakamoto, Karina Thiemi; Umeda, Thayla Karla; Aracava, Keila Kazue; Gonçalves, Cintia Bernardo; Rodrigues, Christianne Elisabete da Costa

    2014-01-01

    Highlights: • Fractionation of essential oil compounds. • Liquid + liquid equilibria of limonene, citronellal, ethanol and water were studied. • Distribution coefficients of limonene and citronellal were evaluated. • Densities and viscosities of the phases were experimentally determined. • Solvent selectivities and physical properties were dependent on citronellal and water mass fractions. -- Abstract: As the principal source in Brazil of eucalyptus essential oil extracts, Eucalyptus citriodora contains citronellal, an oxygenated compound responsible for the flavour characteristics. Deterpenation processes, consisting of the removal of terpenic hydrocarbons with the subsequent concentration of the oxygenated compounds, can be used to improve the aromatic characteristics of this essential oil. The purpose of this work was to perform a study of the technical feasibility of using a liquid + liquid extraction process to deterpenate eucalyptus essential oil. Model systems with various mixtures of limonene and citronellal (representing eucalyptus essential oil) as well as solvent (ethanol with various water mass fractions) were used to obtain liquid + liquid equilibrium data. The raffinate and extract phases were also analyzed to characterize the physical properties (density and viscosity). The equilibrium data were used to adjust the NRTL and UNIQUAC parameters. Two empirical models, the simple mixing rule and the Grunberg–Nissan model, were evaluated for use in the descriptions of the densities and viscosities, respectively, of the samples. Increasing the water content in the solvent resulted in decreases in the limonene and citronellal distribution coefficients, with consequential increases in the solvent selectivity values. Increasing values of the densities and viscosities, especially for the solvent-rich phases, were associated with systems using high amounts of hydrated ethanolic solvents
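The Grunberg-Nissan model named in the abstract predicts the viscosity of a binary liquid mixture from ln(eta_mix) = x1 ln(eta1) + x2 ln(eta2) + x1 x2 G12. A minimal sketch, with purely illustrative component viscosities and interaction parameter (not the measured values for the limonene + citronellal + ethanol systems):

```python
import math

# Grunberg-Nissan rule for a binary liquid mixture:
#   ln(eta_mix) = x1*ln(eta1) + x2*ln(eta2) + x1*x2*G12
# Component viscosities (mPa s) and the fitted interaction parameter G12
# below are illustrative values, not data from the paper.
eta1, eta2 = 0.9, 1.2
x1 = 0.4
x2 = 1.0 - x1
G12 = 0.5            # hypothetical fitted binary interaction parameter

ln_eta = x1 * math.log(eta1) + x2 * math.log(eta2) + x1 * x2 * G12
eta_mix = math.exp(ln_eta)
print(round(eta_mix, 3))   # → 1.206
```

A positive G12 raises the mixture viscosity above the ideal logarithmic mixing value, which is how the single fitted parameter absorbs non-ideal interactions between the two components.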

  17. Equilibrium Trust

    OpenAIRE

    Luca Anderlini; Daniele Terlizzese

    2009-01-01

We build a simple model of trust as an equilibrium phenomenon, departing from standard "selfish" preferences in a minimal way. Agents who are on the receiving end of an offer to transact can choose whether to cheat and take away the entire surplus, taking into account a "cost of cheating." The latter has an idiosyncratic component (an agent's type), and a socially determined one. The smaller the mass of agents who cheat, the larger the cost of cheating suffered by those who cheat. Depending o...

  18. Near-equilibrium dumb-bell-shaped figures for cohesionless small bodies

    Science.gov (United States)

    Descamps, Pascal

    2016-02-01

In a previous paper (Descamps, P. [2015]. Icarus 245, 64-79), we developed a specific method aimed at retrieving the main physical characteristics (shape, density, surface scattering properties) of highly elongated bodies from their rotational lightcurves through the use of dumb-bell-shaped equilibrium figures. The present work is a test of this method. For that purpose we introduce near-equilibrium dumb-bell-shaped figures, which are base dumb-bell equilibrium shapes modulated by lognormal statistics. Such synthetic irregular models are used to generate lightcurves to which our method is successfully applied. Shape statistical parameters of such near-equilibrium dumb-bell-shaped objects are in good agreement with those calculated, for example, for the Asteroid (216) Kleopatra from its dog-bone radar model. This may suggest that such bilobed and elongated asteroids can be approximated by equilibrium figures perturbed by the interplay with a substantial internal friction modeled by a Gaussian random sphere.
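The "base shape modulated by lognormal statistics" idea can be sketched in a few lines: a smooth two-lobed radius profile multiplied by the exponential of smoothed Gaussian noise, so the multiplicative roughness factor is lognormal. The profile, noise amplitude, and smoothing below are all hypothetical, not the paper's dumb-bell equilibrium figures.

```python
import math
import random
random.seed(3)

# r(theta) = r_base(theta) * exp(sigma * g(theta)), with g a smoothed Gaussian
# noise sequence, so the modulation factor is lognormal. All parameters are
# illustrative only.
sigma = 0.1
M = 360
g = [random.gauss(0, 1) for _ in range(M)]
g = [(g[i - 1] + g[i] + g[(i + 1) % M]) / 3 for i in range(M)]   # crude periodic smoothing

def r_base(theta):
    """Elongated, two-lobed base profile (illustrative stand-in for a dumb-bell)."""
    return 1.0 + 0.4 * math.cos(2 * theta)

r = [r_base(2 * math.pi * i / M) * math.exp(sigma * g[i]) for i in range(M)]
print(round(min(r), 2), round(max(r), 2))   # waist and lobe radii of the perturbed figure
```

A lightcurve simulator would then integrate the projected area of this perturbed figure over rotation phase; the sketch only shows how the near-equilibrium shape itself is generated.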

  19. Modeling the dynamic equilibrium between oligomers of (AlOCH3)n in methylaluminoxane (MAO). A theoretical study based on a combined quantum mechanical and statistical mechanical approach.

    Science.gov (United States)

    Zurek, E; Woo, T K; Firman, T K; Ziegler, T

    2001-01-15

    Density functional theory (DFT) has been used to calculate the energies of 36 different methylaluminoxane (MAO) cage structures with the general formula (MeAlO)n, where n ranges from 4 to 16. A least-squares fit has been used to devise a formula which predicts the total energies of the MAO with different n's giving an rms deviation of 4.70 kcal/mol. These energies in conjunction with frequency calculations based on molecular mechanics have been used to estimate the finite temperature enthalpies, entropies, and free energies for these MAO structures. Furthermore, formulas have been devised which predict finite temperature enthalpies and entropies for MAO structures of any n for a temperature range of 198.15-598.15 K. Using these formulas, the free energies at different temperatures have been predicted for MAO structures where n ranges from 17 to 30. The free energy values were then used to predict the percentage of each n found at a given temperature. Our calculations give an average n value of 18.41, 17.23, 16.89, and 15.72 at 198.15, 298.15, 398.15, and 598.15 K, respectively. Topological arguments have also been used to show that the MAO cage structure contains a limited amount of square faces as compared to octagonal and hexagonal ones. It is also suggested that the limited number of square faces with their strained Al-O bonds explain the high molar Al:catalyst ratio required for activation. Moreover, in this study we outline a general methodology which may be used to calculate the percent abundance of an equilibrium mixture of oligomers with the general formula (X)n.
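Given a free energy G_n for each oligomer size n, equilibrium percent abundances follow Boltzmann weights proportional to exp(-G_n/RT), and the average n is the weighted mean. A sketch with a toy free-energy curve (the parabola below is illustrative, not the DFT-fitted formula of the paper):

```python
import math

R = 8.314e-3     # gas constant, kJ/(mol K)
T = 298.15       # K

# Hypothetical relative free energies G_n (kJ/mol) for (MeAlO)_n cages:
# a toy parabola with its minimum at n = 17, not the paper's DFT values.
G = {n: 0.05 * (n - 17) ** 2 for n in range(4, 31)}

w = {n: math.exp(-g / (R * T)) for n, g in G.items()}   # Boltzmann weights
Z = sum(w.values())
abundance = {n: w[n] / Z for n in w}                    # fractional abundance of each n
n_avg = sum(n * p for n, p in abundance.items())
print(round(n_avg, 2))   # → 17.0 for this symmetric toy curve
```

With temperature-dependent G_n(T), repeating this weighting at several temperatures reproduces the kind of shift in average n (larger cages favored at lower T) reported in the abstract.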

  20. Applications of modern statistical methods to analysis of data in physical science

    Science.gov (United States)

    Wicker, James Eric

Modern methods of statistical and computational analysis offer solutions to dilemmas confronting researchers in physical science. Although the ideas behind modern statistical and computational analysis methods were originally introduced in the 1970's, most scientists still rely on methods written during the early era of computing. These researchers, who analyze increasingly voluminous and multivariate data sets, need modern analysis methods to extract the best results from their studies. The first section of this work showcases applications of modern linear regression. Since the 1960's, many researchers in spectroscopy have used classical stepwise regression techniques to derive molecular constants. However, problems with thresholds of entry and exit for model variables plague this analysis method. Other criticisms of this kind of stepwise procedure include its inefficient searching method, the order in which variables enter or leave the model, and problems with overfitting data. We implement an information scoring technique that overcomes the assumptions inherent in the stepwise regression process to calculate molecular model parameters. We believe that this kind of information-based model evaluation can be applied to more general analysis situations in physical science. The second section proposes new methods of multivariate cluster analysis. The K-means algorithm and the EM algorithm, introduced in the 1960's and 1970's respectively, formed the basis of multivariate cluster analysis methodology for many years. However, several shortcomings of these methods include strong dependence on initial seed values and inaccurate results when the data seriously depart from hypersphericity. We propose new cluster analysis methods based on genetic algorithms that overcome the strong dependence on initial seed values. In addition, we propose a generalization of the Genetic K-means algorithm which can accurately identify clusters with complex hyperellipsoidal covariance
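For reference, the classical Lloyd/K-means iteration criticized here for its seed dependence can be sketched in a few lines; the toy data and parameters are illustrative:

```python
import random
random.seed(1)

# Two well-separated 2-D clusters (synthetic toy data)
pts = [(random.gauss(0, 0.5), random.gauss(0, 0.5)) for _ in range(50)] + \
      [(random.gauss(5, 0.5), random.gauss(5, 0.5)) for _ in range(50)]

def kmeans(points, k, iters=20):
    centers = random.sample(points, k)   # random seeding: the classic weak point
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest center (squared Euclidean distance)
            j = min(range(k),
                    key=lambda i: (p[0] - centers[i][0]) ** 2 + (p[1] - centers[i][1]) ** 2)
            groups[j].append(p)
        # recompute each center as the mean of its group (keep old center if empty)
        centers = [(sum(x for x, _ in g) / len(g), sum(y for _, y in g) / len(g))
                   if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers

centers = sorted(kmeans(pts, 2))
print([(round(x, 1), round(y, 1)) for x, y in centers])
```

Genetic-algorithm variants of the kind proposed in this work attack exactly the weak point marked in the comment: instead of one random choice of initial centers, a population of candidate seedings is evolved toward low within-cluster variance.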

  1. What Can Reinforcement Learning Teach Us About Non-Equilibrium Quantum Dynamics

    Science.gov (United States)

    Bukov, Marin; Day, Alexandre; Sels, Dries; Weinberg, Phillip; Polkovnikov, Anatoli; Mehta, Pankaj

    Equilibrium thermodynamics and statistical physics are the building blocks of modern science and technology. Yet, our understanding of thermodynamic processes away from equilibrium is largely missing. In this talk, I will reveal the potential of what artificial intelligence can teach us about the complex behaviour of non-equilibrium systems. Specifically, I will discuss the problem of finding optimal drive protocols to prepare a desired target state in quantum mechanical systems by applying ideas from Reinforcement Learning [one can think of Reinforcement Learning as the study of how an agent (e.g. a robot) can learn and perfect a given policy through interactions with an environment.]. The driving protocols learnt by our agent suggest that the non-equilibrium world features possibilities easily defying intuition based on equilibrium physics.

  2. A statistical methodology for quantification of uncertainty in best estimate code physical models

    International Nuclear Information System (INIS)

    Vinai, Paolo; Macian-Juan, Rafael; Chawla, Rakesh

    2007-01-01

    A novel uncertainty assessment methodology, based on a statistical non-parametric approach, is presented in this paper. It achieves quantification of code physical model uncertainty by making use of model performance information obtained from studies of appropriate separate-effect tests. Uncertainties are quantified in the form of estimated probability density functions (pdf's), calculated with a newly developed non-parametric estimator. The new estimator objectively predicts the probability distribution of the model's 'error' (its uncertainty) from databases reflecting the model's accuracy on the basis of available experiments. The methodology is completed by applying a novel multi-dimensional clustering technique based on the comparison of model error samples with the Kruskall-Wallis test. This takes into account the fact that a model's uncertainty depends on system conditions, since a best estimate code can give predictions for which the accuracy is affected by the regions of the physical space in which the experiments occur. The final result is an objective, rigorous and accurate manner of assigning uncertainty to coded models, i.e. the input information needed by code uncertainty propagation methodologies used for assessing the accuracy of best estimate codes in nuclear systems analysis. The new methodology has been applied to the quantification of the uncertainty in the RETRAN-3D void model and then used in the analysis of an independent separate-effect experiment. This has clearly demonstrated the basic feasibility of the approach, as well as its advantages in yielding narrower uncertainty bands in quantifying the code's accuracy for void fraction predictions
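One standard non-parametric way to estimate a pdf of model error from accuracy samples (not necessarily the estimator developed in the paper) is a Gaussian kernel density estimate. A sketch with synthetic error samples and Silverman's rule-of-thumb bandwidth:

```python
import math
import random
random.seed(2)

# Synthetic "model error" samples: code prediction minus experiment
# (illustrative numbers, not RETRAN-3D void-fraction data)
errors = [random.gauss(0.02, 0.05) for _ in range(300)]

def kde(samples, x, h):
    """Gaussian kernel density estimate at x with bandwidth h."""
    norm = len(samples) * h * math.sqrt(2 * math.pi)
    return sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples) / norm

n = len(errors)
mu = sum(errors) / n
sd = math.sqrt(sum((e - mu) ** 2 for e in errors) / (n - 1))
h = 1.06 * sd * n ** (-0.2)        # Silverman's rule-of-thumb bandwidth

dens_peak = kde(errors, mu, h)     # estimated error density at the sample mean
print(round(dens_peak, 1))
```

The resulting pdf can then be propagated through the code as an input uncertainty; the clustering step described in the abstract would first split the error samples by system conditions so that each regime gets its own estimated pdf.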

  3. Statistical Thermodynamics of Disperse Systems

    DEFF Research Database (Denmark)

    Shapiro, Alexander

    1996-01-01

Principles of statistical physics are applied for the description of thermodynamic equilibrium in disperse systems. The cells of disperse systems are shown to possess a number of non-standard thermodynamic parameters. A random distribution of these parameters in the system is determined. On the basis of this distribution, it is established that the disperse system has an additional degree of freedom called the macro-entropy. A large set of bounded ideal disperse systems allows exact evaluation of thermodynamic characteristics. The theory developed is applied to the description of equilibrium...

  4. Comparative Statistical Mechanics of Muscle and Non-Muscle Contractile Systems: Stationary States of Near-Equilibrium Systems in A Linear Regime

    Directory of Open Access Journals (Sweden)

    Yves Lecarpentier

    2017-10-01

A. Huxley's equations were used to determine the mechanical properties of muscle myosin II (MII) at the molecular level, as well as the probability of the occurrence of the different stages in the actin–myosin cycle. It was then possible to use the formalism of statistical mechanics with the grand canonical ensemble to calculate numerous thermodynamic parameters such as entropy, internal energy, affinity, thermodynamic flow, thermodynamic force, and entropy production rate. This allows us to compare the thermodynamic parameters of a non-muscle contractile system, such as the normal human placenta, with those of different striated skeletal muscles (soleus and extensor digitalis longus), as well as the heart muscle and smooth muscles (trachea and uterus), in the rat. In the human placental tissues, it was observed that the kinetics of the actin–myosin crossbridges were considerably slow compared with those of smooth and striated muscular systems. The entropy production rate was also particularly low in the human placental tissues, as compared with that observed in smooth and striated muscular systems. This is partly due to the low thermodynamic flow found in the human placental tissues. However, the unitary force of non-muscle myosin (NMII) generated by each crossbridge cycle in the myofibroblasts of the human placental tissues was similar in magnitude to that of MII in the myocytes of both smooth and striated muscle cells. Statistical mechanics represents a powerful tool for studying the thermodynamics of all contractile muscle and non-muscle systems.

  5. Tsunami vs Infragravity Surge: Statistics and Physical Character of Extreme Runup

    Science.gov (United States)

    Lynett, P. J.; Montoya, L. H.

    2017-12-01

    Motivated by recent observations of energetic and impulsive infragravity (IG) flooding events - also known as sneaker waves - we will present recent work on the relative probabilities and dynamics of extreme flooding events from tsunamis and long period wind wave events. The discussion will be founded on videos and records of coastal flooding by both recent tsunamis and IG, such as those in the Philippines during Typhoon Haiyan. From these observations, it is evident that IG surges may approach the coast as breaking bores with periods of minutes; a very tsunami-like character. Numerical simulations will be used to estimate flow elevations and speeds from potential IG surges, and these will be compared with similar values from tsunamis, over a range of different beach profiles. We will examine the relative rareness of each type of flooding event, which for large values of IG runup is a particularly challenging topic. For example, for a given runup elevation or flooding speed, the related tsunami return period may be longer than that associated with IG, implying that deposit information associated with such elevations or speeds are more likely to be caused by IG. Our purpose is to provide a statistical and physical discriminant between tsunami and IG, such that in areas exposed to both, a proper interpretation of overland transport, deposition, and damage is possible.

  6. Statistical and physical content of low-energy photons in nuclear medicine imaging

    International Nuclear Information System (INIS)

    Gagnon, D.; Pouliot, N.; Laperriere, L.; Harel, F.; Gregoire, J.; Arsenault, A.

    1990-01-01

Limits in the energy resolution of present gamma camera technology prevent a total rejection of Compton events: inclusion of bad photons in the image is inescapable. Various methods acquiring data over a large portion of the spectrum have already been described. This paper investigates the usefulness of low-energy photons using statistical and physical models. Holospectral Imaging, for instance, exploits correlation between energy frames to build an information-related transformation optimizing the primary photon image. One can also use computer simulation to show that a portion of low-energy photons is detected at the same location (pixel) as pure primary photons. Such events include: photons undergoing a scatter interaction in the crystal; photons undergoing a small-angle backscatter or forward-scatter interaction in the medium; and photons backscattered by the Pyrex into the crystal. For a 140 keV source in 10 cm of water and a 1/4 inch thick crystal, more than 6% of all the photons detected do not have the primary energy and are still located in the right 4 mm pixel. Similarly, it is possible to show that more than 5% of all the photons detected at 140 keV deposit their energy in more than one pixel. These results give additional support to techniques that consider low-energy photons and to more sophisticated ways of segregating between good and bad events

  7. Statistical physics of fracture: scientific discovery through high-performance computing

    International Nuclear Information System (INIS)

    Kumar, Phani; Nukala, V V; Simunovic, Srdan; Mills, Richard T

    2006-01-01

The paper presents the state-of-the-art algorithmic developments for simulating the fracture of disordered quasi-brittle materials using discrete lattice systems. Large scale simulations are often required to obtain accurate scaling laws; however, due to computational complexity, the simulations using the traditional algorithms were limited to small system sizes. We have developed two algorithms: a multiple sparse Cholesky downdating scheme for simulating 2D random fuse model systems, and a block-circulant preconditioner for simulating 3D random fuse model systems. Using these algorithms, we were able to simulate fracture of the largest ever lattice system sizes (L = 1024 in 2D, and L = 64 in 3D) with extensive statistical sampling. Our recent simulations on 1024 processors of Cray-XT3 and IBM Blue-Gene/L have further enabled us to explore fracture of 3D lattice systems of size L = 200, which is a significant computational achievement. These largest ever numerical simulations have enhanced our understanding of the physics of fracture; in particular, we analyze damage localization and its deviation from percolation behavior, scaling laws for damage density, universality of the fracture strength distribution, the size effect on the mean fracture strength, and finally the scaling of crack surface roughness

  8. Thermodynamic chemical energy transfer mechanisms of non-equilibrium, quasi-equilibrium, and equilibrium chemical reactions

    International Nuclear Information System (INIS)

    Roh, Heui-Seol

    2015-01-01

    Chemical energy transfer mechanisms at finite temperature are explored by a chemical energy transfer theory which is capable of investigating various chemical mechanisms of non-equilibrium, quasi-equilibrium, and equilibrium. Gibbs energy fluxes are obtained as a function of chemical potential, time, and displacement. Diffusion, convection, internal convection, and internal equilibrium chemical energy fluxes are demonstrated. The theory reveals that there are chemical energy flux gaps and broken discrete symmetries at the activation chemical potential, time, and displacement. The statistical, thermodynamic theory is the unification of diffusion and internal convection chemical reactions which reduces to the non-equilibrium generalization beyond the quasi-equilibrium theories of migration and diffusion processes. The relationship between kinetic theories of chemical and electrochemical reactions is also explored. The theory is applied to explore non-equilibrium chemical reactions as an illustration. Three variable separation constants indicate particle number constants and play key roles in describing the distinct chemical reaction mechanisms. The kinetics of chemical energy transfer accounts for the four control mechanisms of chemical reactions such as activation, concentration, transition, and film chemical reactions. - Highlights: • Chemical energy transfer theory is proposed for non-, quasi-, and equilibrium. • Gibbs energy fluxes are expressed by chemical potential, time, and displacement. • Relationship between chemical and electrochemical reactions is discussed. • Theory is applied to explore nonequilibrium energy transfer in chemical reactions. • Kinetics of non-equilibrium chemical reactions shows the four control mechanisms

  9. Quasi-homogenous approximation for description of the properties of dispersed systems. The basic approaches to model hardening processes in nanodispersed silica systems. Part 2. The hardening processes from the standpoint of statistical physics

    Directory of Open Access Journals (Sweden)

    KUDRYAVTSEV Pavel Gennadievich

    2015-04-01

The paper deals with the possibilities of using a quasi-homogeneous approximation to describe the properties of dispersed systems. The authors applied the statistical polymer method, based on the consideration of averaged structures of all possible macromolecules of the same weight. Equations were derived that allow the evaluation of many additive parameters of macromolecules and of systems containing them. The statistical polymer method makes it possible to model branched, cross-linked macromolecules and systems containing them, in equilibrium or non-equilibrium states. Fractal analysis of a statistical polymer allows the modeling of different types of random fractals and other objects examined with the methods of fractal theory. The statistical polymer method can be applied not only to polymers but also to composites, gels, associates in polar liquids, and other packed systems. The paper also describes the states of colloidal solutions of silica from the standpoint of statistical physics. This approach is based on the idea that a colloidal solution of silica dioxide (a silica sol) consists of an enormous number of interacting particles in constant motion. The paper is devoted to the research of an idealized system of colliding but otherwise non-interacting sol particles. The behavior of the silica sol was analyzed using the Maxwell-Boltzmann distribution, and the mean free path was calculated. Using these data, the number of particles able to overcome the potential barrier in a collision was calculated. Different approaches to modeling the kinetics of the sol-gel transition were also studied.
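The two quantities computed in the sol analysis (the mean free path between particle collisions, and the fraction of collisions energetic enough to overcome a potential barrier) can be sketched with standard kinetic-theory formulas. The particle size, number density, and barrier height below are hypothetical, not values from the paper:

```python
import math

k_B = 1.380649e-23     # Boltzmann constant, J/K
T = 298.15             # K

# Illustrative sol parameters (hypothetical):
d = 20e-9              # particle diameter, m
n = 1e21               # particle number density, 1/m^3

# Kinetic-theory mean free path between particle collisions
lam = 1.0 / (math.sqrt(2) * math.pi * d ** 2 * n)

# Boltzmann fraction of collisions energetic enough to cross a
# potential barrier of height E_a (here a hypothetical 5 k_B*T)
E_a = 5.0 * k_B * T
frac = math.exp(-E_a / (k_B * T))

print(f"mean free path = {lam:.2e} m, barrier-crossing fraction = {frac:.4f}")
```

In a sol-gel picture, the barrier-crossing fraction sets the rate at which colliding particles stick and aggregate, so a lower barrier (e.g. from reduced electrostatic repulsion) accelerates gelation.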

  10. Para-equilibrium phase diagrams

    International Nuclear Information System (INIS)

    Pelton, Arthur D.; Koukkari, Pertti; Pajarre, Risto; Eriksson, Gunnar

    2014-01-01

    Highlights: • A rapidly cooled system may attain a state of para-equilibrium. • In this state rapidly diffusing elements reach equilibrium but others are immobile. • Application of the Phase Rule to para-equilibrium phase diagrams is discussed. • A general algorithm to calculate para-equilibrium phase diagrams is described. - Abstract: If an initially homogeneous system at high temperature is rapidly cooled, a temporary para-equilibrium state may result in which rapidly diffusing elements have reached equilibrium but more slowly diffusing elements have remained essentially immobile. The best known example occurs when homogeneous austenite is quenched. A para-equilibrium phase assemblage may be calculated thermodynamically by Gibbs free energy minimization under the constraint that the ratios of the slowly diffusing elements are the same in all phases. Several examples of calculated para-equilibrium phase diagram sections are presented and the application of the Phase Rule is discussed. Although the rules governing the geometry of these diagrams may appear at first to be somewhat different from those for full equilibrium phase diagrams, it is shown that in fact they obey exactly the same rules with the following provision. Since the molar ratios of non-diffusing elements are the same in all phases at para-equilibrium, these ratios act, as far as the geometry of the diagram is concerned, like “potential” variables (such as T, pressure or chemical potentials) rather than like “normal” composition variables which need not be the same in all phases. A general algorithm to calculate para-equilibrium phase diagrams is presented. In the limit, if a para-equilibrium calculation is performed under the constraint that no elements diffuse, then the resultant phase diagram shows the single phase with the minimum Gibbs free energy at any point on the diagram; such calculations are of interest in physical vapor deposition when deposition is so rapid that phase
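The constrained Gibbs-energy minimization described above can be caricatured in one dimension: only the fast diffuser (say carbon) repartitions between two phases, while the frozen slow-element ratios are absorbed into hypothetical single-variable Gibbs-energy curves. A brute-force grid search over phase fraction and carbon partition then locates the para-equilibrium state; all coefficients below are illustrative, not real thermodynamic data.

```python
# Hypothetical Gibbs-energy curves of two phases as functions of their
# carbon fraction alone (slow elements frozen at the overall composition).
def G_alpha(c):
    return 80.0 * (c - 0.02) ** 2

def G_gamma(c):
    return 40.0 * (c - 0.20) ** 2 + 0.05

c0 = 0.10        # overall carbon fraction (conserved)
steps = 400
best = None
for i in range(1, steps):
    f = i / steps                        # fraction of the alpha phase
    for j in range(steps + 1):
        ca = 0.3 * j / steps             # carbon content of alpha
        cg = (c0 - f * ca) / (1 - f)     # carbon content of gamma by mass balance
        if not 0.0 <= cg <= 0.3:
            continue
        G = f * G_alpha(ca) + (1 - f) * G_gamma(cg)
        if best is None or G < best[0]:
            best = (G, f, ca, cg)

G_min, f_opt, ca_opt, cg_opt = best
print(round(f_opt, 2), round(ca_opt, 3), round(cg_opt, 3))
```

The minimizer reproduces the common-tangent construction: a carbon-poor alpha phase coexists with a carbon-rich gamma phase, and the phase fraction follows from the lever rule, exactly the geometry a para-equilibrium phase diagram section encodes.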

  11. [The improvement of the abilities to maintain motor coordination and equilibrium in the students presenting with the functional disorders of the musculoskeletal system by introducing the elements of therapeutic physical training into the structure of academic schedule of physical education].

    Science.gov (United States)

    Kapilevich, L V; Davlet'yarova, K V; Ovchinnikova, N A

The problem of deterioration of the health status of university students remains as topical as ever, being a major cause of impaired working capacity, disability, and/or poor social adaptation for a large number of graduates. It has been proposed to introduce a class of therapeutic physical training (TPT) into the schedule of physical education for the students. The objective of the present study was to evaluate the effectiveness of the formation of the skills needed to maintain motor coordination and equilibrium in students presenting with functional disorders of the musculoskeletal system (MSS), including scoliosis, through the introduction of elements of therapeutic physical training into their academic schedules. The main study group comprised 32 students (men) aged 18-19 years presenting with disorders of the musculoskeletal system (type III scoliosis, osteochondropathy, and osteochondrosis). The students of this group received a curriculum aimed at improving their motor skills, with the emphasis laid on selected elements of therapeutic physical training. The control group was composed of 17 students without disorders of the musculoskeletal system who attended the physical education classes following the traditional program. The coordination abilities and balance skills were evaluated based on the analysis performed with the Stabilan-1 stabilographic apparatus. In addition, the stability test and the Romberg test with open and closed eyes were performed. The results of the study give evidence that the introduction of elements of therapeutic physical training into the structure of the academic schedule of physical education for students suffering from diseases of the musculoskeletal system has a beneficial effect on the parameters of stability and the general ability to maintain posture and balance.
Specifically, in the beginning of the academic year, the students of the main study group presenting with

  12. Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures

    Energy Technology Data Exchange (ETDEWEB)

    Udey, Ruth Norma [Michigan State Univ., East Lansing, MI (United States)

    2013-01-01

    Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.

  13. Thermodynamic evolution far from equilibrium

    Science.gov (United States)

    Khantuleva, Tatiana A.

    2018-05-01

The presented model of the thermodynamic evolution of an open system far from equilibrium is based on modern results of nonequilibrium statistical mechanics, the nonlocal theory of nonequilibrium transport developed by the author, and the Speed Gradient principle introduced in the theory of adaptive control. The transition to a description of the evolution of the system's internal structure at the mesoscopic level allows new insight into the stability problem of non-equilibrium processes. The new model is applied to a number of specific tasks.

  14. Information processing in bacteria: memory, computation, and statistical physics: a key issues review

    International Nuclear Information System (INIS)

    Lan, Ganhui; Tu, Yuhai

    2016-01-01

    preserving information, it does not reveal the underlying mechanism that leads to the observed input-output relationship, nor does it tell us much about which information is important for the organism and how biological systems use information to carry out specific functions. To do that, we need to develop models of the biological machineries, e.g. biochemical networks and neural networks, to understand the dynamics of biological information processes. This is a much more difficult task. It requires deep knowledge of the underlying biological network—the main players (nodes) and their interactions (links)—in sufficient detail to build a model with predictive power, as well as quantitative input-output measurements of the system under different perturbations (both genetic variations and different external conditions) to test the model predictions to guide further development of the model. Due to the recent growth of biological knowledge thanks in part to high throughput methods (sequencing, gene expression microarray, etc) and development of quantitative in vivo techniques such as various florescence technology, these requirements are starting to be realized in different biological systems. The possible close interaction between quantitative experimentation and theoretical modeling has made systems biology an attractive field for physicists interested in quantitative biology. In this review, we describe some of the recent work in developing a quantitative predictive model of bacterial chemotaxis, which can be considered as the hydrogen atom of systems biology. Using statistical physics approaches, such as the Ising model and Langevin equation, we study how bacteria, such as E. coli, sense and amplify external signals, how they keep a working memory of the stimuli, and how they use these data to compute the chemical gradient. In particular, we will describe how E. coli cells avoid cross-talk in a heterogeneous receptor cluster to keep a ligand-specific memory. 
We will also

  15. Information processing in bacteria: memory, computation, and statistical physics: a key issues review

    Science.gov (United States)

    Lan, Ganhui; Tu, Yuhai

    2016-05-01

    preserving information, it does not reveal the underlying mechanism that leads to the observed input-output relationship, nor does it tell us much about which information is important for the organism and how biological systems use information to carry out specific functions. To do that, we need to develop models of the biological machineries, e.g. biochemical networks and neural networks, to understand the dynamics of biological information processes. This is a much more difficult task. It requires deep knowledge of the underlying biological network—the main players (nodes) and their interactions (links)—in sufficient detail to build a model with predictive power, as well as quantitative input-output measurements of the system under different perturbations (both genetic variations and different external conditions) to test the model predictions and guide further development of the model. Due to the recent growth of biological knowledge, thanks in part to high-throughput methods (sequencing, gene expression microarrays, etc.) and the development of quantitative in vivo techniques such as various fluorescence technologies, these requirements are starting to be realized in different biological systems. The close interaction now possible between quantitative experimentation and theoretical modeling has made systems biology an attractive field for physicists interested in quantitative biology. In this review, we describe some of the recent work in developing a quantitative predictive model of bacterial chemotaxis, which can be considered the hydrogen atom of systems biology. Using statistical physics approaches, such as the Ising model and the Langevin equation, we study how bacteria, such as E. coli, sense and amplify external signals, how they keep a working memory of the stimuli, and how they use these data to compute the chemical gradient. In particular, we will describe how E. coli cells avoid cross-talk in a heterogeneous receptor cluster to keep a ligand-specific memory.
We will also
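The fast-sensing/slow-memory architecture described in the abstract can be sketched with a minimal toy model. This is an illustrative sketch, not the authors' actual model: the sigmoidal activity function and the linear methylation feedback below are assumptions chosen only to exhibit perfect adaptation; adding a noise term to the methylation update would turn it into a Langevin equation of the kind mentioned above.

```python
import math

def simulate_adaptation(steps=40000, dt=0.01, k=0.5, a0=0.5, ligand_step=20000):
    """Toy perfect-adaptation model (illustrative sketch).

    Fast variable: receptor activity a = 1 / (1 + exp(L - m)) responds
    to the ligand signal L within one time step.
    Slow variable: methylation m integrates (a0 - a), acting as a working
    memory that restores the activity to its adapted level a0.
    """
    m, L = 0.0, 0.0
    activity = []
    for i in range(steps):
        if i == ligand_step:
            L = 2.0                            # step stimulus (attractant)
        a = 1.0 / (1.0 + math.exp(L - m))      # quasi-equilibrium activity
        m += dt * k * (a0 - a)                 # slow methylation feedback
        activity.append(a)
    return activity

trace = simulate_adaptation()
```

Before the stimulus the activity sits at a0; the ligand step drives it down sharply, and the methylation feedback then returns it to a0, so the steady-state activity carries no trace of the absolute ligand level — the signature of perfect adaptation that lets the cell respond to gradients rather than background concentration.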

  17. On the local equilibrium condition

    International Nuclear Information System (INIS)

    Hessling, H.

    1994-11-01

    A physical system is in local equilibrium if it cannot be distinguished from a global equilibrium by "infinitesimally localized measurements". This should be a natural characterization of local equilibrium, but the problem is to give a precise meaning to the qualitative phrase "infinitesimally localized measurements". A solution is suggested in the form of a Local Equilibrium Condition (LEC), which can be applied to linear relativistic quantum field theories but not directly to self-interacting quantum fields. The concept of local temperature resulting from the LEC is compared to an older approach to local temperature based on the principle of maximal entropy. It is shown that the principle of maximal entropy does not always lead to physical states when applied to relativistic quantum field theories. (orig.)

  18. RECENT CONTRIBUTIONS OF THE STATISTICAL PHYSICS IN THE RESEARCH OF BANKING, STOCK EXCHANGE AND FOREIGN EXCHANGE MARKETS

    Directory of Open Access Journals (Sweden)

    PIRVU DANIELA

    2016-04-01

    Full Text Available This paper proposes a framework for surveying the main research approaches to financial markets pursued in recent years by statistical physics specialists. It also presents the global financial developments of the last few years, together with a review of the most important steps in the development of the physical and mathematical modelling of socioeconomic phenomena. To this end, we analysed research findings published in notable international journals. Our review shows that the econophysical models developed in recent years to describe financial phenomena and processes do not yet provide complete solutions to today's financial challenges. We believe that the research instrumentation of statistical physics has nevertheless developed significantly of late, and that research in this field should continue and be strengthened.

  19. Global regionalized seismicity in view of Non-Extensive Statistical Physics

    Science.gov (United States)

    Chochlaki, Kalliopi; Vallianatos, Filippos; Michas, Georgios

    2018-03-01

    In the present work we study the distribution of the Earth's shallow seismicity in different seismic zones, as it occurred from 1981 to 2011, extracted from the Centroid Moment Tensor (CMT) catalog. Our analysis is based on the subdivision of the Earth's surface into seismic zones that are homogeneous with regard to seismic activity and the orientation of the predominant stress field. For this, we use the Flinn-Engdahl regionalization (FE) (Flinn and Engdahl, 1965), which consists of fifty seismic zones, as modified by Lombardi and Marzocchi (2007). The latter authors grouped the 50 FE zones into larger, tectonically homogeneous ones, utilizing the cumulative moment tensor method, resulting in thirty-nine seismic zones. In each of these seismic zones we study the distribution of seismicity in terms of the frequency-magnitude distribution and the inter-event time distribution between successive earthquakes, a task that is essential for hazard assessment and for better understanding global and regional geodynamics. In our analysis we use non-extensive statistical physics (NESP), which seems to be one of the most adequate and promising methodological tools for analyzing complex systems such as the Earth's seismicity, introducing the q-exponential formulation as the expression of the probability distribution function that maximizes the Sq entropy defined by Tsallis (1988). The qE parameter is significantly greater than one for all the seismic regions analyzed, with values ranging from 1.294 to 1.504, indicating that magnitude correlations are particularly strong. Furthermore, the qT parameter shows some temporal correlations, and variations with cut-off magnitude show greater temporal correlations when the smaller-magnitude earthquakes are included. 
The qT for earthquakes with magnitude greater than 5 takes values from 1.043 to 1.353, and as we increase the cut-off magnitude to 5.5 and 6 the qT value ranges from 1.001 to 1.242 and from 1.001 to 1.181 respectively, presenting
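The q-exponential underlying the NESP analysis above has the closed form exp_q(x) = [1 + (1 - q)x]^(1/(1-q)) for a positive bracket (zero otherwise), recovering the ordinary exponential as q → 1. A minimal sketch, taking the inter-event-time survival function P(>t) = exp_q(-t/t0) as an assumed illustrative form:

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential: [1 + (1 - q) x]^(1 / (1 - q)) where the
    bracket is positive, 0 otherwise; reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    if base <= 0.0:
        return 0.0
    return base ** (1.0 / (1.0 - q))

def survival(t, q, t0):
    """Illustrative inter-event-time survival function P(>t) = exp_q(-t/t0).
    For q > 1 this decays as a power law -- long-range temporal
    correlations -- instead of the exponential decay of a Poisson process."""
    return q_exp(-t / t0, q)
```

For example, with q = 1.5 the tail falls off as (1 + t/(2 t0))^(-2), which is why q-values above 1, such as those reported for qE and qT, are read as signatures of strong correlations.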

  20. Nuclear multifragmentation, its relation to general physics. A rich test ground of the fundamentals of statistical mechanics

    International Nuclear Information System (INIS)

    Gross, D.H.E.

    2006-01-01

    Heat can flow from cold to hot during any phase separation, even in macroscopic systems. Therefore Lynden-Bell's famous gravo-thermal catastrophe must also be reconsidered. In contrast to traditional canonical Boltzmann-Gibbs statistics, this is correctly described only by microcanonical statistics. Systems studied in chemical thermodynamics (ChTh) using canonical statistics consist of several homogeneous macroscopic phases. Evidently, macroscopic statistics as used in chemistry cannot and should not be applied to non-extensive or inhomogeneous systems like nuclei or galaxies. Nuclei are small and inhomogeneous. Multifragmented nuclei are even more inhomogeneous, and the fragments even smaller. Phase transitions of first order, and especially phase separations, therefore cannot be described by a (homogeneous) canonical ensemble. Taking this seriously, fascinating perspectives open up for statistical nuclear fragmentation as a test ground for the basic principles of statistical mechanics, especially of phase transitions, without the use of the thermodynamic limit. Moreover, there is also a lot of similarity between the accessible phase space of fragmenting nuclei and that of inhomogeneous multistellar systems. This underlines the fundamental significance for statistical physics in general. (orig.)

  1. Physicochemical modeling of reactive violet 5 dye adsorption on home-made cocoa shell and commercial activated carbons using the statistical physics theory

    Directory of Open Access Journals (Sweden)

    Lotfi Sellaoui

    Full Text Available Two equilibrium models based on statistical physics, i.e., a monolayer model with single energy and a multilayer model with saturation, were developed and employed to assess the steric and energetic aspects of the adsorption of reactive violet 5 dye (RV-5) on cocoa shell activated carbon (AC) and commercial activated carbon (CAC) at different temperatures (from 298 to 323 K). The results showed that the multilayer model with saturation was able to represent the adsorption system. This model assumes that adsorption occurs by the formation of a certain number of layers. The n values ranged from 1.10 to 2.98, indicating that the adsorbate molecules interacted in an inclined position on the adsorbent surface and aggregated in solution. The study of the total number of formed layers (1 + L2) showed that steric hindrance is the dominant factor. The description of the adsorbate–adsorbent interactions by calculation of the adsorption energy indicated that the process was physisorption in nature, since the values were lower than 40 kJ mol−1. Keywords: RV-5 dye, Activated carbon, Modeling, Aggregation
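The simpler of the two models mentioned above can be written down compactly. The expressions below are the commonly used statistical-physics isotherm forms, not necessarily the exact equations fitted in the paper, and the parameter names are illustrative:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def monolayer_isotherm(c, n, Nm, c_half):
    """Monolayer model with single energy (common statistical-physics form):
    Q(c) = n * Nm / (1 + (c_half / c)**n)
    n      -- molecules adsorbed per receptor site (steric parameter)
    Nm     -- density of receptor sites
    c_half -- concentration at half saturation (energetic parameter)"""
    return n * Nm / (1.0 + (c_half / c) ** n)

def adsorption_energy(T, c_s, c_half):
    """|dE| = R*T*ln(c_s / c_half), with c_s the solubility; values below
    ~40 kJ/mol are conventionally read as physisorption."""
    return R * T * math.log(c_s / c_half)
```

The saturation plateau is n*Nm, and the half-saturation concentration carries the energetic information, which is how the paper's physisorption conclusion (energies below 40 kJ/mol) is extracted from fitted parameters.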

  2. Student learning of upper-level thermal and statistical physics: The derivation and use of the Boltzmann factor

    Science.gov (United States)

    Thompson, John

    2015-04-01

    As the Physical Review Focused Collection demonstrates, recent frontiers in physics education research include systematic investigations at the upper division. As part of a collaborative project, we have examined student understanding of several topics in upper-division thermal and statistical physics. A fruitful context for research is the Boltzmann factor in statistical mechanics: the standard derivation involves several physically justified mathematical steps as well as the invocation of a Taylor series expansion. We have investigated student understanding of the physical significance of the Boltzmann factor as well as its utility in various circumstances, and identified various lines of student reasoning related to the use of the Boltzmann factor. Results from written data as well as teaching interviews suggest that many students do not use the Boltzmann factor when answering questions related to probability in applicable physical situations, even after lecture instruction. We designed an inquiry-based tutorial activity to guide students through a derivation of the Boltzmann factor and to encourage deep connections between the physical quantities involved and the mathematics. Observations of students working through the tutorial suggest that many students at this level can recognize and interpret Taylor series expansions, but they often lack fluency in creating and using Taylor series appropriately, despite previous exposure in both calculus and physics courses. Our findings also suggest that tutorial participation not only increases the prevalence of relevant invocation of the Boltzmann factor, but also helps students gain an appreciation of the physical implications and meaning of the mathematical formalism behind the formula. Supported in part by NSF Grants DUE-0817282, DUE-0837214, and DUE-1323426.
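The object the tutorial targets can be made concrete in a few lines: canonical probabilities p_i = exp(-E_i/kT)/Z for discrete levels, together with the first-order Taylor check exp(-x) ≈ 1 - x that underlies the standard derivation. A minimal sketch:

```python
import math

def boltzmann_probs(energies, kT):
    """Canonical probabilities p_i = exp(-E_i / kT) / Z for discrete levels."""
    weights = [math.exp(-E / kT) for E in energies]
    Z = sum(weights)                    # partition function
    return [w / Z for w in weights]

# Two-level system: the ratio of probabilities is exactly the
# Boltzmann factor exp(-dE / kT), independent of the choice of energy zero.
p = boltzmann_probs([0.0, 1.0], 1.0)
```

The shift-invariance of the probability ratio is the physical content students are asked to extract: only energy differences matter, and expanding the reservoir entropy to first order in that difference is precisely what produces the exponential form.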

  3. Towards an Industrial Application of Statistical Uncertainty Analysis Methods to Multi-physical Modelling and Safety Analyses

    International Nuclear Information System (INIS)

    Zhang, Jinzhao; Segurado, Jacobo; Schneidesch, Christophe

    2013-01-01

    Since the 1980s, Tractebel Engineering (TE) has been developing and applying a multi-physics modelling and safety analysis capability, based on a code package consisting of best-estimate 3D neutronics (PANTHER), system thermal-hydraulics (RELAP5), core sub-channel thermal-hydraulics (COBRA-3C), and fuel thermo-mechanics (FRAPCON/FRAPTRAN) codes. A series of methodologies has been developed to perform and license reactor safety analysis and core reload design, based on the deterministic bounding approach. Following recent trends in research and development as well as in industrial applications, TE has been working since 2010 towards the application of statistical sensitivity and uncertainty analysis methods to multi-physics modelling and licensing safety analyses. In this paper, the TE multi-physics modelling and safety analysis capability is first described, followed by the proposed TE best estimate plus statistical uncertainty analysis method (BESUAM). The chosen statistical sensitivity and uncertainty analysis methods (non-parametric order statistics or bootstrap) and tool (DAKOTA) are then presented, followed by some preliminary results of their application to FRAPCON/FRAPTRAN simulation of the OECD RIA fuel rod codes benchmark and RELAP5/MOD3.3 simulation of THTF tests. (authors)

  4. Beyond quantum microcanonical statistics

    International Nuclear Information System (INIS)

    Fresch, Barbara; Moro, Giorgio J.

    2011-01-01

    Descriptions of molecular systems usually refer to two distinct theoretical frameworks. On the one hand the quantum pure state, i.e., the wavefunction, of an isolated system is determined to calculate molecular properties and their time evolution according to the unitary Schroedinger equation. On the other hand a mixed state, i.e., a statistical density matrix, is the standard formalism to account for thermal equilibrium, as postulated in the microcanonical quantum statistics. In the present paper an alternative treatment relying on a statistical analysis of the possible wavefunctions of an isolated system is presented. In analogy with the classical ergodic theory, the time evolution of the wavefunction determines the probability distribution in the phase space pertaining to an isolated system. However, this alone cannot account for a well defined thermodynamical description of the system in the macroscopic limit, unless a suitable probability distribution for the quantum constants of motion is introduced. We present a workable formalism assuring the emergence of typical values of thermodynamic functions, such as the internal energy and the entropy, in the large size limit of the system. This allows the identification of macroscopic properties independently of the specific realization of the quantum state. A description of material systems in agreement with equilibrium thermodynamics is then derived without constraints on the physical constituents and interactions of the system. Furthermore, the canonical statistics is recovered in all generality for the reduced density matrix of a subsystem.

  5. A statistical law in the perception of risks and physical quantities in traffic

    DEFF Research Database (Denmark)

    Elvik, Rune

    2015-01-01

    This paper suggests that a universal psychophysical law influences the perception of risks and physical quantities in traffic. This law states that there will be a tendency to overestimate low probabilities or small quantities, while high probabilities or large quantities may be underestimated. Studies of the perception of risk and physical quantities in traffic have found a highly consistent pattern.

  6. Non-equilibrium thermodynamics

    CERN Document Server

    De Groot, Sybren Ruurds

    1984-01-01

    The study of thermodynamics is especially timely today, as its concepts are being applied to problems in biology, biochemistry, electrochemistry, and engineering. This book treats irreversible processes and phenomena - non-equilibrium thermodynamics. S. R. de Groot and P. Mazur, Professors of Theoretical Physics, present a comprehensive and insightful survey of the foundations of the field, providing the only complete discussion of the fluctuating linear theory of irreversible thermodynamics. The application covers a wide range of topics: the theory of diffusion and heat conduction, fluid dyn

  7. Experimental benchmark of non-local-thermodynamic-equilibrium plasma atomic physics codes; Validation experimentale des codes de physique atomique des plasmas hors equilibre thermodynamique local

    Energy Technology Data Exchange (ETDEWEB)

    Nagels-Silvert, V

    2004-09-15

    The main purpose of this thesis is to obtain experimental data for the testing and validation of atomic physics codes dealing with non-local-thermodynamic-equilibrium plasmas. The first part is dedicated to the spectroscopic study of xenon and krypton plasmas produced by a nanosecond laser pulse interacting with a gas jet. A Thomson scattering diagnostic has allowed us to measure independently plasma parameters such as the electron temperature, the electron density and the average ionisation state. We have obtained time-integrated spectra in the range between 5 and 10 angstroms. We have identified about one hundred xenon lines between 8.6 and 9.6 angstroms via the use of the Relac code. We have discovered previously unreported krypton lines between 5.2 and 7.5 angstroms. In a second experiment we have extended the wavelength range to the XUV domain. The Averroes/Transpec code has been tested in the ranges from 9 to 15 angstroms and from 10 to 130 angstroms; the first range has been well reproduced, while the second requires a more complex data analysis. The second part is dedicated to the spectroscopic study of aluminium, selenium and samarium plasmas in the femtosecond regime. We have designed a frequency-domain interferometry diagnostic that has allowed us to measure the expansion speed of the rear side of the target. Via an adequate isothermal model, this parameter has given us the plasma electron temperature. Spectra and emission times of various lines from the aluminium and selenium plasmas have been computed satisfactorily with the Averroes/Transpec code coupled with the Film and Multif hydrodynamical codes. (A.C.)

  9. Statistical Learning Is Not Affected by a Prior Bout of Physical Exercise.

    Science.gov (United States)

    Stevens, David J; Arciuli, Joanne; Anderson, David I

    2016-05-01

    This study examined the effect of a prior bout of exercise on implicit cognition. Specifically, we examined whether a prior bout of moderate-intensity exercise affected performance on a statistical learning task in healthy adults. A total of 42 participants were allocated to one of three conditions: a control group, a group that exercised for 15 min prior to the statistical learning task, and a group that exercised for 30 min prior to the statistical learning task. The participants in the exercise groups cycled at 60% of their respective VO2max. Each group demonstrated significant statistical learning, with similar levels of learning among the three groups. Contrary to previous research showing that a prior bout of exercise can affect performance on explicit cognitive tasks, the results of the current study suggest that the physiological stress induced by moderate-intensity exercise does not affect implicit cognition as measured by statistical learning. Copyright © 2015 Cognitive Science Society, Inc.

  10. Numerical Verification Of Equilibrium Chemistry

    International Nuclear Information System (INIS)

    Piro, Markus; Lewis, Brent; Thompson, William T.; Simunovic, Srdjan; Besmann, Theodore M.

    2010-01-01

    A numerical tool is in an advanced state of development to compute the equilibrium compositions of phases and their proportions in multi-component systems of importance to the nuclear industry. The resulting software is being conceived for direct integration into large multi-physics fuel performance codes, particularly for providing boundary conditions in heat and mass transport modules. However, any numerical errors produced in equilibrium chemistry computations will be propagated in subsequent heat and mass transport calculations, thus falsely predicting nuclear fuel behaviour. The necessity for a reliable method to numerically verify chemical equilibrium computations is emphasized by the requirement to handle the very large number of elements necessary to capture the entire fission product inventory. A simple, reliable and comprehensive numerical verification method is presented which can be invoked by any equilibrium chemistry solver for quality assurance purposes.
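One verification of the kind described above is purely algebraic: whatever compositions the equilibrium solver reports, the mole vector must exactly satisfy elemental mass balance. A hedged sketch (the species, stoichiometry matrix, and tolerance below are illustrative, not taken from the actual software):

```python
def mass_balance_ok(n, A, b, tol=1e-8):
    """Quality-assurance check on an equilibrium solver's output: the mole
    vector n must satisfy A n = b, where A[i][j] is the number of atoms of
    element i in species j and b is the elemental abundance vector."""
    residuals = [abs(sum(aij * nj for aij, nj in zip(row, n)) - bi)
                 for row, bi in zip(A, b)]
    return max(residuals) < tol, residuals

# illustrative system: species (H2, O2, H2O), elements (H, O)
A = [[2, 0, 2],   # H atoms per species
     [0, 2, 1]]   # O atoms per species
b = [2.0, 1.0]    # total H and O supplied to the solver
ok, res = mass_balance_ok([0.1, 0.05, 0.9], A, b)
```

A full verification scheme would add thermodynamic checks (equal chemical potentials of each element across phases), but mass-balance residuals of this kind already catch the numerical drift that would otherwise propagate into the heat and mass transport modules.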

  11. A general theory of non-equilibrium dynamics of lipid-protein fluid membranes

    DEFF Research Database (Denmark)

    Lomholt, Michael Andersen; Hansen, Per Lyngs; Miao, L.

    2005-01-01

    We present a general and systematic theory of non-equilibrium dynamics of multi-component fluid membranes, in general, and membranes containing transmembrane proteins, in particular. Developed based on a minimal number of principles of statistical physics and designed to be a meso...

  12. DbAccess: Interactive Statistics and Graphics for Plasma Physics Databases

    International Nuclear Information System (INIS)

    Davis, W.; Mastrovito, D.

    2003-01-01

    DbAccess is an X-windows application, written in IDL®, meeting many specialized statistical and graphical needs of NSTX [National Spherical Torus Experiment] plasma physicists, such as regression statistics and the analysis of variance. Flexible "views" and "joins", which include options for complex SQL expressions, facilitate mixing data from different database tables. General Atomics Plot Objects add extensive graphical and interactive capabilities. An example is included for plasma confinement-time scaling analysis using a multiple linear regression least-squares power fit.
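The confinement-time scaling analysis mentioned above — a multiple linear regression least-squares power fit — amounts to ordinary least squares in log space. A self-contained sketch (pure Python via the normal equations, adequate for a handful of regressors; the sample scaling law is invented for illustration):

```python
import math

def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting
    (sufficient for the tiny systems a scaling fit produces)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def power_law_fit(X, y):
    """Fit y = C * x1**a1 * x2**a2 * ... by linear least squares on
    ln y = ln C + a1 ln x1 + ..., via the normal equations."""
    rows = [[1.0] + [math.log(v) for v in xs] for xs in X]
    t = [math.log(v) for v in y]
    p = len(rows[0])
    G = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    h = [sum(r[i] * ti for r, ti in zip(rows, t)) for i in range(p)]
    beta = gauss_solve(G, h)
    return math.exp(beta[0]), beta[1:]      # (prefactor C, exponents)

# recover a known scaling law tau = 2.5 * x1^0.5 / x2 from exact data
X = [(1, 1), (2, 1), (1, 2), (4, 3), (3, 5), (2, 2)]
y = [2.5 * x1 ** 0.5 * x2 ** -1.0 for x1, x2 in X]
C, exps = power_law_fit(X, y)
```

Taking logarithms turns the multiplicative power law into a linear model, so the fitted exponents are just regression coefficients and the usual regression diagnostics apply directly.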

  13. Statistical analysis of morphometric indicators and physical readiness variability of students

    Directory of Open Access Journals (Sweden)

    R.A. Gainullin

    2017-10-01

    Full Text Available Aim: To evaluate the interaction of morphometric characteristics with reactions of the cardiorespiratory system and indices of physical fitness during physical exercise training at the university. Material: First-year students (n = 91, aged 17-18) took part in the survey. The students were divided into 6 groups. All students were engaged in physical training. All the studied indicators were conditionally divided into two groups: the first comprised indicators of physical fitness; the second was formed by morphofunctional indices. Results: The indicators of the students' physical fitness demonstrate a wide range and heterogeneity. This should be taken into account when forming training groups. When using the technique of development of local regional muscular endurance, the values of the orthostatic test and the Skibinski index show significant variability. High and significant correlations are also shown by manual dynamometry, strength endurance, and the Skibinski index and, in the orthostatic test, by age, body length, and heart rate. A similar analysis of the morphofunctional indices shows significant correlations between the Skibinski index and the orthostatic test, between age and the Skibinski index, and between weight and body length. Conclusions: In terms of physical fitness, the sports training group (the second group) and the hypertensive group (group 5) proved to be the most stable. The volunteer group turned out to be the most stable with respect to the morphofunctional indicators.

  14. Nonequilibrium statistical physics of small systems: fluctuation relations and beyond (annual reviews of nonlinear dynamics and complexity (vch))

    CERN Document Server

    2013-01-01

    This book offers a comprehensive picture of nonequilibrium phenomena in nanoscale systems. Written by internationally recognized experts in the field, this book strikes a balance between theory and experiment, and includes in-depth introductions to nonequilibrium fluctuation relations, nonlinear dynamics and transport, single molecule experiments, and molecular diffusion in nanopores. The authors explore the application of these concepts to nano- and biosystems by cross-linking key methods and ideas from nonequilibrium statistical physics, thermodynamics, stochastic theory, and dynamical s

  15. Physical property, phase equilibrium, distillation. Measurement and prediction of vapor-liquid and liquid-liquid equilibria; Bussei / heiko / joryu. Kieki, ekieki heiko no sokutei to suisan

    Energy Technology Data Exchange (ETDEWEB)

    Tochigi, K. [Nihon Univ., Tokyo (Japan)

    1998-08-05

    The data on vapor-liquid equilibrium are basic data indispensable to the design of a distillation process. The number of stages required for separation depends greatly upon the x-y curve, and the existence or nonexistence of an azeotropic point is also an important item to be checked. This paper describes the measurement of vapor-liquid and liquid-liquid equilibria, and then introduces reliable vapor-liquid equilibrium data and parameters of activity coefficient equations. For the prediction of vapor-liquid equilibrium, the ASOG, UNIFAC, and modified UNIFAC group contribution methods are utilized. The differences between these group contribution methods lie in the contribution terms accounting for the differences in molecular size that influence the activity coefficients, and in the expression of the group activity coefficient equation. The applicable number of groups is 43 for the ASOG and 50 for the UNIFAC; the modified UNIFAC covers 43 groups. The prediction of liquid-liquid equilibrium by a group contribution method has progressed little since the study of Magnussen et al. using the UNIFAC. 12 refs., 8 figs., 1 tab.
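For readers who want the mechanics: once an activity-coefficient model is in hand (here the classic two-parameter Margules form, used as a simple stand-in for the group-contribution ASOG/UNIFAC methods named above), modified Raoult's law gives the x-y curve, and an azeotrope appears wherever y = x. The parameter values below are illustrative only:

```python
import math

def margules_gamma(x1, A12, A21):
    """Two-parameter Margules activity coefficients (a simple stand-in
    for group-contribution models such as ASOG/UNIFAC)."""
    x2 = 1.0 - x1
    g1 = math.exp(x2 ** 2 * (A12 + 2.0 * (A21 - A12) * x1))
    g2 = math.exp(x1 ** 2 * (A21 + 2.0 * (A12 - A21) * x2))
    return g1, g2

def bubble_point(x1, P1sat, P2sat, A12, A21):
    """Modified Raoult's law, y_i P = gamma_i x_i P_i^sat:
    returns (vapour mole fraction y1, total pressure P)."""
    g1, g2 = margules_gamma(x1, A12, A21)
    P = g1 * x1 * P1sat + g2 * (1.0 - x1) * P2sat
    return g1 * x1 * P1sat / P, P

def has_azeotrope(P1sat, P2sat, A12, A21, grid=200):
    """Scan the x-y curve for a sign change of (y1 - x1) in (0, 1)."""
    d = [bubble_point(i / grid, P1sat, P2sat, A12, A21)[0] - i / grid
         for i in range(1, grid)]
    return any(a * b < 0.0 for a, b in zip(d, d[1:]))
```

With A12 = A21 = 0 the mixture is ideal (Raoult's law, no azeotrope); sufficiently large positive deviations pull the x-y curve across the diagonal, which is exactly the azeotrope check the abstract flags as an important design item.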

  16. Relationship between physical fitness and game-related statistics in elite professional basketball players: Regular season vs. playoffs

    Directory of Open Access Journals (Sweden)

    João Henrique Gomes

    2017-05-01

    Full Text Available Abstract AIMS This study aimed to verify the relationship between anthropometric and physical performance variables and game-related statistics in professional elite basketball players during a competition. METHODS Eleven male basketball players were evaluated during 10 weeks at two distinct moments (regular season and playoffs). Overall, 11 variables of physical fitness and 13 variables of game-related statistics were analysed. RESULTS The following significant Pearson's correlations were found in the regular season: percentage of fat mass with assists (r = -0.62) and steals (r = -0.63); height (r = 0.68), lean mass (r = 0.64), and maximum strength (r = 0.67) with blocks; squat jump with steals (r = 0.63); and time in the T-test with successful two-point field goals (r = -0.65), successful free throws (r = -0.61), and steals (r = -0.62). However, in the playoffs, only stature and lean mass maintained these correlations (p ≤ 0.05). CONCLUSIONS The anthropometric and physical characteristics of the players showed few correlations with the game-related statistics in the regular season, and these correlations were even lower in the playoff games of a professional elite championship; they are therefore not good predictors of technical performance.
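The Pearson correlations reported above are straightforward to compute. A minimal sketch — the sample values are invented for illustration, not the paper's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# invented example: T-test times (s) vs successful two-point field goals,
# mimicking the negative agility-scoring correlation reported above
t_test = [8.9, 9.4, 9.1, 10.2, 9.8, 8.7]
goals = [52, 44, 50, 38, 40, 55]
r = pearson_r(t_test, goals)
```

A negative r here means faster T-test times (lower values) go with more successful field goals, matching the sign convention of the r = -0.65 result in the abstract.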

  17. A statistical physics view of pitch fluctuations in the classical music from Bach to Chopin: evidence for scaling.

    Science.gov (United States)

    Liu, Lu; Wei, Jianrong; Zhang, Huishu; Xin, Jianhong; Huang, Jiping

    2013-01-01

    Because classical music has greatly affected our life and culture in its long history, it has attracted extensive attention from researchers to understand laws behind it. Based on statistical physics, here we use a different method to investigate classical music, namely, by analyzing cumulative distribution functions (CDFs) and autocorrelation functions of pitch fluctuations in compositions. We analyze 1,876 compositions of five representative classical music composers across 164 years from Bach, to Mozart, to Beethoven, to Mendelsohn, and to Chopin. We report that the biggest pitch fluctuations of a composer gradually increase as time evolves from Bach time to Mendelsohn/Chopin time. In particular, for the compositions of a composer, the positive and negative tails of a CDF of pitch fluctuations are distributed not only in power laws (with the scale-free property), but also in symmetry (namely, the probability of a treble following a bass and that of a bass following a treble are basically the same for each composer). The power-law exponent decreases as time elapses. Further, we also calculate the autocorrelation function of the pitch fluctuation. The autocorrelation function shows a power-law distribution for each composer. Especially, the power-law exponents vary with the composers, indicating their different levels of long-range correlation of notes. This work not only suggests a way to understand and develop music from a viewpoint of statistical physics, but also enriches the realm of traditional statistical physics by analyzing music.
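The quantities analysed above — pitch fluctuations, their cumulative-distribution tails, and the autocorrelation function — can be sketched in a few lines. The MIDI pitch sequence below is invented for illustration:

```python
def fluctuations(pitches):
    """Pitch fluctuations: differences of consecutive (MIDI) pitch values."""
    return [b - a for a, b in zip(pitches, pitches[1:])]

def ccdf(values, x):
    """Empirical complementary CDF, P(V > x) -- the tail whose power-law
    form and positive/negative symmetry the paper examines."""
    return sum(v > x for v in values) / len(values)

def autocorr(values, lag):
    """Sample autocorrelation at the given lag (1 at lag 0 by construction)."""
    n = len(values)
    mu = sum(values) / n
    var = sum((v - mu) ** 2 for v in values) / n
    cov = sum((values[i] - mu) * (values[i + lag] - mu)
              for i in range(n - lag)) / n
    return cov / var

flucts = fluctuations([60, 62, 61, 65, 60, 64])   # invented melody fragment
```

On a real corpus one would plot ccdf against the fluctuation size on log-log axes (a straight tail indicates the scale-free property) and autocorr against lag (a power-law decay indicates long-range correlation of notes), which are the two signatures reported for the five composers.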

  18. A statistical physics view of pitch fluctuations in the classical music from Bach to Chopin: evidence for scaling.

    Directory of Open Access Journals (Sweden)

    Lu Liu

    Full Text Available Because classical music has greatly affected our life and culture in its long history, it has attracted extensive attention from researchers to understand the laws behind it. Based on statistical physics, here we use a different method to investigate classical music, namely, by analyzing cumulative distribution functions (CDFs) and autocorrelation functions of pitch fluctuations in compositions. We analyze 1,876 compositions of five representative classical music composers across 164 years from Bach, to Mozart, to Beethoven, to Mendelssohn, and to Chopin. We report that the biggest pitch fluctuations of a composer gradually increase as time evolves from Bach's time to Mendelssohn's and Chopin's time. In particular, for the compositions of a composer, the positive and negative tails of a CDF of pitch fluctuations are distributed not only in power laws (with the scale-free property), but also in symmetry (namely, the probability of a treble following a bass and that of a bass following a treble are basically the same for each composer). The power-law exponent decreases as time elapses. Further, we also calculate the autocorrelation function of the pitch fluctuation. The autocorrelation function shows a power-law distribution for each composer. Especially, the power-law exponents vary with the composers, indicating their different levels of long-range correlation of notes. This work not only suggests a way to understand and develop music from a viewpoint of statistical physics, but also enriches the realm of traditional statistical physics by analyzing music.

  19. An introduction to equilibrium thermodynamics

    CERN Document Server

    Morrill, Bernard; Hartnett, James P; Hughes, William F

    1973-01-01

    An Introduction to Equilibrium Thermodynamics discusses classical thermodynamics and irreversible thermodynamics. It introduces the laws of thermodynamics and the connection between statistical concepts and observable macroscopic properties of a thermodynamic system. Chapter 1 discusses the first law of thermodynamics, while Chapters 2 through 4 deal with statistical concepts. The succeeding chapters describe the link between entropy and the reversible heat process; the concept of entropy; the second law of thermodynamics; and Legendre transformations and Jacobian algebra. Finally, Chapter 10 provides a

  20. Statistical Learning Is Not Affected by a Prior Bout of Physical Exercise

    Science.gov (United States)

    Stevens, David J.; Arciuli, Joanne; Anderson, David I.

    2016-01-01

    This study examined the effect of a prior bout of exercise on implicit cognition. Specifically, we examined whether a prior bout of moderate intensity exercise affected performance on a statistical learning task in healthy adults. A total of 42 participants were allocated to one of three conditions--a control group, a group that exercised for…

  1. Retrieval of Specific Leaf Area From Landsat-8 Surface Reflectance Data Using Statistical and Physical Models

    NARCIS (Netherlands)

    Ali, Abebe Mohammed; Darvishzadeh, R.; Skidmore, Andrew K.

    2017-01-01

    One of the key traits in the assessment of ecosystem functions is a specific leaf area (SLA). The main aim of this study was to examine the potential of new generation satellite images, such as Landsat-8 imagery, for the retrieval of SLA at regional and global scales. Therefore, both statistical and

  2. "Statistical treatment of the spectral properties of plasmas in local thermodynamical equilibrium using a screened hydrogenic model"; "Traitement statistique des proprietes spectrales des plasmas a l'equilibre thermodynamique local dans le cadre du modele hydrogenique ecrante"

    Energy Technology Data Exchange (ETDEWEB)

    Faussurier, G.

    1996-12-31

    A new screened hydrogenic model is presented. The screening constants depend both on the principal quantum number n and the orbital quantum number l. They have been obtained from numerical fits over a large database containing ionization potentials and one-electron excitation energies of ions. A rapid and original method to compute the bound-bound and bound-free oscillator strengths is proposed. The discrete spectrum and the series continuum are connected by continuity, and the sum rules are respected. The screened hydrogenic average atom is well adapted to describe multicharged ion plasmas in local thermodynamic equilibrium (LTE). Using the key principles of statistical mechanics, it is shown first that this model is properly defined and thermodynamically coherent. Secondly, a new method of detailed ionization-stage accounting for an LTE plasma is explained. It can be used to reconstruct the distribution of integer charge states in such a plasma from any average-atom model. The l-splitting allows one-electron transitions between two subshells with the same principal quantum number n. They may be of great importance when the Rosseland opacity is computed. However, methods of classical statistical mechanics are required to calculate the distribution of the configurations around the average-atom one and thus to improve the spectral opacities. The splitting into integer ionic stages can be easily included. The formalism is tested by comparisons with theoretical and experimental results published in the literature. From the photoabsorption spectra encountered, the main results are the correct estimation of both the Rosseland opacity and the detailed charge-state accounting. (author).
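The basic idea of a screened hydrogenic model, hydrogen-like level energies with (n, l)-dependent effective charges, can be sketched as follows. The screening constants below are hypothetical placeholders for illustration, not the fitted values from the paper's database.

```python
RYDBERG_EV = 13.605693  # hydrogen ground-state binding energy in eV

def screened_energy(Z, n, l, sigma):
    """One-electron level energy (eV) in a screened hydrogenic model:
    E_nl = -Ry * (Z - sigma_nl)**2 / n**2, with sigma depending on (n, l)."""
    z_eff = Z - sigma[(n, l)]
    return -RYDBERG_EV * z_eff ** 2 / n ** 2

# Hypothetical screening constants, for illustration only.
sigma = {(1, 0): 0.3, (2, 0): 2.0, (2, 1): 2.4}
Z = 13  # aluminium
e1s = screened_energy(Z, 1, 0, sigma)
e2p = screened_energy(Z, 2, 1, sigma)
print(e1s, e2p, e2p - e1s)  # 1s level, 2p level, 1s-2p excitation energy
```

With sigma = 0 and Z = 1 this reduces to the ordinary hydrogen spectrum, which is the consistency check any such model should pass.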

  3. Nonequilibrium statistical mechanics ensemble method

    CERN Document Server

    Eu, Byung Chan

    1998-01-01

    In this monograph, nonequilibrium statistical mechanics is developed by means of ensemble methods on the basis of the Boltzmann equation, the generic Boltzmann equations for classical and quantum dilute gases, and a generalised Boltzmann equation for dense simple fluids. The theories are developed in forms parallel to the equilibrium Gibbs ensemble theory, in a way fully consistent with the laws of thermodynamics. The generalised hydrodynamics equations are an integral part of the theory and describe the evolution of macroscopic processes in accordance with the laws of thermodynamics for systems far removed from equilibrium. Audience: This book will be of interest to researchers in the fields of statistical mechanics, condensed matter physics, gas dynamics, fluid dynamics, rheology, irreversible thermodynamics and nonequilibrium phenomena.

  4. Brownian ratchets from statistical physics to bio and nano-motors

    CERN Document Server

    Cubero, David

    2016-01-01

    Illustrating the development of Brownian ratchets, from their foundations, to their role in the description of life at the molecular scale and in the design of artificial nano-machinery, this text will appeal to both advanced graduates and researchers entering the field. Providing a self-contained introduction to Brownian ratchets, devices which rectify microscopic fluctuations, Part I avoids technicalities and sets out the broad range of physical systems where the concept of ratchets is relevant. Part II supplies a single source for a complete and modern theoretical analysis of ratchets in regimes such as classical vs quantum and stochastic vs deterministic, and in Part III readers are guided through experimental developments in different physical systems, each highlighting a specific unique feature of ratchets. The thorough and systematic approach to the topic ensures that this book provides a complete guide to Brownian ratchets for newcomers and established researchers in physics, biology and biochemistry.

  5. Ordered phase and non-equilibrium fluctuation in stock market

    Science.gov (United States)

    Maskawa, Jun-ichi

    2002-08-01

    We analyze the statistics of daily price changes of the stock market in the framework of a statistical physics model for the collective fluctuation of a stock portfolio. In this model the time series of price changes are coded into sequences of up and down spins, and the Hamiltonian of the system is expressed by spin-spin interactions, as in spin glass models of disordered magnetic systems. Through the analysis of the Dow Jones industrial portfolio consisting of 30 stock issues, we find a non-equilibrium fluctuation mode at a point slightly below the boundary between the ordered and disordered phases. The remaining 29 modes are still in the disordered phase and well described by the Gibbs distribution. The variance of the fluctuation is outlined by the theoretical curve and is peculiarly large in the non-equilibrium mode compared with those in the other modes remaining in the disordered phase.
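The first step of such an analysis, coding price changes into spins and measuring spin-spin correlations, can be sketched on toy data. The two synthetic price series driven by a common factor are an assumption for illustration, not the Dow Jones data used in the paper.

```python
import random

def to_spins(prices):
    """Code daily price changes as up/down spins (+1 / -1); flat days omitted."""
    return [1 if b > a else -1 for a, b in zip(prices, prices[1:]) if b != a]

def spin_correlation(s1, s2):
    """Equal-time spin-spin correlation <s_i s_j> - <s_i><s_j>."""
    n = min(len(s1), len(s2))
    m1 = sum(s1[:n]) / n
    m2 = sum(s2[:n]) / n
    return sum(a * b for a, b in zip(s1, s2)) / n - m1 * m2

random.seed(1)
# Two toy price series driven by a common factor, standing in for portfolio stocks.
common = [random.gauss(0, 1) for _ in range(500)]
p1, p2 = [100.0], [100.0]
for c in common:
    p1.append(p1[-1] + c + random.gauss(0, 1))
    p2.append(p2[-1] + c + random.gauss(0, 1))

print(spin_correlation(to_spins(p1), to_spins(p2)))  # positive: shared driver
```

In the paper these pairwise correlations feed the inference of the spin-glass couplings; the sketch stops at the correlation matrix entries themselves.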

  6. Statistical Modelling of Synaptic Vesicles Distribution and Analysing their Physical Characteristics

    DEFF Research Database (Denmark)

    Khanmohammadi, Mahdieh

    This Ph.D. thesis deals with mathematical and statistical modeling of synaptic vesicle distribution, shape, orientation and interactions. The first major part of this thesis treats the problem of determining the effect of stress on synaptic vesicle distribution and interactions. Serial section transmission electron microscopy is used to acquire images from two experimental groups of rats: 1) rats subjected to a behavioral model of stress and 2) rats subjected to sham stress as the control group. The synaptic vesicle distribution and interactions are modeled by employing a point process approach based on differences of statistical measures within a section and the same measures between sections. Three-dimensional (3D) datasets are reconstructed by using image registration techniques and estimated thicknesses. We distinguish the effect of stress by estimating the synaptic vesicle densities and modeling...

  7. WE-E-201-01: Use and Abuse of Common Statistics in Radiological Physics

    International Nuclear Information System (INIS)

    Labby, Z.

    2015-01-01

    Physicists are often expected to have a solid grounding in experimental design and statistical analysis, sometimes filling in when biostatisticians or other experts are not available for consultation. Unfortunately, graduate education on these topics is seldom emphasized and few opportunities for continuing education exist. Clinical physicists incorporate new technology and methods into their practice based on published literature. A poor understanding of experimental design and analysis could result in inappropriate use of new techniques. Clinical physicists also improve current practice through quality initiatives that require sound experimental design and analysis. Academic physicists with a poor understanding of design and analysis may produce ambiguous (or misleading) results. This can result in unnecessary rewrites, publication rejection, and experimental redesign (wasting time, money, and effort). This symposium will provide a practical review of error and uncertainty, common study designs, and statistical tests. Instruction will primarily focus on practical implementation through examples and answer questions such as: where would you typically apply the test/design and where is the test/design typically misapplied (i.e., common pitfalls)? An analysis of error and uncertainty will also be explored using biological studies and associated modeling as a specific use case. Learning Objectives: Understand common experimental testing and clinical trial designs, what questions they can answer, and how to interpret the results. Determine where specific statistical tests are appropriate and identify common pitfalls. Understand how uncertainty and error are addressed in biological testing and associated biological modeling.

  8. WE-E-201-01: Use and Abuse of Common Statistics in Radiological Physics

    Energy Technology Data Exchange (ETDEWEB)

    Labby, Z. [University of Wisconsin (United States)

    2015-06-15

    Physicists are often expected to have a solid grounding in experimental design and statistical analysis, sometimes filling in when biostatisticians or other experts are not available for consultation. Unfortunately, graduate education on these topics is seldom emphasized and few opportunities for continuing education exist. Clinical physicists incorporate new technology and methods into their practice based on published literature. A poor understanding of experimental design and analysis could result in inappropriate use of new techniques. Clinical physicists also improve current practice through quality initiatives that require sound experimental design and analysis. Academic physicists with a poor understanding of design and analysis may produce ambiguous (or misleading) results. This can result in unnecessary rewrites, publication rejection, and experimental redesign (wasting time, money, and effort). This symposium will provide a practical review of error and uncertainty, common study designs, and statistical tests. Instruction will primarily focus on practical implementation through examples and answer questions such as: where would you typically apply the test/design and where is the test/design typically misapplied (i.e., common pitfalls)? An analysis of error and uncertainty will also be explored using biological studies and associated modeling as a specific use case. Learning Objectives: Understand common experimental testing and clinical trial designs, what questions they can answer, and how to interpret the results. Determine where specific statistical tests are appropriate and identify common pitfalls. Understand how uncertainty and error are addressed in biological testing and associated biological modeling.

  9. Gibbs equilibrium averages and Bogolyubov measure

    International Nuclear Information System (INIS)

    Sankovich, D.P.

    2011-01-01

    Application of the functional integration methods in equilibrium statistical mechanics of quantum Bose-systems is considered. We show that Gibbs equilibrium averages of Bose-operators can be represented as path integrals over a special Gauss measure defined in the corresponding space of continuous functions. We consider some problems related to integration with respect to this measure

  10. Bose and his statistics

    International Nuclear Information System (INIS)

    Venkataraman, G.

    1992-01-01

    Treating the radiation gas as a classical gas, Einstein derived Planck's law of radiation by considering the dynamic equilibrium between atoms and radiation. Dissatisfied with this treatment, S.N. Bose derived Planck's law in another, original way. He treated the problem in generality: he counted how many cells were available for the photon gas in phase space and distributed the photons into these cells. In this manner of distribution, there were three radically new ideas: the indistinguishability of particles, the spin of the photon (with only two possible orientations) and the non-conservation of photon number. This gave rise to the new discipline of quantum statistical mechanics. The physics underlying Bose's discovery, its significance and its role in the development of the concept of the ideal gas, the spin-statistics theorem and spin particles are described. The book has been written in a simple and direct language, in an informal style, aiming to stimulate the curiosity of the reader. (M.G.B.)
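Bose's counting argument and the resulting Planck law can be made concrete in a few lines. The sketch below uses the standard textbook formulas (stars-and-bars counting of indistinguishable photons over cells, and the Bose-Einstein occupancy feeding the spectral energy density); it is an illustration of the physics described above, not code from the book.

```python
import math

def bose_microstates(n_photons, g_cells):
    """Bose counting: number of ways to distribute indistinguishable
    photons over g cells, C(n + g - 1, n)."""
    return math.comb(n_photons + g_cells - 1, n_photons)

def planck_u(nu, T, h=6.62607015e-34, k=1.380649e-23, c=2.99792458e8):
    """Planck spectral energy density u(nu, T) built from the
    Bose-Einstein mean occupancy of a mode at frequency nu."""
    occupancy = 1.0 / (math.exp(h * nu / (k * T)) - 1.0)  # mean photons per mode
    modes = 8.0 * math.pi * nu ** 2 / c ** 3              # mode density per Hz
    return modes * h * nu * occupancy

print(bose_microstates(2, 2))  # 3 ways: both in cell 1, split, both in cell 2
print(planck_u(5e14, 5800.0))  # visible light at roughly the solar temperature
```

At low frequencies (h nu much smaller than kT) the expression reduces to the classical Rayleigh-Jeans form, which is a quick sanity check on the implementation.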

  11. Applications of statistical physics and information theory to the analysis of DNA sequences

    Science.gov (United States)

    Grosse, Ivo

    2000-10-01

    DNA carries the genetic information of most living organisms, and the goal of genome projects is to uncover that genetic information. One basic task in the analysis of DNA sequences is the recognition of protein coding genes. Powerful computer programs for gene recognition have been developed, but most of them are based on statistical patterns that vary from species to species. In this thesis I address the question of whether there exist universal statistical patterns that are different in coding and noncoding DNA of all living species, regardless of their phylogenetic origin. In search of such species-independent patterns I study the mutual information function of genomic DNA sequences, and find that it shows persistent period-three oscillations. To understand the biological origin of the observed period-three oscillations, I compare the mutual information function of genomic DNA sequences to the mutual information function of stochastic model sequences. I find that the pseudo-exon model is able to reproduce the mutual information function of genomic DNA sequences. Moreover, I find that a generalization of the pseudo-exon model can connect the existence and the functional form of long-range correlations to the presence and the length distributions of coding and noncoding regions. Based on these theoretical studies I am able to find an information-theoretical quantity, the average mutual information (AMI), whose probability distributions are significantly different in coding and noncoding DNA, while they are almost identical in all studied species. These findings show that there exist universal statistical patterns that are different in coding and noncoding DNA of all studied species, and they suggest that the AMI may be used to identify genes in different living species, irrespective of their taxonomic origin.
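The central quantity here, the mutual information function of a symbolic sequence at distance d, is straightforward to estimate. The sketch below computes it for a toy sequence whose symbol bias depends on the position modulo three (a stand-in for codon structure, assumed for illustration, not the thesis's pseudo-exon model); a period-three pattern in I(d) emerges from the pooled statistics.

```python
import math, random
from collections import Counter

def mutual_information(seq, d):
    """Mutual information (bits) between symbols separated by distance d."""
    pairs = list(zip(seq, seq[d:]))
    n = len(pairs)
    joint = Counter(pairs)
    left = Counter(a for a, _ in pairs)
    right = Counter(b for _, b in pairs)
    return sum((c / n) * math.log2(c * n / (left[a] * right[b]))
               for (a, b), c in joint.items())

random.seed(2)
# Toy sequence with a period-three bias: each "codon position" favours its own
# symbol ordering (an assumption for illustration, not a real gene model).
tables = [("A", "T", "G", "C"), ("G", "C", "A", "T"), ("C", "G", "T", "A")]
seq = "".join(random.choices(tables[i % 3], weights=[4, 3, 2, 1])[0]
              for i in range(3000))
print([round(mutual_information(seq, d), 4) for d in range(1, 7)])
```

For genomic work one would average I(d) over windows and compare coding against noncoding regions, which is essentially what the AMI quantity summarizes.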

  12. A generalization of random matrix theory and its application to statistical physics.

    Science.gov (United States)

    Wang, Duan; Zhang, Xin; Horvatic, Davor; Podobnik, Boris; Eugene Stanley, H

    2017-02-01

    To study the statistical structure of cross-correlations in empirical data, we generalize random matrix theory and propose a new method of cross-correlation analysis, known as autoregressive random matrix theory (ARRMT). ARRMT takes into account the influence of auto-correlations in the study of cross-correlations in multiple time series. We first analytically and numerically determine how auto-correlations affect the eigenvalue distribution of the correlation matrix. Then we introduce ARRMT with a detailed procedure of how to implement the method. Finally, we illustrate the method using two examples, inflation rates and air pressure data for 95 US cities.
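The problem ARRMT addresses, auto-correlation inflating spurious cross-correlations between independent series, is easy to demonstrate. The sketch below compares the spread of sample cross-correlations for pairs of independent white-noise series against pairs of independent AR(1) series; it illustrates the motivating effect only and is not an implementation of ARRMT itself.

```python
import random
import statistics

def ar1(n, phi, rng):
    """AR(1) series x_t = phi * x_{t-1} + Gaussian noise."""
    x = [rng.gauss(0, 1)]
    for _ in range(n - 1):
        x.append(phi * x[-1] + rng.gauss(0, 1))
    return x

def corr(x, y):
    """Sample Pearson correlation of two equal-length series."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

rng = random.Random(3)
n, trials = 200, 300
# Spread of correlations between *independent* pairs: white noise vs AR(1).
spread_iid = statistics.stdev(corr(ar1(n, 0.0, rng), ar1(n, 0.0, rng))
                              for _ in range(trials))
spread_ar = statistics.stdev(corr(ar1(n, 0.9, rng), ar1(n, 0.9, rng))
                             for _ in range(trials))
print(spread_iid, spread_ar)  # the AR(1) spread is markedly larger
```

A naive random-matrix null calibrated on i.i.d. data would therefore flag many AR(1) correlations as significant; correcting the null for this is the point of ARRMT.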

  13. Understanding Thermal Equilibrium through Activities

    Science.gov (United States)

    Pathare, Shirish; Huli, Saurabhee; Nachane, Madhura; Ladage, Savita; Pradhan, Hemachandra

    2015-01-01

    Thermal equilibrium is a basic concept in thermodynamics. In India, this concept is generally introduced at the first year of undergraduate education in physics and chemistry. In our earlier studies (Pathare and Pradhan 2011 "Proc. episteme-4 Int. Conf. to Review Research on Science Technology and Mathematics Education" pp 169-72) we…

  14. The Statistical Segment Length of DNA: Opportunities for Biomechanical Modeling in Polymer Physics and Next-Generation Genomics.

    Science.gov (United States)

    Dorfman, Kevin D

    2018-02-01

    The development of bright bisintercalating dyes for deoxyribonucleic acid (DNA) in the 1990s, most notably YOYO-1, revolutionized the field of polymer physics in the ensuing years. These dyes, in conjunction with modern molecular biology techniques, permit the facile observation of polymer dynamics via fluorescence microscopy and thus direct tests of different theories of polymer dynamics. At the same time, they have played a key role in advancing an emerging next-generation method known as genome mapping in nanochannels. The effect of intercalation on the bending energy of DNA as embodied by a change in its statistical segment length (or, alternatively, its persistence length) has been the subject of significant controversy. The precise value of the statistical segment length is critical for the proper interpretation of polymer physics experiments and controls the phenomena underlying the aforementioned genomics technology. In this perspective, we briefly review the model of DNA as a wormlike chain and a trio of methods (light scattering, optical or magnetic tweezers, and atomic force microscopy (AFM)) that have been used to determine the statistical segment length of DNA. We then outline the disagreement in the literature over the role of bisintercalation on the bending energy of DNA, and how a multiscale biomechanical approach could provide an important model for this scientifically and technologically relevant problem.

  15. Pramana – Journal of Physics | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    pp 785-801 Invited Talks:- Topic 1. Rigorous results and exact solutions; general aspects of statistical physics; thermodynamics. Classical charged fluids at equilibrium near an interface: Exact analytical density profiles and surface tension · Françoise Cornu · More Details Abstract Fulltext PDF. The structure of equilibrium ...

  16. Statistical Analysis Methods for Physics Models Verification and Validation

    CERN Document Server

    De Luca, Silvia

    2017-01-01

    The validation and verification process is a fundamental step for any software like Geant4 and GeantV, which aims to perform data simulation using physics models and Monte Carlo techniques. As experimental physicists, we face the problem of comparing the results obtained from simulations with what the experiments actually observed. One way to address this is to perform a consistency test. Within the Geant group, we developed a compact C++ library which will be added to the automated validation process on the Geant Validation Portal.

  17. Non-equilibrium dog-flea model

    Science.gov (United States)

    Ackerson, Bruce J.

    2017-11-01

    We develop the open dog-flea model to serve as a check of proposed non-equilibrium theories of statistical mechanics. The model is developed in detail. Then it is applied to four recent models for non-equilibrium statistical mechanics. Comparison of the dog-flea solution with these different models allows checking claims and giving a concrete example of the theoretical models.
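As a concrete reference point, the classical (closed) Ehrenfest dog-flea dynamics can be simulated in a few lines: at each step a randomly chosen flea jumps to the other dog, and the occupation relaxes toward N/2 with binomial fluctuations. This sketch shows the closed variant only; the open model developed in the paper differs in its details.

```python
import random

def dog_flea(n_fleas, steps, rng):
    """Closed Ehrenfest dog-flea model: each step one flea, chosen uniformly
    at random, jumps to the other dog. Returns the trajectory of dog A's count."""
    on_a = n_fleas                      # start with all fleas on dog A
    traj = [on_a]
    for _ in range(steps):
        if rng.random() < on_a / n_fleas:
            on_a -= 1                   # the chosen flea was on dog A
        else:
            on_a += 1                   # the chosen flea was on dog B
        traj.append(on_a)
    return traj

rng = random.Random(6)
traj = dog_flea(100, 20000, rng)
late = traj[5000:]                      # discard the relaxation transient
print(traj[0], sum(late) / len(late))   # starts at N, settles near N/2
```

Comparing trajectories like this one against a proposed non-equilibrium theory's predictions is exactly the kind of check the abstract describes.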

  18. A Statistical Study of Socio-economic and Physical Risk Factors of Myocardial Infarction

    Directory of Open Access Journals (Sweden)

    M. Alamgir

    2005-07-01

    Full Text Available A sample of 506 patients from various hospitals in Peshawar was examined to determine significant socio-economic and physical risk factors of Myocardial Infarction (heart attack). The factors examined were smoking (S), hypertension (H), cholesterol (C), diabetes (D), family history (F), residence (R), own a house (OH), number of dependents (ND), household income (I), obesity and lack of exercise (E). The response variable MI was binary. Therefore, logistic regression was applied (using the GLIM and SPSS packages) to analyze the data and to select a parsimonious model. Logistic regression models have been obtained indicating significant risk factors for both sexes together, for males and for females separately. The best-selected model for both sexes is of factors S, F, D, H and C. The best-selected model for males is of factors CIFH, S, H, D, C and F, while the best-selected model for females is of factors D, H, C and F.
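The modelling step, logistic regression of a binary outcome on binary risk factors, can be sketched from scratch. The data below are fabricated toy risk factors (smoking, hypertension, diabetes), not the 506-patient Peshawar sample, and the fitting routine is a generic gradient-descent sketch rather than the GLIM/SPSS procedures the authors used.

```python
import math
import random

def fit_logistic(X, y, lr=0.1, epochs=500):
    """Logistic regression fitted by plain stochastic gradient descent."""
    w = [0.0] * (len(X[0]) + 1)          # intercept + one weight per factor
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = yi - p
            w[0] += lr * err
            for j, xj in enumerate(xi):
                w[j + 1] += lr * err * xj
    return w

def predict(w, xi):
    """Predicted probability of the outcome for one factor vector."""
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

rng = random.Random(4)
# Fabricated binary factors; the outcome depends mainly on the first two.
X = [[float(rng.random() < 0.5), float(rng.random() < 0.4),
      float(rng.random() < 0.3)] for _ in range(400)]
y = [1 if rng.random() < 0.15 + 0.5 * s + 0.2 * h else 0 for s, h, _ in X]
w = fit_logistic(X, y)
print([round(v, 2) for v in w])  # intercept, then one log-odds coefficient per factor
```

Model selection as in the paper would then compare nested fits (e.g. by deviance) to arrive at a parsimonious subset of factors.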

  19. Implementation of statistical analysis methods for medical physics data; Implementacao de metodos de analise estatistica para dados de fisica medica

    Energy Technology Data Exchange (ETDEWEB)

    Teixeira, Marilia S.; Pinto, Nivia G.P.; Barroso, Regina C.; Oliveira, Luis F., E-mail: mariliasilvat@gmail.co, E-mail: lfolive@oi.com.b, E-mail: cely_barroso@hotmail.co, E-mail: nitatag@gmail.co [Universidade do Estado do Rio de Janeiro (UERJ), Rio de Janeiro, RJ (Brazil). Inst. de Fisica

    2009-07-01

    The objective of biomedical research with radiation of different natures is to contribute to the understanding of the basic physics and biochemistry of biological systems, disease diagnosis and the development of therapeutic techniques. The main benefits are the cure of tumors through therapy, the early detection of diseases through diagnostics, the use as a prophylactic means for blood transfusion, etc. Therefore, a better understanding of the biological interactions occurring after exposure to radiation is necessary for the optimization of therapeutic procedures and of strategies for the reduction of radioinduced effects. The applied physics group of the Physics Institute of UERJ has been working on the characterization of biological samples (human tissues, teeth, saliva, soil, plants, sediments, air, water, organic matrices, ceramics, fossil material, among others) using X-ray diffraction and X-ray fluorescence. The application of these techniques to the measurement, analysis and interpretation of the characteristics of biological tissues is attracting considerable interest in medical and environmental physics. All quantitative data analysis must begin with the calculation of descriptive statistics (means and standard deviations) in order to obtain a preliminary notion of what the analysis will reveal. It is well known that the high values of standard deviation found in experimental measurements of biological samples can be attributed to biological factors, due to the specific characteristics of each individual (age, gender, environment, alimentary habits, etc.). The main objective of this work is the development of a program applying specific statistical methods for the optimization of experimental data analysis. Since the specialized programs for this analysis are proprietary, another objective of this work is the implementation of a code which is free and can be shared with other research groups.
As the program developed since the

  20. Equilibrium Droplets on Deformable Substrates: Equilibrium Conditions.

    Science.gov (United States)

    Koursari, Nektaria; Ahmed, Gulraiz; Starov, Victor M

    2018-05-15

    Equilibrium conditions of droplets on deformable substrates are investigated, and it is proven using Jacobi's sufficient condition that the obtained solutions really provide equilibrium profiles of both the droplet and the deformed support. At equilibrium, the excess free energy of the system should have a minimum value, which means that both necessary and sufficient conditions of the minimum should be fulfilled. Only in this case do the obtained profiles provide the minimum of the excess free energy. The necessary condition of equilibrium means that the first variation of the excess free energy should vanish and the second variation should be positive. Unfortunately, these two conditions alone are not proof that the obtained profiles correspond to the minimum of the excess free energy, and they cannot be. It is necessary to check whether the sufficient condition of equilibrium (Jacobi's condition) is satisfied. To the best of our knowledge, Jacobi's condition has never been verified for any previously published equilibrium profiles of both the droplet and the deformable substrate. A simple model of an equilibrium droplet on a deformable substrate is considered, and it is shown that the deduced profiles of the equilibrium droplet and deformable substrate satisfy Jacobi's condition, that is, they really provide the minimum of the excess free energy of the system. To simplify calculations, a simplified linear disjoining/conjoining pressure isotherm is adopted. It is shown that both necessary and sufficient conditions for equilibrium are satisfied. For the first time, the validity of Jacobi's condition is verified. The latter proves that the developed model really provides (i) the minimum of the excess free energy of the system droplet/deformable substrate and (ii) equilibrium profiles of both the droplet and the deformable substrate.
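The hierarchy of conditions the abstract invokes can be written out for a generic one-dimensional free-energy functional. The notation below is a standard calculus-of-variations sketch, assumed for illustration; it is not the specific functional used in the paper.

```latex
% Generic excess free energy of a profile h(x):
F[h] = \int_{x_0}^{x_1} f\bigl(x, h(x), h'(x)\bigr)\, dx .

% Necessary condition (first variation vanishes): the Euler--Lagrange equation
\frac{\partial f}{\partial h} - \frac{d}{dx}\,\frac{\partial f}{\partial h'} = 0 .

% Legendre condition (second variation non-negative):
\frac{\partial^2 f}{\partial h'^2} \ge 0 .

% Jacobi's sufficient condition: the solution u(x) of the accessory equation
\frac{d}{dx}\!\left(\frac{\partial^2 f}{\partial h'^2}\, u'\right)
  - \left(\frac{\partial^2 f}{\partial h^2}
  - \frac{d}{dx}\,\frac{\partial^2 f}{\partial h\,\partial h'}\right) u = 0 ,
\qquad u(x_0) = 0,\quad u'(x_0) \neq 0 ,
% must have no further zero (no conjugate point) in the interval (x_0, x_1].
```

Verifying the absence of conjugate points is the step the authors argue had been missing from earlier published droplet/substrate profiles.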

  1. Statistical mechanics

    CERN Document Server

    Davidson, Norman

    2003-01-01

    Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody

  2. Statistical Modelling of Global Tectonic Activity and some Physical Consequences of its Results

    Directory of Open Access Journals (Sweden)

    Konstantin Statnikov

    2015-02-01

    Full Text Available Based on an analysis of the global earthquake data bank for the last thirty years, a global tectonic activity indicator was proposed, comprising a weekly, globally averaged mean earthquake magnitude value. It was shown that 84% of the indicator variability is a harmonic oscillation with a fundamental period of 37.2 years, twice the maximum period in the tidal oscillation spectrum (18.6 years). From this observation, a conclusion was drawn that parametric resonance (PR) exists between global tectonic activity and low-frequency tides. The conclusion was also confirmed by the existence of a statistically significant PR response at the second-lowest tidal frequency, i.e. 182.6 days. It was shown that the global earthquake flow, with a determination factor of 93%, is a sum of two Gaussian streams, nearly equally intense, with mean values of 23 and 83 events per week and standard deviations of 9 and 30 events per week, respectively. The ratios of the Earth's periphery to the mean time interval between earthquakes in the first and second flow modes described above match, by order of magnitude, the sound velocity in fluid (~1500 m/s) and in elastic medium (5500 m/s).

  3. Neocortical dynamics at multiple scales: EEG standing waves, statistical mechanics, and physical analogs.

    Science.gov (United States)

    Ingber, Lester; Nunez, Paul L

    2011-02-01

    The dynamic behavior of scalp potentials (EEG) is apparently due to some combination of global and local processes with important top-down and bottom-up interactions across spatial scales. In treating global mechanisms, we stress the importance of myelinated axon propagation delays and periodic boundary conditions in the cortical-white matter system, which is topologically close to a spherical shell. By contrast, the proposed local mechanisms are multiscale interactions between cortical columns via short-ranged non-myelinated fibers. A mechanical model consisting of a stretched string with attached nonlinear springs demonstrates the general idea. The string produces standing waves analogous to large-scale coherent EEG observed in some brain states. The attached springs are analogous to the smaller (mesoscopic) scale columnar dynamics. Generally, we expect string displacement and EEG at all scales to result from both global and local phenomena. A statistical mechanics of neocortical interactions (SMNI) calculates oscillatory behavior consistent with typical EEG, within columns, between neighboring columns via short-ranged non-myelinated fibers, across cortical regions via myelinated fibers, and also derives a string equation consistent with the global EEG model. Copyright © 2010 Elsevier Inc. All rights reserved.

  4. Blessing of dimensionality: mathematical foundations of the statistical physics of data.

    Science.gov (United States)

    Gorban, A N; Tyukin, I Y

    2018-04-28

    The concentrations of measure phenomena were discovered as the mathematical background to statistical mechanics at the end of the nineteenth/beginning of the twentieth century and have been explored in mathematics ever since. At the beginning of the twenty-first century, it became clear that the proper utilization of these phenomena in machine learning might transform the curse of dimensionality into the blessing of dimensionality. This paper summarizes recently discovered phenomena of measure concentration which drastically simplify some machine learning problems in high dimension, and allow us to correct legacy artificial intelligence systems. The classical concentration of measure theorems state that i.i.d. random points are concentrated in a thin layer near a surface (a sphere or equators of a sphere, an average or median-level set of energy or another Lipschitz function, etc.). The new stochastic separation theorems describe the thin structure of these thin layers: the random points are not only concentrated in a thin layer but are all linearly separable from the rest of the set, even for exponentially large random sets. The linear functionals for separation of points can be selected in the form of the linear Fisher's discriminant. All artificial intelligence systems make errors. Non-destructive correction requires separation of the situations (samples) with errors from the samples corresponding to correct behaviour by a simple and robust classifier. The stochastic separation theorems provide us with such classifiers and determine a non-iterative (one-shot) procedure for their construction. This article is part of the theme issue 'Hilbert's sixth problem'. © 2018 The Author(s).
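The stochastic separation effect is easy to observe numerically: in high dimension, almost every point of a random Gaussian cloud can be cut off from all the others by a single linear functional. For zero-mean data with near-identity covariance, Fisher's discriminant reduces to the inner product with the test point, which is what this sketch uses; the dimensions, sample sizes and margin are assumptions for illustration.

```python
import random

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def separable(points, i, margin=0.5):
    """Is point i linearly cut off from all the others by the functional
    l(y) = <y, x_i> / <x_i, x_i>?  (l(x_i) = 1, so any margin < 1 separates.)"""
    x = points[i]
    n2 = dot(x, x)
    return all(dot(points[j], x) / n2 < margin
               for j in range(len(points)) if j != i)

rng = random.Random(5)
dim, n = 100, 200
# i.i.d. Gaussian cloud: concentration makes cross inner products small.
pts = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n)]
frac = sum(separable(pts, i) for i in range(n)) / n
print(frac)  # close to 1.0: almost every point is separable from the rest
```

In low dimension (try dim = 2) the same fraction drops sharply, which is the curse-to-blessing transition the paper formalizes.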

  5. Blessing of dimensionality: mathematical foundations of the statistical physics of data

    Science.gov (United States)

    Gorban, A. N.; Tyukin, I. Y.

    2018-04-01

    The concentrations of measure phenomena were discovered as the mathematical background to statistical mechanics at the end of the nineteenth/beginning of the twentieth century and have been explored in mathematics ever since. At the beginning of the twenty-first century, it became clear that the proper utilization of these phenomena in machine learning might transform the curse of dimensionality into the blessing of dimensionality. This paper summarizes recently discovered phenomena of measure concentration which drastically simplify some machine learning problems in high dimension, and allow us to correct legacy artificial intelligence systems. The classical concentration of measure theorems state that i.i.d. random points are concentrated in a thin layer near a surface (a sphere or equators of a sphere, an average or median-level set of energy or another Lipschitz function, etc.). The new stochastic separation theorems describe the thin structure of these thin layers: the random points are not only concentrated in a thin layer but are all linearly separable from the rest of the set, even for exponentially large random sets. The linear functionals for separation of points can be selected in the form of the linear Fisher's discriminant. All artificial intelligence systems make errors. Non-destructive correction requires separation of the situations (samples) with errors from the samples corresponding to correct behaviour by a simple and robust classifier. The stochastic separation theorems provide us with such classifiers and determine a non-iterative (one-shot) procedure for their construction. This article is part of the theme issue `Hilbert's sixth problem'.
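
    The stochastic separation phenomenon is easy to check numerically. The sketch below (an illustration, not the authors' construction) draws i.i.d. Gaussian points in dimension 200 and tests whether each point is cut off from all the others by the simple inner-product hyperplane {z : <x, z> = <x, x>/2}, a special case of the Fisher-type discriminants discussed in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 200, 1000                      # dimension and number of random points
X = rng.standard_normal((m, d))

def separable(i):
    """Is point i linearly separable from all other points by the
    hyperplane <x, z> = <x, x> / 2 through the midpoint of x?"""
    x = X[i]
    others = np.delete(X, i, axis=0)
    return np.all(others @ x <= 0.5 * (x @ x))

frac = np.mean([separable(i) for i in range(m)])   # fraction of separable points
```

    In high dimension essentially every point passes the test, even though the sample is large; in low dimension (try d = 2) the fraction collapses.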

  6. Thermodynamic theory of equilibrium fluctuations

    International Nuclear Information System (INIS)

    Mishin, Y.

    2015-01-01

    The postulational basis of classical thermodynamics has been expanded to incorporate equilibrium fluctuations. The main additional elements of the proposed thermodynamic theory are the concept of quasi-equilibrium states, a definition of non-equilibrium entropy, a fundamental equation of state in the entropy representation, and a fluctuation postulate describing the probability distribution of macroscopic parameters of an isolated system. Although these elements introduce a statistical component that does not exist in classical thermodynamics, the logical structure of the theory is different from that of statistical mechanics and represents an expanded version of thermodynamics. Based on this theory, we present a regular procedure for calculations of equilibrium fluctuations of extensive parameters, intensive parameters and densities in systems with any number of fluctuating parameters. The proposed fluctuation formalism is demonstrated by four applications: (1) derivation of the complete set of fluctuation relations for a simple fluid in three different ensembles; (2) fluctuations in finite-reservoir systems interpolating between the canonical and micro-canonical ensembles; (3) derivation of fluctuation relations for excess properties of grain boundaries in binary solid solutions, and (4) derivation of the grain boundary width distribution for pre-melted grain boundaries in alloys. The last two applications offer an efficient fluctuation-based approach to calculations of interface excess properties and extraction of the disjoining potential in pre-melted grain boundaries. Possible future extensions of the theory are outlined.

  7. Statistical homogeneity tests applied to large data sets from high energy physics experiments

    Science.gov (United States)

    Trusina, J.; Franc, J.; Kůs, V.

    2017-12-01

    Homogeneity tests are used in high energy physics to verify simulated Monte Carlo samples, i.e. to check whether they follow the same distribution as the measured data from a particle detector. The Kolmogorov-Smirnov, χ², and Anderson-Darling tests are the most widely used techniques for assessing sample homogeneity. Since MC generators produce many entries from different models, each entry has to be re-weighted to match the sample size of the measured data. One approach to homogeneity testing proceeds through binning; if we do not want to lose any information, we can instead apply generalized tests based on weighted empirical distribution functions. In this paper, we propose such generalized weighted homogeneity tests and introduce some of their asymptotic properties. We present results of a numerical analysis focused on estimating the type-I error and the power of the tests. Finally, we present an application of our homogeneity tests to data from the DØ experiment at Fermilab.
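
    A minimal sketch of the weighted-ECDF idea (not the authors' test statistics, whose exact form and asymptotics are given in the paper): compare a re-weighted "Monte Carlo" sample against "measured" data through a weighted Kolmogorov-Smirnov distance.

```python
import numpy as np

def weighted_ecdf(x, w, grid):
    """Weighted empirical CDF evaluated on a grid of points."""
    order = np.argsort(x)
    x, w = x[order], w[order] / w.sum()
    cum = np.cumsum(w)
    idx = np.searchsorted(x, grid, side="right")   # weight of points <= grid value
    return np.concatenate(([0.0], cum))[idx]

def weighted_ks(x1, w1, x2, w2):
    """Sup-distance between two weighted empirical CDFs."""
    grid = np.sort(np.concatenate([x1, x2]))
    return np.max(np.abs(weighted_ecdf(x1, w1, grid) - weighted_ecdf(x2, w2, grid)))

rng = np.random.default_rng(1)
data = rng.normal(0, 1, 5000)                 # "measured" sample
mc = rng.normal(0, 1, 20000)                  # oversized MC sample
w_mc = np.full(mc.size, 5000 / 20000)         # re-weight MC to the data sample size
d_same = weighted_ks(data, np.ones(data.size), mc, w_mc)
d_shift = weighted_ks(data, np.ones(data.size), mc + 0.3, w_mc)
```

    The distance d_shift clearly exceeds d_same, flagging the shifted MC sample as inhomogeneous with the data.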

  8. Ion exchange equilibrium constants

    CERN Document Server

    Marcus, Y

    2013-01-01

    Ion Exchange Equilibrium Constants focuses on the test-compilation of equilibrium constants for ion exchange reactions. The book first underscores the scope of the compilation, equilibrium constants, symbols used, and arrangement of the table. The manuscript then presents the table of equilibrium constants, including polystyrene sulfonate cation exchanger, polyacrylate cation exchanger, polymethacrylate cation exchanger, polystyrene phosphate cation exchanger, and zirconium phosphate cation exchanger. The text highlights zirconium oxide anion exchanger, zeolite type 13Y cation exchanger, and

  9. Quantity Constrained General Equilibrium

    NARCIS (Netherlands)

    Babenko, R.; Talman, A.J.J.

    2006-01-01

    In a standard general equilibrium model it is assumed that there are no price restrictions and that prices adjust infinitely fast to their equilibrium values. In case of price restrictions a general equilibrium may not exist and rationing on net demands or supplies is needed to clear the markets. In

  10. "Secrets" of High Pressure Phase Equilibrium Experiment.

    Czech Academy of Sciences Publication Activity Database

    Wichterle, Ivan

    2005-01-01

    Roč. 54, č. 11 (2005), s. 477-479 ISSN 0022-9830 Institutional research plan: CEZ:AV0Z40720504 Keywords : vapour-liquid equilibrium * experimental work Subject RIV: CF - Physical ; Theoretical Chemistry

  11. The large deviation approach to statistical mechanics

    International Nuclear Information System (INIS)

    Touchette, Hugo

    2009-01-01

    The theory of large deviations is concerned with the exponential decay of probabilities of large fluctuations in random systems. These probabilities are important in many fields of study, including statistics, finance, and engineering, as they often yield valuable information about the large fluctuations of a random system around its most probable state or trajectory. In the context of equilibrium statistical mechanics, the theory of large deviations provides exponential-order estimates of probabilities that refine and generalize Einstein's theory of fluctuations. This review explores this and other connections between large deviation theory and statistical mechanics, in an effort to show that the mathematical language of statistical mechanics is the language of large deviation theory. The first part of the review presents the basics of large deviation theory, and works out many of its classical applications related to sums of random variables and Markov processes. The second part goes through many problems and results of statistical mechanics, and shows how these can be formulated and derived within the context of large deviation theory. The problems and results treated cover a wide range of physical systems, including equilibrium many-particle systems, noise-perturbed dynamics, nonequilibrium systems, as well as multifractals, disordered systems, and chaotic systems. This review also covers many fundamental aspects of statistical mechanics, such as the derivation of variational principles characterizing equilibrium and nonequilibrium states, the breaking of the Legendre transform for nonconcave entropies, and the characterization of nonequilibrium fluctuations through fluctuation relations.
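
    The exponential-order estimates mentioned here are visible in the simplest case. For fair-coin sample means, Cramér's theorem gives P(S_n/n >= a) ~ exp(-nI(a)) with rate function I(a) = a ln(2a) + (1-a) ln(2(1-a)); the sketch below (standard textbook material, not from this review) compares -(1/n) ln P with I(a) using exact binomial tails:

```python
import math

def log_binom_tail(n, k0):
    """log P(S_n >= k0) for S_n ~ Binomial(n, 1/2), via log-sum-exp."""
    logs = [math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            - n * math.log(2) for k in range(k0, n + 1)]
    top = max(logs)
    return top + math.log(sum(math.exp(lv - top) for lv in logs))

def rate(a):
    """Cramer rate function for fair-coin sample means."""
    return a * math.log(2 * a) + (1 - a) * math.log(2 * (1 - a))

a = 0.6
# -(1/n) log P(S_n/n >= a) for growing n; should decrease toward rate(a)
ests = [-log_binom_tail(n, math.ceil(a * n)) / n for n in (100, 1000, 4000)]
```

    The estimates approach I(0.6) ≈ 0.020 from above as n grows, the subexponential prefactor accounting for the slow O(log n / n) correction.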

  12. The large deviation approach to statistical mechanics

    Science.gov (United States)

    Touchette, Hugo

    2009-07-01

    The theory of large deviations is concerned with the exponential decay of probabilities of large fluctuations in random systems. These probabilities are important in many fields of study, including statistics, finance, and engineering, as they often yield valuable information about the large fluctuations of a random system around its most probable state or trajectory. In the context of equilibrium statistical mechanics, the theory of large deviations provides exponential-order estimates of probabilities that refine and generalize Einstein’s theory of fluctuations. This review explores this and other connections between large deviation theory and statistical mechanics, in an effort to show that the mathematical language of statistical mechanics is the language of large deviation theory. The first part of the review presents the basics of large deviation theory, and works out many of its classical applications related to sums of random variables and Markov processes. The second part goes through many problems and results of statistical mechanics, and shows how these can be formulated and derived within the context of large deviation theory. The problems and results treated cover a wide range of physical systems, including equilibrium many-particle systems, noise-perturbed dynamics, nonequilibrium systems, as well as multifractals, disordered systems, and chaotic systems. This review also covers many fundamental aspects of statistical mechanics, such as the derivation of variational principles characterizing equilibrium and nonequilibrium states, the breaking of the Legendre transform for nonconcave entropies, and the characterization of nonequilibrium fluctuations through fluctuation relations.

  13. The statistical mechanics of financial markets

    CERN Document Server

    Voit, Johannes

    2003-01-01

    From the reviews of the first edition - "Provides an excellent introduction for physicists interested in the statistical properties of financial markets. Appropriately early in the book the basic financial terms such as shorts, limit orders, puts, calls, and other terms are clearly defined. Examples, often with graphs, augment the reader’s understanding of what may be a plethora of new terms and ideas… [This is] an excellent starting point for the physicist interested in the subject. Some of the book’s strongest features are its careful definitions, its detailed examples, and the connection it establishes to physical systems." PHYSICS TODAY "This book is excellent at illustrating the similarities of financial markets with other non-equilibrium physical systems. [...] In summary, a very good book that offers more than just qualitative comparisons of physics and finance." (www.quantnotes.com) This highly-praised introductory treatment describes parallels between statistical physics and finance - both thos...

  14. Beyond the second law entropy production and non-equilibrium systems

    CERN Document Server

    Lineweaver, Charles; Niven, Robert; Regenauer-Lieb, Klaus

    2014-01-01

    The Second Law, a cornerstone of thermodynamics, governs the average direction of dissipative, non-equilibrium processes. But it says nothing about their actual rates or the probability of fluctuations about the average. This interdisciplinary book, written and peer-reviewed by international experts, presents recent advances in the search for new non-equilibrium principles beyond the Second Law, and their applications to a wide range of systems across physics, chemistry and biology. Beyond The Second Law brings together traditionally isolated areas of non-equilibrium research and highlights potentially fruitful connections between them, with entropy production playing the unifying role. Key theoretical concepts include the Maximum Entropy Production principle, the Fluctuation Theorem, and the Maximum Entropy method of statistical inference. Applications of these principles are illustrated in such diverse fields as climatology, cosmology, crystal growth morphology, Earth system science, environmental physics, ...

  15. VIII Spanish meeting on statistical physics: Proceeding of the Meeting held at Universidad Carlos III de Madrid

    International Nuclear Information System (INIS)

    Cuesta, J.A.; Sanchez, A.

    1998-01-01

    This book contains the Proceedings of ''Fisica Estadistica'97'' (FisEs'97, VIII Spanish Meeting on Statistical Physics), held at the Campus of Getafe (Madrid, Spain) of the Universidad Carlos III de Madrid on September 25 through 27, 1997. Although this is the first time the Proceedings of a Meeting in this series are published, ''Fisica Estadistica'' dates back to 1986, when about fifty Spanish scientists attended the first edition in Barcelona. That first Meeting was organized by a group of young and not so young physicists who wanted to set up a national conference of an international level and with a broader, more interdisciplinary scope than others held at that time. Their idea quickly got off the ground and, following the first edition, sequels took place every year and a half: Palma de Mallorca (1988), Badajoz (1990), Cabuenas, Asturies (1991), El Escorial, Madrid (1993), Sevilla (1994), and Zaragoza (1996)

  16. Annotations to quantum statistical mechanics

    CERN Document Server

    Kim, In-Gee

    2018-01-01

    This book is a rewritten and annotated version of Leo P. Kadanoff and Gordon Baym’s lectures that were presented in the book Quantum Statistical Mechanics: Green’s Function Methods in Equilibrium and Nonequilibrium Problems. The lectures were devoted to a discussion on the use of thermodynamic Green’s functions in describing the properties of many-particle systems. The functions provided a method for discussing finite-temperature problems with no more conceptual difficulty than ground-state problems, and the method was equally applicable to boson and fermion systems and equilibrium and nonequilibrium problems. The lectures also explained nonequilibrium statistical physics in a systematic way and contained essential concepts on statistical physics in terms of Green’s functions with sufficient and rigorous details. In-Gee Kim thoroughly studied the lectures during one of his research projects but found that the unspecialized method used to present them in the form of a book reduced their readability. He st...

  17. Non-Equilibrium Heavy Flavored Hadron Yields from Chemical Equilibrium Strangeness-Rich QGP

    OpenAIRE

    Kuznetsova, Inga; Rafelski, Johann

    2008-01-01

    The yields of heavy flavored hadrons emitted from strangeness-rich QGP are evaluated within the chemical non-equilibrium statistical hadronization model, conserving strangeness, charm, and entropy yields at hadronization.

  18. Practical Statistics

    CERN Document Server

    Lyons, L.

    2016-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  19. Theoretical physics vol. 2. Quantum mechanics, relativistic quantum mechanics, quantum field theory, elementar-particle theory, thermodynamics and statistics

    International Nuclear Information System (INIS)

    Rebhan, E.

    2005-01-01

    The present second volume treats quantum mechanics, relativistic quantum mechanics, the foundations of quantum field and elementary particle theory, as well as thermodynamics and statistics. Together the two volumes cover all the fields usually offered in a course on theoretical physics. In each field a careful introduction to the basic natural laws forms the starting point, with a thorough analysis of which of them rest on empirical evidence, which are logically deducible, and what role basic definitions play. Extending the scope of the corresponding courses, an introduction to elementary particles is developed from relativistic quantum theory. All problems are studied so thoroughly and extensively that each step can be reproduced individually. Much care is taken over motivation and comprehensibility. The mixing of mathematical difficulties with problems of a physical nature, which often obstructs learning, is circumvented by presenting important mathematical methods in chapters of their own (for instance, Hilbert spaces and Lie groups). The material is deepened and practiced by means of many examples and problems, for a large part with solutions. Developments that are important but seem dispensable on a first approach are pursued in excurses. This book grew out of courses the author held at the Heinrich-Heine University in Duesseldorf and was fitted to the requirements of the students over many iterations. It is conceived in such a way that it also remains suitable after the study as a reference work or for refreshing one's knowledge.

  20. Statistical Physics of Adaptation

    Science.gov (United States)

    2016-08-23

    Massachusetts Institute of Technology, Floor 6, 400 Tech Square, Cambridge, Massachusetts 02139, USA (Received 23 December 2014; revised manuscript ...) ... population composed of two types of exponentially growing self-replicators—we illustrate a simple relationship between outcome-likelihood and ... flux that does apply in a very broad class of driven systems, thus illustrating, more generally, what role dissipative history plays in determining the ...

  1. Information-theoretic equilibrium and observable thermalization

    Science.gov (United States)

    Anzà, F.; Vedral, V.

    2017-03-01

    A crucial point in statistical mechanics is the definition of the notion of thermal equilibrium, which can be given as the state that maximises the von Neumann entropy, under the validity of some constraints. Arguing that such a notion can never be experimentally probed, in this paper we propose a new notion of thermal equilibrium, focused on observables rather than on the full state of the quantum system. We characterise such a notion of thermal equilibrium for an arbitrary observable via the maximisation of its Shannon entropy and we bring to light the thermal properties that it heralds. The relation with Gibbs ensembles is studied and understood. We apply such a notion of equilibrium to a closed quantum system and show that there is always a class of observables which exhibits thermal equilibrium properties and we give a recipe to explicitly construct them. Finally, an intimate connection with the Eigenstate Thermalisation Hypothesis is brought to light.
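
    Maximising the Shannon entropy of an observable under a mean-value constraint yields exponential (Gibbs-like) weights on its outcomes, with the Lagrange multiplier fixed by the constraint. A numerical sketch of that maximisation (illustrative, not the paper's construction):

```python
import numpy as np

def maxent_dist(vals, mean_target):
    """Max-Shannon-entropy distribution over outcomes `vals` with a fixed mean:
    p_k proportional to exp(-beta * a_k), with beta found by bisection."""
    vals = np.asarray(vals, float)

    def mean_of(beta):
        w = np.exp(-beta * (vals - vals.mean()))   # shift for numerical stability
        p = w / w.sum()
        return float(p @ vals)

    lo, hi = -50.0, 50.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_of(mid) > mean_target:
            lo = mid                                # mean too high: need larger beta
        else:
            hi = mid
    w = np.exp(-lo * (vals - vals.mean()))
    return w / w.sum()

# observable with outcomes 0..3 constrained to mean 1.0: Gibbs-like weights
p = maxent_dist([0.0, 1.0, 2.0, 3.0], 1.0)
```

    Because the target mean lies below the uniform mean, the multiplier is positive and the weights decrease monotonically with the outcome value.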

  2. Physics-based and statistical earthquake forecasting in a continental rift zone: the case study of Corinth Gulf (Greece)

    Science.gov (United States)

    Segou, Margarita

    2016-01-01

    I perform a retrospective forecast experiment in the most rapid extensive continental rift worldwide, the western Corinth Gulf (wCG, Greece), aiming to predict shallow seismicity (depth statistics, four physics-based (CRS) models, combining static stress change estimations and the rate-and-state laboratory law and one hybrid model. For the latter models, I incorporate the stress changes imparted from 31 earthquakes with magnitude M ≥ 4.5 at the extended area of wCG. Special attention is given on the 3-D representation of active faults, acting as potential receiver planes for the estimation of static stress changes. I use reference seismicity between 1990 and 1995, corresponding to the learning phase of physics-based models, and I evaluate the forecasts for six months following the 1995 M = 6.4 Aigio earthquake using log-likelihood performance metrics. For the ETAS realizations, I use seismic events with magnitude M ≥ 2.5 within daily update intervals to enhance their predictive power. For assessing the role of background seismicity, I implement a stochastic reconstruction (aka declustering) aiming to answer whether M > 4.5 earthquakes correspond to spontaneous events and identify, if possible, different triggering characteristics between aftershock sequences and swarm-type seismicity periods. I find that: (1) ETAS models outperform CRS models in most time intervals achieving very low rejection ratio RN = 6 per cent, when I test their efficiency to forecast the total number of events inside the study area, (2) the best rejection ratio for CRS models reaches RN = 17 per cent, when I use varying target depths and receiver plane geometry, (3) 75 per cent of the 1995 Aigio aftershocks that occurred within the first month can be explained by static stress changes, (4) highly variable performance on behalf of both statistical and physical models is suggested by large confidence intervals of information gain per earthquake and (5) generic ETAS models can adequately
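
    The ETAS models referenced above are built on a standard temporal conditional-intensity form, λ(t) = μ + Σ_{t_i<t} K exp(α(M_i − M_0)) (t − t_i + c)^(−p). The sketch below evaluates this intensity for a toy catalog; the parameter values are illustrative placeholders, not those fitted in the study:

```python
import numpy as np

def etas_intensity(t, events, mu=0.2, K=0.05, alpha=1.5, c=0.01, p=1.1, m0=2.5):
    """Temporal ETAS conditional intensity (events/day): background rate mu
    plus Omori-Utsu aftershock contributions from each past event."""
    lam = np.full_like(t, mu, dtype=float)
    for ti, mi in events:
        mask = t > ti
        lam[mask] += K * np.exp(alpha * (mi - m0)) * (t[mask] - ti + c) ** (-p)
    return lam

events = [(1.0, 4.8), (1.3, 3.2), (5.0, 6.4)]   # (time in days, magnitude)
t = np.linspace(0.0, 10.0, 2001)
lam = etas_intensity(t, events)
```

    Before any event the intensity equals the background rate; immediately after the M 6.4 shock it jumps by orders of magnitude and then decays as a power law, which is why daily update intervals matter for the forecasts described above.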

  3. Hanging an Airplane: A Case Study in Static Equilibrium

    Science.gov (United States)

    Katz, Debora M.

    2009-01-01

    Our classrooms are filled with engineering majors who take a semester-long course in static equilibrium. Many students find this class too challenging and drop their engineering major. In our introductory physics class, we often breeze through static equilibrium; to physicists, equilibrium is just a special case of Newton's second law. While it is…

  4. A Statistical and Wavelet Analysis of Physical Property Data From the 2950 m Deep Bellevue Borehole, Bushveld Complex, South Africa

    Science.gov (United States)

    Webb, S. J.; Ashwal, L. D.; Cooper, G. R.

    2007-12-01

    Susceptibility (n=~110,000) and density (n=~2500) measurements on core samples have been collected in a stratigraphic context from the Bellevue (BV-1) 2950 m deep borehole in the Northern Lobe of the Bushveld Complex. This drill core starts in the granitoid roof rocks, extends through the entire Upper Zone, and ends approximately in the middle of the Main Zone. These physical property measurements now provide an extensive database useful for geophysical modeling and stratigraphic studies. In an effort to quantify the periodicity of the layering we have applied various statistical and wavelet methods to analyze the susceptibility and density data. The density data have revealed a strong periodic layering with a scale of ~80 m that extends through the Main and Upper Zones. In the Main Zone the layering is unusual in that the density values increase upwards by as much as 10%. This is due to systematic variation in the modal abundance of mafic silicates and appears to be related to separate pulses during emplacement. The magnetic susceptibility data in the Upper Zone also show a strong cyclicity of similar scale. The discrete wavelet transform, using the real Haar wavelet, has been applied to help discretise the susceptibility data and clarifies the geological boundaries without blurring them, which is a common problem with multipoint moving averages. As expected, the histogram of the entire data set is non-Gaussian, with a long tail for high values. We can roughly fit a power law to the log histogram plot indicating a probable fractal distribution of susceptibilities. However if we window the data in the range 750-1000 m the histogram is very different. This region shows a strong peak and no power law relationship. This dramatic change in statistical properties prompted us to investigate these properties more thoroughly. To complement the wavelet analysis we have calculated various statistical measures (mean, standard deviation, skew, and
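
    The Haar transform used here has a particularly simple form: pairwise sums give the smoothed signal and pairwise differences mark sharp boundaries. A minimal single-level sketch on a synthetic layered log (illustrative values, not the Bellevue data):

```python
import numpy as np

def haar_dwt(x):
    """One level of the discrete Haar wavelet transform."""
    x = np.asarray(x, float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # smoothed (approximation) signal
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # nonzero only across sharp contacts
    return approx, detail

# synthetic "layered" susceptibility log: constant blocks with two sharp contacts
signal = np.concatenate([np.full(65, 1.0), np.full(64, 4.0), np.full(63, 2.0)])
approx, detail = haar_dwt(signal)
edges = np.nonzero(np.abs(detail) > 0.5)[0]     # pair indices of the contacts
```

    The detail coefficients vanish inside uniform layers and spike exactly at the two contacts, which is why the Haar transform sharpens geological boundaries instead of blurring them the way a moving average does.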

  5. Phase equilibrium engineering

    CERN Document Server

    Brignole, Esteban Alberto

    2013-01-01

    Traditionally, the teaching of phase equilibria emphasizes the relationships between the thermodynamic variables of each phase in equilibrium rather than its engineering applications. This book changes the focus from the use of thermodynamics relationships to compute phase equilibria to the design and control of the phase conditions that a process needs. Phase Equilibrium Engineering presents a systematic study and application of phase equilibrium tools to the development of chemical processes. The thermodynamic modeling of mixtures for process development, synthesis, simulation, design and

  6. Equilibrium and generators

    International Nuclear Information System (INIS)

    Balter, H.S.

    1994-01-01

    This work studies the behaviour of radionuclides during decay: the activity produced by disintegration and the creation of stable isotopes. It gives definitions of the equilibrium between the activity of the parent and the activity of the daughter, radioactive decay, stable isotopes, transient equilibrium, and the time of maximum activity. Some consideration is given to generators, which permit the separation of two radioisotopes in equilibrium, and to their good performance. Tabs
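
    The transient equilibrium and maximum-activity time mentioned here follow from the Bateman equations. A sketch for the classic Mo-99/Tc-99m generator pair, assuming no daughter at t = 0 and 100% branching to the daughter (an idealization; the real Mo-99 branch to Tc-99m is below 100%):

```python
import math

def daughter_activity(t, a1_0, t_half_parent, t_half_daughter):
    """Daughter activity in a parent -> daughter chain (Bateman solution),
    with no daughter present at t = 0 and 100% branching."""
    l1 = math.log(2) / t_half_parent
    l2 = math.log(2) / t_half_daughter
    return a1_0 * l2 / (l2 - l1) * (math.exp(-l1 * t) - math.exp(-l2 * t))

# Mo-99 (66 h half-life) / Tc-99m (6.0 h half-life) generator
l1, l2 = math.log(2) / 66.0, math.log(2) / 6.0
t_max = math.log(l2 / l1) / (l2 - l1)   # time of maximum daughter activity (h)
```

    For this pair the maximum daughter activity falls near 23 h after elution, consistent with daily generator milking schedules.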

  7. Final Report on DTRA Basic Research Project #BRCALL08-Per3-C-2-0006 "High-Z Non-Equilibrium Physics and Bright X-ray Sources with New Laser Targets"

    Energy Technology Data Exchange (ETDEWEB)

    Colvin, Jeffrey D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-06-01

    This project had two major goals. Final Goal: obtain spectrally resolved, absolutely calibrated x-ray emission data from uniquely uniform mm-scale near-critical-density high-Z plasmas not in local thermodynamic equilibrium (LTE) to benchmark modern detailed atomic physics models. Scientific significance: advance understanding of non-LTE atomic physics. Intermediate Goal: develop new nano-fabrication techniques to make suitable laser targets that form the required highly uniform non-LTE plasmas when illuminated by high-intensity laser light. Scientific significance: advance understanding of nano-science. The new knowledge will allow us to make x-ray sources that are bright at the photon energies of most interest for testing radiation hardening technologies, the spectral energy range where current x-ray sources are weak. All project goals were met.

  8. Study Modules for Calculus-Based General Physics. [Includes Modules 11-14: Collisions; Equilibrium of Rigid Bodies; Rotational Dynamics; and Fluid Mechanics].

    Science.gov (United States)

    Fuller, Robert G., Ed.; And Others

    This is part of a series of 42 Calculus Based Physics (CBP) modules totaling about 1,000 pages. The modules include study guides, practice tests, and mastery tests for a full-year individualized course in calculus-based physics based on the Personalized System of Instruction (PSI). The units are not intended to be used without outside materials;…

  9. Disturbances in equilibrium function after major earthquake.

    Science.gov (United States)

    Honma, Motoyasu; Endo, Nobutaka; Osada, Yoshihisa; Kim, Yoshiharu; Kuriyama, Kenichi

    2012-01-01

    Major earthquakes are followed by large numbers of aftershocks, and significant outbreaks of dizziness occur over large areas. However, it is unclear why major earthquakes cause dizziness. We conducted an intergroup trial on equilibrium dysfunction and psychological states associated with equilibrium dysfunction in individuals exposed to repetitive aftershocks versus those who were rarely exposed. Greater equilibrium dysfunction was observed in the aftershock-exposed group under conditions without visual compensation. Equilibrium dysfunction in the aftershock-exposed group appears to have arisen from disturbance of the inner ear, as well as individual vulnerability to state anxiety enhanced by repetitive exposure to aftershocks. We indicate potential effects of autonomic stress on equilibrium function after major earthquakes. Our findings may contribute to risk management of psychological and physical health after major earthquakes with aftershocks, and allow development of a new empirical approach to disaster care after such events.

  10. On solutions to equilibrium problems for systems of stiffened gases

    OpenAIRE

    Flåtten, Tore; Morin, Alexandre; Munkejord, Svend Tollak

    2011-01-01

    We consider an isolated system of N immiscible fluids, each following a stiffened-gas equation of state. We consider the problem of calculating equilibrium states from the conserved fluid-mechanical properties, i.e., the partial densities and internal energies. We consider two cases; in each case mechanical equilibrium is assumed, but the fluids may or may not be in thermal equilibrium. For both cases, we address the issues of existence, uniqueness, and physical validity of equilibrium soluti...
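
    For the thermal-equilibrium case, a closed-form solution exists under a linear caloric law. The sketch below assumes each fluid obeys e_i = c_v,i T + q_i per unit mass (constant-volume stiffened-gas terms absorbed into q_i), which is a simplification of the full problem treated in the paper; conserving total internal energy then makes the common temperature a c_v-weighted average:

```python
# masses (kg), specific heats c_v (J/kg/K), and initial temperatures (K):
# illustrative values, not from the paper
fluids = [
    {"m": 1.0, "cv": 1040.0, "T": 300.0},   # a nitrogen-like gas
    {"m": 0.5, "cv": 4180.0, "T": 350.0},   # a water-like liquid
]

def equilibrium_temperature(fluids):
    """Common temperature after thermal equilibration of immiscible fluids,
    assuming the linear caloric law e_i = cv_i * T + q_i per unit mass."""
    num = sum(f["m"] * f["cv"] * f["T"] for f in fluids)
    den = sum(f["m"] * f["cv"] for f in fluids)
    return num / den

T_eq = equilibrium_temperature(fluids)
```

    The solution exists, is unique, and lies between the initial temperatures; the paper's analysis addresses exactly these existence and uniqueness questions for the full stiffened-gas equations of state.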

  11. The equilibrium response to doubling atmospheric CO2

    International Nuclear Information System (INIS)

    Mitchell, J.F.B.

    1990-01-01

    The equilibrium response of climate to increased atmospheric carbon dioxide as simulated by general circulation models is assessed. Changes that are physically plausible are summarized, along with an indication of the confidence attributable to those changes. The main areas of uncertainty are highlighted. They include: equilibrium experiments with mixed-layer oceans focusing on temperature, precipitation, and soil moisture; equilibrium studies with dynamical ocean-atmosphere models; results deduced from equilibrium CO2 experiments; and priorities for future research to improve atmosphere models

  12. An analysis of Greek seismicity based on Non Extensive Statistical Physics: The interdependence of magnitude, interevent time and interevent distance.

    Science.gov (United States)

    Efstathiou, Angeliki; Tzanis, Andreas; Vallianatos, Filippos

    2014-05-01

    The context of Non Extensive Statistical Physics (NESP) has recently been suggested to comprise an appropriate tool for the analysis of complex dynamic systems with scale invariance, long-range interactions, long-range memory and systems that evolve in a fractal-like space-time. This is because the active tectonic grain is thought to comprise a (self-organizing) complex system; therefore, its expression (seismicity) should be manifested in the temporal and spatial statistics of energy release rates. In addition to energy release rates expressed by the magnitude M, measures of the temporal and spatial interactions are the time (Δt) and hypocentral distance (Δd) between consecutive events. Recent work indicated that if the distributions of M, Δt and Δd are independent so that the joint probability p(M,Δt,Δd) factorizes into the probabilities of M, Δt and Δd, i.e. p(M,Δt,Δd) = p(M)p(Δt)p(Δd), then the frequency of earthquake occurrence is multiply related, not only to magnitude as the celebrated Gutenberg-Richter law predicts, but also to interevent time and distance by means of well-defined power-laws consistent with NESP. The present work applies these concepts to investigate the self-organization and temporal/spatial dynamics of seismicity in Greece and western Turkey, for the period 1964-2011. The analysis was based on the ISC earthquake catalogue which is homogenous by construction with consistently determined hypocenters and magnitude. The presentation focuses on the analysis of bivariate Frequency-Magnitude-Time distributions, while using the interevent distances as spatial constraints (or spatial filters) for studying the spatial dependence of the energy and time dynamics of the seismicity. It is demonstrated that the frequency of earthquake occurrence is multiply related to the magnitude and the interevent time by means of well-defined multi-dimensional power-laws consistent with NESP and has attributes of universality, as it holds for a broad
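
    As a classical baseline for the Gutenberg-Richter law mentioned above (the paper itself works with NESP q-statistics), the b-value can be recovered from a catalog by the Aki-Utsu maximum-likelihood estimator b = log10(e) / (mean(M) - Mc). A sketch on synthetic magnitudes drawn from an exact G-R distribution:

```python
import numpy as np

rng = np.random.default_rng(7)
b_true, m_c = 1.0, 2.5                 # true b-value and magnitude of completeness
beta = b_true * np.log(10)
# G-R magnitudes above Mc: excesses M - Mc are exponential with rate beta
mags = m_c + rng.exponential(1 / beta, size=20000)

b_hat = np.log10(np.e) / (mags.mean() - m_c)   # Aki-Utsu ML estimator
```

    With 20 000 events the estimator recovers b to within about 1%, illustrating why completeness magnitude Mc, not sample size, is usually the limiting factor in practice.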

  13. Fall Back Equilibrium

    NARCIS (Netherlands)

    Kleppe, J.; Borm, P.E.M.; Hendrickx, R.L.P.

    2008-01-01

    Fall back equilibrium is a refinement of the Nash equilibrium concept. In the underlying thought experiment each player faces the possibility that, after all players have decided on their action, his chosen action turns out to be blocked. Therefore, each player has to decide beforehand on a back-up

  14. Physics-Based Image Segmentation Using First Order Statistical Properties and Genetic Algorithm for Inductive Thermography Imaging.

    Science.gov (United States)

    Gao, Bin; Li, Xiaoqing; Woo, Wai Lok; Tian, Gui Yun

    2018-05-01

    Thermographic inspection has been widely applied to non-destructive testing and evaluation with the capabilities of rapid, contactless, and large surface area detection. Image segmentation is considered essential for identifying and sizing defects. To attain a high-level performance, specific physics-based models that describe defect generation and enable the precise extraction of the target region are of crucial importance. In this paper, an effective genetic first-order statistical image segmentation algorithm is proposed for quantitative crack detection. The proposed method automatically extracts valuable spatial-temporal patterns via an unsupervised feature extraction algorithm and avoids a range of issues associated with human intervention in the laborious manual selection of specific thermal video frames for processing. An internal genetic functionality is built into the proposed algorithm to automatically control the segmentation threshold, rendering enhanced accuracy in sizing the cracks. Eddy current pulsed thermography is implemented as a platform to demonstrate surface crack detection. Experimental tests and comparisons have been conducted to verify the efficacy of the proposed method. In addition, a global quantitative assessment index, the F-score, has been adopted to objectively evaluate the performance of different segmentation algorithms.
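
    A genetic control of the segmentation threshold, as described above, can be illustrated in miniature. The fitness below is Otsu's between-class variance computed from first-order statistics of the two classes; the synthetic image, population size, and genetic operators are toy assumptions, not the paper's actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "thermogram": cool background plus a hot crack-like region.
img = rng.normal(0.3, 0.05, size=(64, 64))
img[28:36, 10:54] += 0.5                       # defect signature

def fitness(t):
    """Otsu-style between-class variance from first-order statistics."""
    fg, bg = img[img >= t], img[img < t]
    if fg.size == 0 or bg.size == 0:
        return 0.0
    w1, w2 = fg.size / img.size, bg.size / img.size
    return w1 * w2 * (fg.mean() - bg.mean()) ** 2

# Minimal genetic loop: keep the fittest thresholds, mutate to get children.
pop = rng.uniform(img.min(), img.max(), size=30)
for _ in range(40):
    ranked = pop[np.argsort([fitness(t) for t in pop])]
    parents = ranked[-10:]                               # selection
    children = rng.choice(parents, 20) + rng.normal(0.0, 0.02, 20)  # mutation
    pop = np.concatenate([parents, children])

best = max(pop, key=fitness)
mask = img >= best        # segmented crack candidate
```

The evolved threshold settles in the gap between the background and defect intensity populations, separating the crack-like region from the background.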

  15. Non-Equilibrium Properties from Equilibrium Free Energy Calculations

    Science.gov (United States)

    Pohorille, Andrew; Wilson, Michael A.

    2012-01-01

    Calculating free energy in computer simulations is of central importance in the statistical mechanics of condensed media and its applications to chemistry and biology, not only because it is the most comprehensive and informative quantity that characterizes the equilibrium state, but also because it often provides an efficient route to access dynamic and kinetic properties of a system. Most applications of equilibrium free energy calculations to non-equilibrium processes rely on a description in which a molecule or an ion diffuses in the potential of mean force. In the general case this description is a simplification, but it might be satisfactorily accurate in many instances of practical interest. This hypothesis has been tested in the example of the electrodiffusion equation. The conductance of model ion channels has been calculated directly by counting the number of ion crossing events observed during long molecular dynamics simulations and has been compared with the conductance obtained from solving the generalized Nernst-Planck equation. It has been shown that under relatively modest conditions the agreement between these two approaches is excellent, thus demonstrating that the assumptions underlying the diffusion equation are fulfilled. Under these conditions the electrodiffusion equation provides an efficient approach to calculating the full voltage-current dependence routinely measured in electrophysiological experiments.
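
    The crossing-count route to conductance reduces to simple arithmetic: the current is the net charge transported per unit time, and dividing by the applied voltage gives the conductance. A back-of-envelope sketch with hypothetical numbers (not values from the paper):

```python
# Hypothetical illustration: monovalent ions, counted crossings, fixed voltage.
E = 1.602176634e-19      # elementary charge (C)
n_crossings = 1250       # net ion crossings counted in the trajectory
t_sim = 1e-6             # simulated time (s)
voltage = 0.1            # applied transmembrane voltage (V)

current = n_crossings * E / t_sim    # A
conductance = current / voltage      # S
print(conductance * 1e12, "pS")
```

Repeating this at several voltages yields the full voltage-current dependence mentioned in the abstract.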

  16. The Theory of Variances in Equilibrium Reconstruction

    International Nuclear Information System (INIS)

    Zakharov, Leonid E.; Lewandowski, Jerome; Foley, Elizabeth L.; Levinton, Fred M.; Yuh, Howard Y.; Drozdov, Vladimir; McDonald, Darren

    2008-01-01

    The theory of variances of equilibrium reconstruction is presented. It complements existing practices with information regarding what kind of plasma profiles can be reconstructed, how accurately, and what remains beyond the abilities of diagnostic systems. The σ-curves, introduced by the present theory, give a quantitative assessment of the effectiveness of diagnostic systems in constraining equilibrium reconstructions. The theory also suggests a method for aligning the accuracy of measurements of different physical nature.

  17. Equilibrium and non equilibrium in fragmentation

    International Nuclear Information System (INIS)

    Dorso, C.O.; Chernomoretz, A.; Lopez, J.A.

    2001-01-01

    Full text: In this communication we present recent results regarding the interplay of equilibrium and non-equilibrium in the process of fragmentation of excited finite Lennard-Jones drops. Because the general features of such a potential resemble those of the nuclear interaction (a fact reinforced by the similarity between the EOS of both systems), these studies are not only relevant from a fundamental point of view but also shed light on the problem of nuclear multifragmentation. We focus on the microscopic analysis of the state of the fragmenting system at fragmentation time. We show that the caloric curve (i.e. the functional relationship between the temperature of the system and the excitation energy) is of the rise-plateau type with no vapor branch. The usual rise-plateau-rise pattern is only recovered when equilibrium is artificially imposed. This result casts serious doubt on the validity of the freeze-out hypothesis. This feature is independent of the dimensionality or excitation mechanism. Moreover, we explore the behavior of magnitudes which can help us determine the degree of the assumed phase transition. It is found that no clear-cut criterion is presently available. (Author)

  18. Asymmetry of price returns-Analysis and perspectives from a non-extensive statistical physics point of view.

    Directory of Open Access Journals (Sweden)

    Łukasz Bil

    Full Text Available We study how the approach grounded on non-extensive statistical physics can be applied to describe and distinguish different stages of stock and money market development. Particular attention is given to the asymmetric behavior of the fat-tailed distributions of positive and negative returns. A new method to measure this asymmetry is proposed. It is based on the value of the non-extensive Tsallis parameter q. The new quantifier of the relative asymmetry level between tails in terms of the Tsallis parameters q± is provided to analyze the effect of memory in data caused by nonlinear autocorrelations. The presented analysis takes into account data of separate stocks from the main developing stock market in Europe, i.e., the Warsaw Stock Exchange (WSE) in Poland and, for comparison, data from the most mature money market (Forex). It is argued that the proposed new quantifier is able to describe the stage of market development and its robustness to speculation. The main emphasis is on a description and interpretation of the asymmetry between statistical properties of positive and negative returns for various stocks and for diversified time-lags Δt of data counting. Particular caution in this context is addressed to the difference between intraday and interday returns. Our search is extended to study memory effects and their dependence on the quotation frequency for similar large companies, owners of food-industrial retail supermarkets acting on both Polish and European markets (Eurocash, Jeronimo-Martins, Carrefour, Tesco), but traded on various European stock markets of diversified economic maturity (respectively in Warsaw, Lisbon, Paris and London). The latter analysis seems to indicate quantitatively that stocks from the same economic sector traded on different markets within the European Union (EU) may be a target of a diversified level of speculation involved in trading, independently of the true economic situation of the company. Our work thus gives

  19. Asymmetry of price returns—Analysis and perspectives from a non-extensive statistical physics point of view

    Science.gov (United States)

    Bil, Łukasz; Zienowicz, Magdalena

    2017-01-01

    We study how the approach grounded on non-extensive statistical physics can be applied to describe and distinguish different stages of stock and money market development. Particular attention is given to the asymmetric behavior of the fat-tailed distributions of positive and negative returns. A new method to measure this asymmetry is proposed. It is based on the value of the non-extensive Tsallis parameter q. The new quantifier of the relative asymmetry level between tails in terms of the Tsallis parameters q± is provided to analyze the effect of memory in data caused by nonlinear autocorrelations. The presented analysis takes into account data of separate stocks from the main developing stock market in Europe, i.e., the Warsaw Stock Exchange (WSE) in Poland and, for comparison, data from the most mature money market (Forex). It is argued that the proposed new quantifier is able to describe the stage of market development and its robustness to speculation. The main emphasis is on a description and interpretation of the asymmetry between statistical properties of positive and negative returns for various stocks and for diversified time-lags Δt of data counting. Particular caution in this context is addressed to the difference between intraday and interday returns. Our search is extended to study memory effects and their dependence on the quotation frequency for similar large companies, owners of food-industrial retail supermarkets acting on both Polish and European markets (Eurocash, Jeronimo-Martins, Carrefour, Tesco), but traded on various European stock markets of diversified economic maturity (respectively in Warsaw, Lisbon, Paris and London). The latter analysis seems to indicate quantitatively that stocks from the same economic sector traded on different markets within the European Union (EU) may be a target of a diversified level of speculation involved in trading, independently of the true economic situation of the company. Our work thus gives

  20. Asymmetry of price returns-Analysis and perspectives from a non-extensive statistical physics point of view.

    Science.gov (United States)

    Bil, Łukasz; Grech, Dariusz; Zienowicz, Magdalena

    2017-01-01

    We study how the approach grounded on non-extensive statistical physics can be applied to describe and distinguish different stages of stock and money market development. Particular attention is given to the asymmetric behavior of the fat-tailed distributions of positive and negative returns. A new method to measure this asymmetry is proposed. It is based on the value of the non-extensive Tsallis parameter q. The new quantifier of the relative asymmetry level between tails in terms of the Tsallis parameters q± is provided to analyze the effect of memory in data caused by nonlinear autocorrelations. The presented analysis takes into account data of separate stocks from the main developing stock market in Europe, i.e., the Warsaw Stock Exchange (WSE) in Poland and, for comparison, data from the most mature money market (Forex). It is argued that the proposed new quantifier is able to describe the stage of market development and its robustness to speculation. The main emphasis is on a description and interpretation of the asymmetry between statistical properties of positive and negative returns for various stocks and for diversified time-lags Δt of data counting. Particular caution in this context is addressed to the difference between intraday and interday returns. Our search is extended to study memory effects and their dependence on the quotation frequency for similar large companies, owners of food-industrial retail supermarkets acting on both Polish and European markets (Eurocash, Jeronimo-Martins, Carrefour, Tesco), but traded on various European stock markets of diversified economic maturity (respectively in Warsaw, Lisbon, Paris and London). The latter analysis seems to indicate quantitatively that stocks from the same economic sector traded on different markets within the European Union (EU) may be a target of a diversified level of speculation involved in trading, independently of the true economic situation of the company. Our work thus gives indications
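
    The q-Gaussian underlying the q± quantifier in these records can be written down directly: for q > 1 its tails decay as a power law, so a larger q fitted to one side of the return distribution signals a fatter tail on that side. A minimal sketch (the papers' fitting procedure and exact asymmetry quantifier are not reproduced here; the q values are illustrative):

```python
import numpy as np

def q_gaussian(x, q, beta):
    """Unnormalized Tsallis q-Gaussian; q -> 1 recovers exp(-beta * x**2)."""
    if np.isclose(q, 1.0):
        return np.exp(-beta * np.asarray(x) ** 2)
    base = np.maximum(1.0 + (q - 1.0) * beta * np.asarray(x) ** 2, 0.0)
    return base ** (1.0 / (1.0 - q))

# Fatter tail for larger q: compare values far from the center.
tail_heavy = q_gaussian(10.0, 1.5, 1.0)   # e.g. q+ fitted to positive returns
tail_light = q_gaussian(10.0, 1.1, 1.0)   # e.g. q- fitted to negative returns
print(tail_heavy > tail_light)
```

Comparing the two fitted parameters q+ and q− then quantifies the relative asymmetry between positive- and negative-return tails.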

  1. Chemical Principles Revisited: Chemical Equilibrium.

    Science.gov (United States)

    Mickey, Charles D.

    1980-01-01

    Describes: (1) Law of Mass Action; (2) equilibrium constant and ideal behavior; (3) general form of the equilibrium constant; (4) forward and reverse reactions; (5) factors influencing equilibrium; (6) Le Chatelier's principle; (7) effects of temperature, changing concentration, and pressure on equilibrium; and (8) catalysts and equilibrium. (JN)
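
    The Le Chatelier-style predictions in items (5)-(7) reduce to comparing the reaction quotient Q with the equilibrium constant K: Q < K shifts the reaction forward, Q > K shifts it in reverse. A small sketch (the K value and concentrations are hypothetical, chosen only for illustration):

```python
# For a generic reaction aA + bB <-> cC + dD, the reaction quotient is
# Q = [C]^c [D]^d / ([A]^a [B]^b). Coefficients: products positive,
# reactants negative, so Q is a single product of powers.

def reaction_quotient(conc, coeff):
    """conc maps species -> concentration; coeff maps species ->
    signed stoichiometric coefficient."""
    q = 1.0
    for species, nu in coeff.items():
        q *= conc[species] ** nu
    return q

# Hypothetical system: N2 + 3 H2 <-> 2 NH3 with K = 0.5 (made-up value).
coeff = {"NH3": 2, "N2": -1, "H2": -3}
conc = {"NH3": 0.2, "N2": 1.0, "H2": 1.0}
K = 0.5
Q = reaction_quotient(conc, coeff)
print("Q =", Q, "-> shifts", "forward" if Q < K else "reverse")
```

Here Q = 0.2² / (1.0 · 1.0³) = 0.04 < K, so the mixture is predicted to shift toward products.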

  2. Equilibrium and non-equilibrium phenomena in arcs and torches

    NARCIS (Netherlands)

    Mullen, van der J.J.A.M.

    2000-01-01

    A general treatment of non-equilibrium plasma aspects is obtained by relating transport fluxes to equilibrium-restoring processes in so-called disturbed Bilateral Relations. The (non-)equilibrium stage of a small microwave-induced plasma serves as a case study.

  3. Vital statistics

    CERN Document Server

    MacKenzie, Dana

    2004-01-01

    The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue with the expansion of knowledge about the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, the physicists need to identify the brand of statistics that will be suitable for them, and make a choice between the Bayesian and frequentist approaches. (Edited abstract).

  4. Modelling Thomson scattering for systems with non-equilibrium electron distributions

    Directory of Open Access Journals (Sweden)

    Chapman D.A.

    2013-11-01

    Full Text Available We investigate the effect of non-equilibrium electron distributions in the analysis of Thomson scattering for a range of conditions of interest to inertial confinement fusion experiments. Firstly, a generalised one-component model based on quantum statistical theory is given in the random phase approximation (RPA). The Chihara expression for electron-ion plasmas is then adapted to include the new non-equilibrium electron physics. The theoretical scattering spectra for both diffuse and dense plasmas in which non-equilibrium electron distributions are expected to arise are considered. We find that such distributions strongly influence the spectra and are hence an important consideration for accurately determining the plasma conditions.

  5. Statistical mechanics and the foundations of thermodynamics

    International Nuclear Information System (INIS)

    Martin-Loef, A.

    1979-01-01

    These lectures are designed as an introduction to classical statistical mechanics and its relation to thermodynamics. They are intended to bridge the gap between the treatment of the subject in physics textbooks and the modern presentations of mathematically rigorous results. We shall first introduce the probability distributions (ensembles) appropriate for describing systems in equilibrium and consider some of their basic physical applications. We also discuss the problem of approach to equilibrium and how irreversibility comes into the dynamics. We then give a detailed description of how the law of large numbers for macrovariables in equilibrium is derived from the fact that entropy is an extensive quantity in the thermodynamic limit. We show in a natural way how to split the energy changes in a thermodynamic process into work and heat, leading to a derivation of the first and second laws of thermodynamics from the rules of thermodynamic equilibrium. We have elaborated this part in detail because we feel it is quite satisfactory that the establishment of the limit of thermodynamic functions, as achieved in the modern development of the mathematical aspects of statistical mechanics, allows a more general and logically clearer presentation of the bases of thermodynamics. We close these lectures by presenting the basic facts about fluctuation theory. The treatment aims to be reasonably self-contained both concerning the physics and mathematics needed. No knowledge of quantum mechanics is presupposed. Since we spent a large part on mathematical proofs and give many technical facts, these lectures are probably most digestible for the mathematically inclined reader who wants to understand the physics of the subject. (HJ)
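
    The route from equilibrium ensembles to thermodynamic functions sketched in these lectures can be made concrete with the smallest nontrivial example, a two-level system in the canonical ensemble: the partition function Z gives the mean energy, and the entropy follows from S/k_B = ln Z + β⟨E⟩.

```python
import math

# Two-level system with energies 0 and eps at temperature kT:
# Z = 1 + exp(-beta * eps), <E> = eps * exp(-beta * eps) / Z.
eps, kT = 1.0, 1.0
beta = 1.0 / kT
Z = 1.0 + math.exp(-beta * eps)
E_mean = eps * math.exp(-beta * eps) / Z
S = math.log(Z) + beta * E_mean      # entropy in units of k_B

print(E_mean, S)
```

The expected limits serve as a check: ⟨E⟩ → 0 as kT → 0 and ⟨E⟩ → eps/2 (with S → ln 2) as kT → ∞.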

  6. Non extensive statistical physics applied in fracture-induced electric signals during triaxial deformation of Carrara marble

    Science.gov (United States)

    Cartwright-Taylor, Alexis; Vallianatos, Filippos; Sammonds, Peter

    2014-05-01

    We have conducted room-temperature, triaxial compression experiments on samples of Carrara marble, recording concurrently the acoustic and electric current signals emitted during the deformation process as well as mechanical loading information and ultrasonic wave velocities. Our results reveal that in a dry non-piezoelectric rock under simulated crustal pressure conditions, a measurable electric current (nA) is generated within the stressed sample. The current is detected only in the region beyond (quasi-)linear elastic deformation; i.e. in the region of permanent deformation beyond the yield point of the material and in the presence of microcracking. Our results extend to shallow crustal conditions previous observations of electric current signals in quartz-free rocks undergoing uniaxial deformation and support the idea of a universal electrification mechanism related to deformation. Confining pressure conditions of our slow strain rate (10⁻⁶ s⁻¹) experiments range from the purely brittle regime (10 MPa) to the semi-brittle transition (30-100 MPa) where cataclastic flow is the dominant deformation mechanism. Electric current is generated under all confining pressures, implying the existence of a current-producing mechanism during both microfracture and frictional sliding. Some differences are seen in the current evolution between these two regimes, possibly related to crack localisation. In all cases, the measured electric current exhibits episodes of strong fluctuations over short timescales; calm periods punctuated by bursts of strong activity. For the analysis, we adopt an entropy-based statistical physics approach (Tsallis, 1988), particularly suited to the study of fracture-related phenomena. We find that the probability distribution of normalised electric current fluctuations over short time intervals (0.5 s) can be well described by a q-Gaussian distribution of a form similar to that which describes turbulent flows. This approach yields different entropic

  7. Random forest learning of ultrasonic statistical physics and object spaces for lesion detection in 2D sonomammography

    Science.gov (United States)

    Sheet, Debdoot; Karamalis, Athanasios; Kraft, Silvan; Noël, Peter B.; Vag, Tibor; Sadhu, Anup; Katouzian, Amin; Navab, Nassir; Chatterjee, Jyotirmoy; Ray, Ajoy K.

    2013-03-01

    Breast cancer is the most common form of cancer in women. Early diagnosis can significantly improve life expectancy and allow different treatment options. Clinicians favor 2D ultrasonography for breast tissue abnormality screening due to high sensitivity and specificity compared to competing technologies. However, inter- and intra-observer variability in visual assessment and reporting of lesions often handicaps its performance. Existing Computer Assisted Diagnosis (CAD) systems, though able to detect solid lesions, are often restricted in performance. These restrictions are the inability to (1) detect lesions of multiple sizes and shapes, and (2) differentiate hypo-echoic lesions from their posterior acoustic shadowing. In this work we present a completely automatic system for detection and segmentation of breast lesions in 2D ultrasound images. We employ random forests for learning of a tissue-specific primal to discriminate breast lesions from surrounding normal tissues. This enables it to detect lesions of multiple shapes and sizes, as well as discriminate hypo-echoic lesions from associated posterior acoustic shadowing. The primal comprises (i) multiscale estimated ultrasonic statistical physics and (ii) scale-space characteristics. The random forest learns the lesion vs. background primal from a database of 2D ultrasound images with labeled lesions. For segmentation, the posterior probabilities of lesion pixels estimated by the learnt random forest are hard thresholded to provide a random walks segmentation stage with starting seeds. Our method achieves detection with 99.19% accuracy and segmentation with mean contour-to-contour error < 3 pixels on a set of 40 images with 49 lesions.
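
    The learning and seeding pipeline can be sketched with scikit-learn's RandomForestClassifier (assumed available). The four-dimensional Gaussian features below are a toy stand-in for the multiscale ultrasonic statistical-physics and scale-space primal, and the 0.9 cut-off for seed pixels is an arbitrary illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)

# Toy per-pixel feature vectors with labels (1 = lesion, 0 = background),
# standing in for the learned primal from a labeled training database.
X_train = np.vstack([rng.normal(0, 1, (500, 4)), rng.normal(2, 1, (500, 4))])
y_train = np.r_[np.zeros(500), np.ones(500)]

forest = RandomForestClassifier(n_estimators=50, random_state=0)
forest.fit(X_train, y_train)

# Posterior lesion probabilities for new pixels, hard-thresholded to give
# high-confidence seed pixels for a subsequent random-walks stage.
X_test = np.vstack([rng.normal(0, 1, (100, 4)), rng.normal(2, 1, (100, 4))])
posterior = forest.predict_proba(X_test)[:, 1]
seeds = posterior > 0.9
```

In the real system the thresholded posteriors seed a random walks segmentation over the image graph; that stage is omitted here.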

  8. Non-equilibrium Economics

    Directory of Open Access Journals (Sweden)

    Katalin Martinás

    2007-02-01

    Full Text Available A microeconomic, agent-based framework for dynamic economics is formulated in a materialist approach. An axiomatic foundation of a non-equilibrium microeconomics is outlined. Economic activity is modelled as transformation and transport of commodities (materials) owned by the agents. Rates of transformation (production intensity) and rates of transport (trade) are defined by the agents. Economic decision rules are derived from the observed economic behaviour. The non-linear equations are solved numerically for a model economy. Numerical solutions for simple model economies suggest that some of the results of general equilibrium economics are consequences only of the equilibrium hypothesis. We show that perfect competition of selfish agents does not guarantee the stability of economic equilibrium; cooperativity is needed, too.

  9. DIAGNOSIS OF FINANCIAL EQUILIBRIUM

    Directory of Open Access Journals (Sweden)

    SUCIU GHEORGHE

    2013-04-01

    Full Text Available The analysis based on the balance sheet tries to identify the state of equilibrium (or disequilibrium) that exists in a company. The easiest way to determine the state of equilibrium is by looking at the balance sheet and at the information it offers. Because the balance sheet contains elements that do not reflect their real value, the one established on the market, they must be readjusted, and those elements which are not related to the ordinary operating activities must be eliminated. The diagnosis of financial equilibrium takes into account two components: financing sources (ownership equity, loaned, temporarily attracted). An efficient financial equilibrium must respect two fundamental requirements: permanent sources represented by ownership equity and loans for more than one year should finance permanent needs, and temporary resources should finance the operating cycle.

  10. Non-equilibrium phase transition

    International Nuclear Information System (INIS)

    Mottola, E.; Cooper, F.M.; Bishop, A.R.; Habib, S.; Kluger, Y.; Jensen, N.G.

    1998-01-01

    This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). Non-equilibrium phase transitions play a central role in a very broad range of scientific areas, ranging from nuclear, particle, and astrophysics to condensed matter physics and the material and biological sciences. The aim of this project was to explore the path to a deeper and more fundamental understanding of the common physical principles underlying the complex real time dynamics of phase transitions. The main emphasis was on the development of general theoretical tools to deal with non-equilibrium processes, and of numerical methods robust enough to capture the time-evolving structures that occur in actual experimental situations. Specific applications to Laboratory multidivisional efforts in relativistic heavy-ion physics (transition to a new phase of nuclear matter consisting of a quark-gluon plasma) and layered high-temperature superconductors (critical currents and flux flow at the National High Magnetic Field Laboratory) were undertaken

  11. Computing Equilibrium Chemical Compositions

    Science.gov (United States)

    Mcbride, Bonnie J.; Gordon, Sanford

    1995-01-01

    Chemical Equilibrium With Transport Properties, 1993 (CET93) computer program provides data on chemical-equilibrium compositions. Aids calculation of thermodynamic properties of chemical systems. Information essential in design and analysis of such equipment as compressors, turbines, nozzles, engines, shock tubes, heat exchangers, and chemical-processing equipment. CET93/PC is version of CET93 specifically designed to run within 640K memory limit of MS-DOS operating system. CET93/PC written in FORTRAN.

  12. Non-equilibrium dynamics of one-dimensional Bose gases

    International Nuclear Information System (INIS)

    Langen, T.

    2013-01-01

    Understanding the non-equilibrium dynamics of isolated quantum many-body systems is an open problem on vastly different energy, length, and time scales. Examples range from the dynamics of the early universe and heavy-ion collisions to the subtle coherence and transport properties in condensed matter physics. However, realizations of such quantum many-body systems, which are both well isolated from the environment and accessible to experimental study, are scarce. This thesis presents a series of experiments with ultracold one-dimensional Bose gases. These gases combine a nearly perfect isolation from the environment with many well-established methods to manipulate and probe their quantum states. This makes them an ideal model system to explore the physics of quantum many-body systems out of equilibrium. In the experiments, a well-defined non-equilibrium state is created by splitting a single one-dimensional gas coherently into two parts. The relaxation of this state is probed using matter-wave interferometry. The observations reveal the emergence of a prethermalized steady state which differs strongly from thermal equilibrium. Such thermal-like states had previously been predicted for a large variety of systems, but never been observed directly. Studying the relaxation process in further detail shows that the thermal correlations of the prethermalized state emerge locally in their final form and propagate through the system in a light-cone-like evolution. This provides first experimental evidence for the local relaxation conjecture, which links relaxation processes in quantum many-body systems to the propagation of correlations. Furthermore, engineering the initial state of the evolution demonstrates that the prethermalized state is described by a generalized Gibbs ensemble, an observation which substantiates the importance of this ensemble as an extension of standard statistical mechanics. Finally, an experiment is presented, where pairs of gases with an atom

  13. Neutron scattering on equilibrium and nonequilibrium phonons, excitons and polaritons

    International Nuclear Information System (INIS)

    Broude, V.L.; Sheka, E.F.

    1978-01-01

    A number of problems of solid-state physics of interest for the neutron spectroscopy of the future is considered. The development of neutron inelastic scattering spectroscopy (neutron spectroscopy of equilibrium phonons) is discussed with application to the nuclear dynamics of crystals in thermodynamic equilibrium. The results of high-flux neutron source experiments on molecular crystals are presented. The advantages of neutron inelastic scattering over optical spectroscopy are discussed. The spectroscopy of quasi-equilibrium and non-equilibrium quasi-particles is discussed. In particular, neutron scattering on polaritons and on excitons in thermal equilibrium, and the production of light-excitons, are considered. The feasibility of such experiments is elucidated.

  14. Quantum gases finite temperature and non-equilibrium dynamics

    CERN Document Server

    Szymanska, Marzena; Davis, Matthew; Gardiner, Simon

    2013-01-01

    The 1995 observation of Bose-Einstein condensation in dilute atomic vapours spawned the field of ultracold, degenerate quantum gases. Unprecedented developments in experimental design and precision control have led to quantum gases becoming the preferred playground for designer quantum many-body systems. This self-contained volume provides a broad overview of the principal theoretical techniques applied to non-equilibrium and finite temperature quantum gases. Covering Bose-Einstein condensates, degenerate Fermi gases, and the more recently realised exciton-polariton condensates, it fills a gap by linking between different methods with origins in condensed matter physics, quantum field theory, quantum optics, atomic physics, and statistical mechanics. Thematically organised chapters on different methodologies, contributed by key researchers using a unified notation, provide the first integrated view of the relative merits of individual approaches, aided by pertinent introductory chapters and the guidance of ed...

  15. The Using of Scientific Based Physics Module in Learning to Enhance High School Students’ Critical Thinking Skills on Rotation Dynamics and Equilibrium of Rigid Body

    Directory of Open Access Journals (Sweden)

    Dhimas Nur Setyawan

    2017-05-01

    Full Text Available The purpose of this study was to determine the effectiveness of using a scientific-based physics module to improve high school students' critical thinking skills. This is a quasi-experimental study using two classes taken at random: one experimental class and one control class. The experimental class used the scientific-based module, while the control class used the books the students already owned. The experimental class numbered 25 students and the control class numbered 28 students. The research was conducted in the first semester of the 2016/2017 academic year. The method used was the test method with a pretest-posttest design. Data were analyzed with quantitative and qualitative methods. Pretest data were analyzed with a homogeneity test to verify that the experimental and control classes were homogeneous. Posttest results were analyzed using a normality test to confirm the data were normally distributed, the N-gain to quantify the increase in critical thinking skills, and an independent two-sample test to determine whether there is a difference in the increase in critical thinking skills. The conclusions and recommendations are that the scientific-based module effectively improves the ability to think critically, and that its use should be adjusted to the prevailing syllabus and curriculum so that learning can take place properly.
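
    The N-gain referred to above presumably denotes Hake's normalized gain, ⟨g⟩ = (post − pre)/(max − pre) for test scores: the fraction of the possible improvement actually achieved. A quick sketch (the score values are made up):

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain <g> = (post - pre) / (max - pre)."""
    return (post - pre) / (max_score - pre)

g = normalized_gain(pre=40.0, post=70.0)
print(g)
```

On the usual cut-offs, ⟨g⟩ < 0.3 counts as low, 0.3-0.7 as medium, and > 0.7 as high gain.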

  16. Experimental determination of some equilibrium parameter of Damavand tokamak by magnetic probe measurements for representing a physical model for plasma vertical movement.

    Science.gov (United States)

    Farahani, N Darestani; Davani, F Abbasi

    2015-10-01

    This investigation concerns plasma modeling for the control of vertical instabilities in Damavand tokamak. The model is based on online magnetic measurements. The algebraic equation defining the vertical position in this model is based on instantaneous force balance. Two parameters in this equation, the decay index, n, and lambda, Λ, have been considered as functions of the time-varying poloidal field coil currents and the plasma current. These functions have then been used in a code generated for modeling the open-loop response of the plasma. The main restriction on the suitability analysis of the model is that the experiments always have to be performed in the presence of a control loop stabilizing the vertical position. As a result, the open-loop response of the system has been identified from closed-loop experimental data by a nonlinear neural network identification method. Comparison of the physical model with the open-loop response identified from closed-loop experiments shows a root-mean-square error of less than 10%. The results confirm that the physical model is useful as a simulator of vertical movement in Damavand tokamak.

  17. Equilibrium and out-of-equilibrium thermodynamics in supercooled liquids and glasses

    International Nuclear Information System (INIS)

    Mossa, S; Nave, E La; Tartaglia, P; Sciortino, F

    2003-01-01

    We review the inherent-structure thermodynamic formalism and the formulation of an equation of state (EOS) for liquids in equilibrium based on the (volume) derivatives of the statistical properties of the potential energy surface. We also show that, under the hypothesis that during ageing the system explores states associated with equilibrium configurations, it is possible to generalize the proposed EOS to out-of-equilibrium (OOE) conditions. The proposed formulation is based on the introduction of one additional parameter which, in the chosen thermodynamic formalism, can be chosen as the local minimum where the slowly relaxing OOE liquid is trapped.

  18. A method for statistical comparison of data sets and its uses in analysis of nuclear physics data

    International Nuclear Information System (INIS)

    Bityukov, S.I.; Smirnova, V.V.; Krasnikov, N.V.; Maksimushkina, A.V.; Nikitenko, A.N.

    2014-01-01

    The authors propose a method for the statistical comparison of two data sets, based on the statistical comparison of histograms. As an estimator of the quality of the decision made, they propose to use the probability that the decision (that the data sets differ) is correct.
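
The abstract gives no formulas, but the simplest instance of histogram comparison is a two-sample chi-square statistic over shared bins. A hedged sketch (this is the generic textbook test, not necessarily the authors' estimator):

```python
import random

def chi2_two_histograms(a, b):
    """Two-sample chi-square statistic for histograms with equal totals:
    sum over bins of (a_i - b_i)^2 / (a_i + b_i), skipping empty bins."""
    return sum((x - y) ** 2 / (x + y) for x, y in zip(a, b) if x + y > 0)

def histogram(samples, nbins=10):
    """Bin samples from [0, 1) into nbins equal-width bins."""
    counts = [0] * nbins
    for s in samples:
        counts[min(int(s * nbins), nbins - 1)] += 1
    return counts

random.seed(0)
n = 2000
h_uniform_1 = histogram(random.random() for _ in range(n))
h_uniform_2 = histogram(random.random() for _ in range(n))
h_peaked = histogram(random.betavariate(4, 4) for _ in range(n))  # bell-shaped on [0, 1]

same = chi2_two_histograms(h_uniform_1, h_uniform_2)  # ~ number of bins under the null
diff = chi2_two_histograms(h_uniform_1, h_peaked)     # much larger for different parents
print(f"same distribution: chi2 = {same:.1f}; different distribution: chi2 = {diff:.1f}")
```

Under the null hypothesis the statistic follows (approximately) a chi-square distribution with nbins - 1 degrees of freedom, which is what turns it into the decision probability the abstract mentions.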

  19. The automated design of materials far from equilibrium

    Science.gov (United States)

    Miskin, Marc Z.

    density. We examine how the results of a design process are contingent upon operating conditions by studying which shapes dissipate energy fastest in a granular gas. We even move to create optimization algorithms for the expressed purpose of material design, by integrating them with statistical mechanics. In all of these cases, we show that turning to machines puts a fresh perspective on materials far from equilibrium. By matching forms to functions, complexities become possibilities, motifs emerge that describe new physics, and the door opens to rational design.

  20. Immunity by equilibrium.

    Science.gov (United States)

    Eberl, Gérard

    2016-08-01

    The classical model of immunity posits that the immune system reacts to pathogens and injury and restores homeostasis. Indeed, a century of research has uncovered the means and mechanisms by which the immune system recognizes danger and regulates its own activity. However, this classical model does not fully explain complex phenomena, such as tolerance, allergy, the increased prevalence of inflammatory pathologies in industrialized nations and immunity to multiple infections. In this Essay, I propose a model of immunity that is based on equilibrium, in which the healthy immune system is always active and in a state of dynamic equilibrium between antagonistic types of response. This equilibrium is regulated both by the internal milieu and by the microbial environment. As a result, alteration of the internal milieu or microbial environment leads to immune disequilibrium, which determines tolerance, protective immunity and inflammatory pathology.

  1. Neutrino statistics: elementary problems and some applications

    Energy Technology Data Exchange (ETDEWEB)

    Kuchowicz, B

    1973-01-01

    The treatment covers neutrinos in statistical equilibrium, mathematical refinements, applications to stars, relic neutrinos in cosmology, and some unsolved problems and prospects. (JFP)

  2. Scaling studies of spheromak formation and equilibrium

    International Nuclear Information System (INIS)

    Geddes, C.G.; Kornack, T.W.; Brown, M.R.

    1998-01-01

    Formation and equilibrium studies have been performed on the Swarthmore Spheromak Experiment (SSX). Spheromaks are formed with a magnetized coaxial plasma gun, and equilibrium is established in both small (d_small = 0.16 m) and large (d_large = 3 d_small = 0.50 m) copper flux conservers. Using magnetic probe arrays it has been verified that spheromak formation is governed solely by gun physics (in particular the ratio of gun current to flux, μ0 I_gun/Φ_gun) and is independent of the flux conserver dimensions. It has also been verified that the equilibrium is well described by the force-free condition ∇×B = λB (λ = constant), particularly early in the decay. Departures from the force-free state are due to current profile effects described by a quadratic function λ = λ(ψ). Force-free SSX spheromaks will be merged to study magnetic reconnection in simple magnetofluid structures. copyright 1998 American Institute of Physics
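
The force-free condition can be checked directly on the classical Bessel-function (Taylor-state) profile B_r = 0, B_theta = B0 J1(λr), B_z = B0 J0(λr), for which ∇×B = λB holds exactly. A small numerical verification (the profile is the textbook z-independent cylindrical solution with an illustrative λ, not SSX data):

```python
import math

def bessel_j(n, x, steps=2000):
    """J_n(x) from the integral representation (1/pi) * int_0^pi cos(n*t - x*sin(t)) dt."""
    h = math.pi / steps
    total = 0.5 * (math.cos(0.0) + math.cos(n * math.pi))  # trapezoid endpoints
    for k in range(1, steps):
        t = k * h
        total += math.cos(n * t - x * math.sin(t))
    return total * h / math.pi

LAM, B0 = 2.0, 1.0  # illustrative eigenvalue and amplitude, not SSX values

def b_theta(r):
    """Azimuthal field of the cylindrical Taylor state: B0 * J1(lambda * r)."""
    return B0 * bessel_j(1, LAM * r)

def b_z(r):
    """Axial field of the cylindrical Taylor state: B0 * J0(lambda * r)."""
    return B0 * bessel_j(0, LAM * r)

def curl_theta(r, h=1e-3):
    """(curl B)_theta = -dB_z/dr for this axisymmetric, z-independent profile."""
    return -(b_z(r + h) - b_z(r - h)) / (2 * h)

def curl_z(r, h=1e-3):
    """(curl B)_z = (1/r) * d(r * B_theta)/dr."""
    return ((r + h) * b_theta(r + h) - (r - h) * b_theta(r - h)) / (2 * h * r)

for r in (0.3, 0.6, 0.9):
    print(f"r = {r}: (curl B)_theta / B_theta = {curl_theta(r) / b_theta(r):.4f}, "
          f"(curl B)_z / B_z = {curl_z(r) / b_z(r):.4f}  (both should equal {LAM})")
```

Both finite-difference ratios reproduce λ at every radius, which is exactly the constancy of λ that the probe arrays test for in the experiment.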

  3. Equilibrium shoreface profiles

    DEFF Research Database (Denmark)

    Aagaard, Troels; Hughes, Michael G

    2017-01-01

    Large-scale coastal behaviour models use the shoreface profile of equilibrium as a fundamental morphological unit that is translated in space to simulate coastal response to, for example, sea level oscillations and variability in sediment supply. Despite a longstanding focus on the shoreface...... profile and its relevance to predicting coastal response to changing environmental conditions, the processes and dynamics involved in shoreface equilibrium are still not fully understood. Here, we apply a process-based empirical sediment transport model, combined with morphodynamic principles to provide......; there is no tuning or calibration and computation times are short. It is therefore easily implemented with repeated iterations to manage uncertainty....

  4. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  5. Statistical mechanics of two-dimensional and geophysical flows

    International Nuclear Information System (INIS)

    Bouchet, Freddy; Venaille, Antoine

    2012-01-01

    The theoretical study of the self-organization of two-dimensional and geophysical turbulent flows is addressed based on statistical mechanics methods. This review is a self-contained presentation of classical and recent works on this subject; from the statistical mechanics basis of the theory up to applications to Jupiter’s troposphere and ocean vortices and jets. Emphasis has been placed on examples with available analytical treatment in order to favor a better understanding of the physics and dynamics. After a brief presentation of the 2D Euler and quasi-geostrophic equations, the specificity of two-dimensional and geophysical turbulence is emphasized. The equilibrium microcanonical measure is built from the Liouville theorem. Important statistical mechanics concepts (large deviations and mean field approach) and thermodynamic concepts (ensemble inequivalence and negative heat capacity) are briefly explained and described. On this theoretical basis, we predict the output of the long-time evolution of complex turbulent flows as statistical equilibria. This is applied to make quantitative models of two-dimensional turbulence, the Great Red Spot and other Jovian vortices, ocean jets like the Gulf-Stream, and ocean vortices. A detailed comparison between these statistical equilibria and real flow observations is provided. We also present recent results for non-equilibrium situations, for the studies of either the relaxation towards equilibrium or non-equilibrium steady states. In this last case, forces and dissipation are in a statistical balance; fluxes of conserved quantity characterize the system and microcanonical or other equilibrium measures no longer describe the system.

  6. Equilibrium distribution function in collisionless systems

    International Nuclear Information System (INIS)

    Pergamenshchik, V.M.

    1988-01-01

    Collisionless systems of a large number N of particles interacting through Coulomb forces are widespread in cosmic and laboratory plasmas. A statistical theory of the equilibrium state of collisionless Coulomb systems whose evolution obeys the Vlasov equation is proposed. The developed formalism permits a consistent treatment of such systems, distributed in the one-particle six-dimensional phase space, and yields a simple result: the equilibrium distribution function has the form of the Fermi-Dirac distribution and does not depend on the details of the initial state.
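
The Fermi-Dirac form referred to here, f(E) = 1 / (exp((E - μ)/T) + 1), is easy to probe numerically. A minimal sketch (units with k_B = 1; the chemical potential and temperature values are illustrative):

```python
import math

def fermi_dirac(energy, mu=1.0, temperature=0.1):
    """Occupation number f(E) = 1 / (exp((E - mu)/T) + 1), with k_B = 1."""
    return 1.0 / (math.exp((energy - mu) / temperature) + 1.0)

# Characteristic properties of the distribution:
print(fermi_dirac(1.0))   # at E = mu the occupation is exactly 1/2
print(fermi_dirac(0.0))   # deep below mu: occupation close to 1
print(fermi_dirac(2.0))   # far above mu: occupation close to 0

# Lowering T sharpens the step around mu (degenerate limit):
print(fermi_dirac(0.99, temperature=0.001))  # ~1 just below mu
print(fermi_dirac(1.01, temperature=0.001))  # ~0 just above mu
```

The particle-hole symmetry f(mu - d) + f(mu + d) = 1 holds at any temperature, and the T -> 0 limit is the step function characteristic of a fully degenerate system.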

  7. A Progression of Static Equilibrium Laboratory Exercises

    Science.gov (United States)

    Kutzner, Mickey; Kutzner, Andrew

    2013-01-01

    Although simple architectural structures like bridges, catwalks, cantilevers, and Stonehenge have been integral in human societies for millennia, as have levers and other simple tools, modern students of introductory physics continue to grapple with Newton's conditions for static equilibrium. As formulated in typical introductory physics…
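
Newton's two conditions, sum of forces = 0 and sum of torques = 0, are enough to solve the classic catwalk and bridge problems these exercises build on. A minimal worked example (the beam length, load, and position are made-up numbers):

```python
def support_reactions(beam_length, load, load_position):
    """Reactions R_left, R_right for a massless beam on two end supports
    carrying one point load, from sum(F) = 0 and sum(torque) = 0 about the left end."""
    # Torque balance about the left support: R_right * L - W * x = 0
    r_right = load * load_position / beam_length
    # Force balance: R_left + R_right = W
    r_left = load - r_right
    return r_left, r_right

# 4 m catwalk, 600 N person standing 1 m from the left support
r_left, r_right = support_reactions(4.0, 600.0, 1.0)
print(r_left, r_right)  # 450.0 150.0: the nearer support carries more of the load
```

Choosing the torque pivot at one support eliminates that support's unknown reaction, which is the standard trick students practice in these laboratory exercises.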

  8. Static Equilibrium Configurations of Charged Metallic Bodies ...

    African Journals Online (AJOL)

    In this paper we developed a simple numerical scheme to determine the static equilibrium configuration of charged metallic bodies by minimizing the potential energy function. The method developed has some advantages; it combines the general theory and the physical meanings nested in the mathematical model and this ...
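
As an illustration of the minimize-the-potential-energy approach (a generic sketch, not the paper's numerical scheme): point charges confined to a circular wire, relaxed by finite-difference gradient descent, settle into the equally spaced equilibrium configuration.

```python
import math

def energy(angles):
    """Coulomb energy of unit charges pinned to a unit circle (chord distances)."""
    e = 0.0
    for i in range(len(angles)):
        for j in range(i + 1, len(angles)):
            d = 2.0 * abs(math.sin((angles[i] - angles[j]) / 2.0))  # chord length
            e += 1.0 / d
    return e

def relax(angles, steps=3000, lr=0.01, h=1e-5):
    """Minimize energy() by gradient descent with central-difference gradients."""
    angles = list(angles)
    for _ in range(steps):
        grad = []
        for i in range(len(angles)):
            up, dn = angles[:], angles[:]
            up[i] += h
            dn[i] -= h
            grad.append((energy(up) - energy(dn)) / (2 * h))
        angles = [a - lr * g for a, g in zip(angles, grad)]
    return angles

start = [0.0, 0.5, 1.2, 4.0]        # arbitrary initial angles in radians
final = relax(start)

ang = sorted(a % (2 * math.pi) for a in final)
gaps = [b - a for a, b in zip(ang, ang[1:])] + [2 * math.pi - ang[-1] + ang[0]]
print([round(g, 3) for g in gaps])  # four charges equilibrate to equal gaps of pi/2
```

For four charges the converged energy equals the analytic value 4/sqrt(2) + 1 of the square configuration, which is the kind of consistency check a minimization scheme for charged conductors should pass.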

  9. Microeconomics : Equilibrium and Efficiency

    NARCIS (Netherlands)

    Ten Raa, T.

    2013-01-01

    Microeconomics: Equilibrium and Efficiency teaches how to apply microeconomic theory in an innovative, intuitive and concise way. Using real-world, empirical examples, this book not only covers the building blocks of the subject, but helps gain a broad understanding of microeconomic theory and

  10. Differential Equation of Equilibrium

    African Journals Online (AJOL)

    user

    ABSTRACT. Analysis of an underground circular cylindrical shell is carried out in this work. The fourth-order differential equation of equilibrium, comparable to that of a beam on an elastic foundation, was derived from static principles on the assumptions of P. L. Pasternak. Laplace transformation was used to solve the governing ...
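
For context, the governing equation referred to here has the standard beam-on-elastic-foundation form; on Pasternak's assumptions the foundation reaction adds a shear-layer term to the Winkler spring term. The textbook form (hedged in that the paper's exact derivation is not reproduced in the abstract) is

```latex
EI\,\frac{d^{4}w}{dx^{4}} \;-\; G_{p}\,\frac{d^{2}w}{dx^{2}} \;+\; k\,w \;=\; q(x)
```

where w is the transverse displacement, EI the flexural rigidity, k the subgrade modulus, G_p the Pasternak shear parameter, and q the applied load; setting G_p = 0 recovers the classical fourth-order beam-on-elastic-foundation equation, which is amenable to solution by Laplace transformation.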

  11. Incorporation of quantum statistical features in molecular dynamics

    International Nuclear Information System (INIS)

    Ohnishi, Akira; Randrup, J.

    1995-01-01

    We formulate a method for incorporating quantum fluctuations into molecular-dynamics simulations of many-body systems, such as those employed for energetic nuclear collision processes. Based on Fermi's Golden Rule, we allow spontaneous transitions to occur between the wave packets which are not energy eigenstates. The ensuing diffusive evolution in the space of the wave packet parameters exhibits appealing physical properties, including relaxation towards quantum-statistical equilibrium. (author)

  12. Comments on equilibrium, transient equilibrium, and secular equilibrium in serial radioactive decay

    International Nuclear Information System (INIS)

    Prince, J.R.

    1979-01-01

    Equations describing serial radioactive decay are reviewed along with published descriptions of transient and secular equilibrium. It is shown that the terms describing equilibrium are not used in the same way by various authors. Specific definitions are proposed; they suggest that secular equilibrium is a subset of transient equilibrium.
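
The distinction can be made concrete with the two-member Bateman solution: for decay constants lam1 < lam2, the daughter-to-parent activity ratio approaches the constant lam2/(lam2 - lam1) (transient equilibrium), and when lam2 >> lam1 that constant tends to 1 (secular equilibrium, consistent with the subset view above). A sketch with illustrative decay constants:

```python
import math

def daughter_parent_activity_ratio(lam1, lam2, t, n1_0=1.0):
    """A2/A1 from the two-member Bateman equations (requires lam1 != lam2)."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return (lam2 * n2) / (lam1 * n1)

# Transient equilibrium: lam2 a few times lam1
lam1, lam2 = 0.1, 0.5
ratio = daughter_parent_activity_ratio(lam1, lam2, t=50.0)
print(ratio, lam2 / (lam2 - lam1))  # ratio has converged to lam2/(lam2-lam1) = 1.25

# Secular equilibrium: lam2 >> lam1, so the ratio tends to 1
print(daughter_parent_activity_ratio(1e-6, 1.0, t=20.0))
```

The constant limiting ratio is what both equilibrium regimes share; secular equilibrium is just the special case in which that constant is indistinguishable from unity.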

  13. Correlation between physical examination and intraoperative findings in shoulder disease treated by arthroscopy. Statistical analysis of 150 patients.

    Science.gov (United States)

    García Parra, P; Anaya Rojas, M; Jiménez Bravo, B; González Oria, M O; Lisbona Muñoz, M; Gil Álvarez, J J; Cano Luis, P

    2016-01-01

    Only a few clinical exploratory manoeuvres are truly discriminatory and useful in shoulder disease. The aim of this study is to correlate the physical examination results of the shoulder with the true diagnosis found by arthroscopy. A retrospective case series of 150 patients with the most common surgical conditions of the shoulder. Data were collected on the suspicion of each pathology, the physical examination of the patient, and the actual discovery of the disease during arthroscopic surgery. The examination manoeuvres for the Bankart lesion show the best results, with a 92.1% positive predictive value (PPV) and a 99.1% negative predictive value (NPV), followed by the impingement syndrome, with a PPV of 94.4%, and complete cuff rupture, with a PPV of 92.3%. Exploration of the superior labrum anterior to posterior (SLAP) lesion had an NPV of 99.1%. Physical examination is sufficient to diagnose or rule out a Bankart lesion. A positive physical examination suggests complete rupture of the rotator cuff, which requires further studies. Patients suspected of subacromial syndrome only need an MRI if the physical tests are negative. The conclusions drawn from this work can have a significant impact both on cost savings (by reducing further tests) and on saving time in certain cases in which, after an appropriate physical examination, surgery may be indicated without losing time in intermediate steps. Copyright © 2016 SECOT. Published by Elsevier España, S.L.U. All rights reserved.

  14. Predictors for physical activity in adolescent girls using statistical shrinkage techniques for hierarchical longitudinal mixed effects models.

    Directory of Open Access Journals (Sweden)

    Edward M Grant

    Full Text Available We examined associations among longitudinal, multilevel variables and girls' physical activity to determine the important predictors of physical activity change at different adolescent ages. The Trial of Activity for Adolescent Girls 2 study (Maryland) contributed participants from 8th (2009) to 11th grade (2011) (n=561). Questionnaires were used to obtain demographic and psychosocial information (individual- and social-level variables); height, weight, and triceps skinfold to assess body composition; interviews and surveys for school-level data; and self-report for neighborhood-level variables. Moderate-to-vigorous physical activity minutes were assessed from accelerometers. A doubly regularized linear mixed effects model was used for the longitudinal multilevel data to identify the most important covariates of physical activity. Three fixed effects at the individual level and one random effect at the school level were chosen from an initial total of 66 variables, consisting of 47 fixed-effect and 19 random-effect variables, in addition to the time effect. Self-management strategies, perceived barriers, and social support from friends were the three selected fixed effects, and whether intramural or interscholastic programs were offered in middle school was the selected random effect. Psychosocial factors and friend support, plus a school's physical activity environment, affect adolescent girls' moderate-to-vigorous physical activity longitudinally.

  15. Analysis of non-equilibrium phenomena in inductively coupled plasma generators

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, W.; Panesi, M., E-mail: mpanesi@illinois.edu [University of Illinois at Urbana-Champaign, Urbana, Illinois 61822 (United States); Lani, A. [Von Karman Institute for Fluid Dynamics, Rhode-Saint-Genèse (Belgium)

    2016-07-15

    This work addresses the modeling of non-equilibrium phenomena in inductively coupled plasma discharges. In the proposed computational model, the electromagnetic induction equation is solved together with the set of Navier-Stokes equations in order to compute the electromagnetic and flow fields, accounting for their mutual interaction. Semi-classical statistical thermodynamics is used to determine the plasma thermodynamic properties, while transport properties are obtained from kinetic principles, with the method of Chapman and Enskog. Particle ambipolar diffusive fluxes are found by solving the Stefan-Maxwell equations with a simple iterative method. Two physico-mathematical formulations are used to model the chemical reaction processes: (1) a local thermodynamic equilibrium (LTE) formulation and (2) a thermo-chemical non-equilibrium (TCNEQ) formulation. In the TCNEQ model, thermal non-equilibrium between the translational energy mode of the gas and the vibrational energy mode of individual molecules is accounted for. The electronic states of the chemical species are assumed in equilibrium with the vibrational temperature, whereas the rotational energy mode is assumed to be equilibrated with translation. Three different physical models are used to account for the coupling of chemistry and energy transfer processes. Numerical simulations obtained with the LTE and TCNEQ formulations are used to characterize the extent of non-equilibrium of the flow inside the Plasmatron facility at the von Karman Institute. Each model was tested using different kinetic mechanisms to assess the sensitivity of the results to variations in the reaction parameters. A comparison of temperatures and composition profiles at the outlet of the torch demonstrates that the flow is in non-equilibrium for operating conditions characterized by pressures below 30 000 Pa, frequency 0.37 MHz, input power 80 kW, and mass flow 8 g/s.
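
The translational-vibrational non-equilibrium invoked here is often modelled, in its simplest form, by Landau-Teller relaxation, dT_v/dt = (T_tr - T_v)/tau. A hedged minimal sketch with made-up temperatures and relaxation time (not the paper's coupled Navier-Stokes model):

```python
def relax_vibrational(t_tr=8000.0, t_v0=300.0, tau=1e-4, dt=1e-6, t_end=1e-3):
    """Euler-integrate Landau-Teller relaxation dTv/dt = (T_tr - Tv)/tau.
    All values (temperatures in K, times in s) are illustrative assumptions."""
    t_v = t_v0
    history = [t_v]
    for _ in range(int(t_end / dt)):
        t_v += (t_tr - t_v) / tau * dt
        history.append(t_v)
    return history

history = relax_vibrational()
print(history[0], round(history[-1], 1))  # cold vibrational mode relaxes toward T_tr
```

The flow is "in non-equilibrium" precisely when the residence time in the torch is comparable to or shorter than tau, so the vibrational temperature has not yet caught up with the translational one.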

  16. Statistical and off-equilibrium production of fragments in heavy ion collisions at intermediate energies; Production statistique et hors-equilibre de fragments dans les collisions d'ions lourdes aux energies intermediaires

    Energy Technology Data Exchange (ETDEWEB)

    Bocage, Frederic [Lab. de Physique Corpusculaire, Caen Univ., 14 - Caen (France)

    1998-12-15

    The study of reaction products, fragments and light charged particles, emitted during heavy-ion collisions at intermediate energies has shown the dominant binary dissipative character of the reaction, which persists for almost all impact parameters. However, in comparison with this purely binary process, an excess of nuclear matter is observed in-between the quasi-projectile and the quasi-target. To understand the mechanisms producing such an excess, this work studies more precisely the breakup into two fragments of the quasi-projectile formed in Xe+Sn, from 25 to 50 MeV/u, and Gd+C and Gd+U at 36 MeV/u. The data were obtained during the first INDRA experiment at GANIL. The angular distributions of the two fragments show the competition between statistical fission and non-equilibrated breakup of the quasi-projectile. In the second case, the two fragments are aligned along the separation axis of the two primary partners. The comparison of the fission directions and probabilities with statistical models allows us to measure the fission time, as well as the angular momentum, temperature and size of the fissioning residue. The relative velocities are compatible with Coulomb and thermal effects in the case of statistical fission and are found to be much higher for the breakup of a non-equilibrated quasi-projectile, which indicates that the projectile was deformed during interaction with the target. Such deformations should be compared with dynamical calculations in order to constrain the viscosity of nuclear matter and the parameters of the nucleon-nucleon interaction, (author) 148 refs., 77 figs., 11 tabs.

  17. Statistical mechanics rigorous results

    CERN Document Server

    Ruelle, David

    1999-01-01

    This classic book marks the beginning of an era of vigorous mathematical progress in equilibrium statistical mechanics. Its treatment of the infinite system limit has not been superseded, and the discussion of thermodynamic functions and states remains basic for more recent work. The conceptual foundation provided by the Rigorous Results remains invaluable for the study of the spectacular developments of statistical mechanics in the second half of the 20th century.

  18. Statistical mechanics

    CERN Document Server

    Jana, Madhusudan

    2015-01-01

    Statistical mechanics is self-sufficient, written in a lucid manner, keeping in mind the exam system of the universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. Non-interacting ideal Bose and Fermi gases are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed-matter physics, transport phenomena (thermal and electrical conductivity, Hall effect, magnetoresistance, viscosity, diffusion, etc.) are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...

  19. Recent advances in mathematical criminology. Comment on "Statistical physics of crime: A review" by M.R. D'Orsogna and M. Perc

    Science.gov (United States)

    Rodríguez, Nancy

    2015-03-01

    The use of mathematical tools has long proved to be useful in gaining understanding of complex systems in physics [1]. Recently, many researchers have realized that there is an analogy between emerging phenomena in complex social systems and complex physical or biological systems [4,5,12]. This realization has particularly benefited the modeling and understanding of crime, a ubiquitous phenomenon that is far from being understood. In fact, when one is interested in the bulk behavior of patterns that emerge from small and seemingly unrelated interactions as well as decisions that occur at the individual level, the mathematical tools that have been developed in statistical physics, game theory, network theory, dynamical systems, and partial differential equations can be useful in shedding light on the dynamics of these patterns [2-4,6,12].

  20. Equilibrium and pre-equilibrium emissions in proton-induced ...

    Indian Academy of Sciences (India)

    necessary for the domain of fission-reactor technology for the calculation of nuclear transmutation ... reactions occur in three stages: INC, pre-equilibrium and equilibrium (or compound) ... In the evaporation phase of the reaction, the ...