Causality: Statistical Perspectives and Applications
Berzuini, Carlo; Bernardinelli, Luisa
2012-01-01
A state-of-the-art volume on statistical causality, Causality: Statistical Perspectives and Applications presents a wide-ranging collection of seminal contributions by renowned experts in the field, providing a thorough treatment of all aspects of statistical causality. It covers the various formalisms in current use, methods for applying them to specific problems, and the special requirements of a range of examples from medicine, biology and economics to political science. This book: Provides a clear account and comparison of formal languages, concepts and models for statistical causality. Addr
Maximally causal quantum mechanics
International Nuclear Information System (INIS)
Roy, S.M.
1998-01-01
We present a new causal quantum mechanics in one and two dimensions developed recently at TIFR by this author and V. Singh. In this theory both position and momentum for a system point have Hamiltonian evolution in such a way that the ensemble of system points leads to position and momentum probability densities agreeing exactly with ordinary quantum mechanics. (author)
Explaining through causal mechanisms
Biesbroek, Robbert; Dupuis, Johann; Wellstead, Adam
2017-01-01
This paper synthesizes and builds on recent critiques of the resilience literature, namely that the field has largely been unsuccessful in capturing the complexity of governance processes, in particular cause-effect relationships. We demonstrate that the absence of a causal model is reflected in the
Davidson, Norman
2003-01-01
Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody
Schwabl, Franz
2006-01-01
The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...
Jana, Madhusudan
2015-01-01
This statistical mechanics text is self-sufficient, written in a lucid manner with the university examination system in mind. The need to study this subject and its relation to thermodynamics are discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, together with their ranges of application and limitations. The non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. The properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, and transport phenomena - thermal and electrical conductivity, the Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with a detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...
Quantum mechanics, relativity and causality
International Nuclear Information System (INIS)
Tati, Takao.
1975-07-01
In quantum mechanics, the state is prepared by a measurement on a space-like surface sigma. What is it that determines the surface sigma on which the measurement prepares the state? It is considered to be either a mechanism proper to the measuring process (apparatus) or a universal property of space-time. In the former case, problems arise concerning causality or the conservation of probability, because the velocity of the reduction of the wave-packet is considered to exceed the velocity of light. The theory of a finite degree of freedom proposed previously belongs to the latter case. In this theory, the surface sigma is restricted to the hyper-plane perpendicular to a universal time-like vector governing causal relations. We propose an experiment to discriminate between the above-mentioned two cases and to test the existence of the universal time-like vector. (auth.)
Selecting appropriate cases when tracing causal mechanisms
DEFF Research Database (Denmark)
Beach, Derek; Pedersen, Rasmus Brun
2016-01-01
The last decade has witnessed a resurgence of interest in studying the causal mechanisms linking causes and outcomes in the social sciences. This article explores the overlooked implications for case selection when tracing mechanisms using in-depth case studies. Our argument is that existing case selection guidelines are appropriate for research aimed at making cross-case claims about causal relationships, where case selection is primarily used to control for other causes. However, existing guidelines are not in alignment with case-based research that aims to trace mechanisms, where the goal is to unpack the causal mechanism between X and Y, enabling causal inferences to be made because empirical evidence is provided for how the mechanism actually operated in a particular case. The in-depth, within-case tracing of how mechanisms operate in particular cases produces what can be termed mechanistic...
Temporal and Statistical Information in Causal Structure Learning
McCormack, Teresa; Frosch, Caren; Patrick, Fiona; Lagnado, David
2015-01-01
Three experiments examined children's and adults' abilities to use statistical and temporal information to distinguish between common cause and causal chain structures. In Experiment 1, participants were provided with conditional probability information and/or temporal information and asked to infer the causal structure of a 3-variable mechanical…
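The task in these experiments, distinguishing a common-cause structure from a causal chain, can be illustrated with a small simulation (all variable names and probabilities below are made up for illustration, not taken from the study). In a chain C → A → B, the root cause C is screened off from B once A is known; under a common cause A ← C → B, C still predicts B even given A.

```python
import random

random.seed(0)

def sample_chain(n):
    # Chain C -> A -> B: B depends on A only.
    out = []
    for _ in range(n):
        c = random.random() < 0.5
        a = random.random() < (0.9 if c else 0.2)
        b = random.random() < (0.8 if a else 0.1)
        out.append((c, a, b))
    return out

def sample_common_cause(n):
    # Common cause A <- C -> B: A and B each depend on C only.
    out = []
    for _ in range(n):
        c = random.random() < 0.5
        a = random.random() < (0.9 if c else 0.2)
        b = random.random() < (0.8 if c else 0.1)
        out.append((c, a, b))
    return out

def c_effect_on_b_given_a(data):
    # Largest shift in the estimate of P(B=1 | A=a, C=c) as C varies
    # while A is held fixed; near zero iff C is independent of B given A.
    diffs = []
    for a in (True, False):
        rates = []
        for c in (True, False):
            rows = [bb for cc, aa, bb in data if aa == a and cc == c]
            if rows:
                rates.append(sum(rows) / len(rows))
        if len(rates) == 2:
            diffs.append(abs(rates[0] - rates[1]))
    return max(diffs)

chain = sample_chain(20000)
common = sample_common_cause(20000)
print(c_effect_on_b_given_a(chain))   # small: the chain screens C off from B
print(c_effect_on_b_given_a(common))  # large: under a common cause it does not
```

Temporal order supplies the complementary cue the experiments exploit: in the chain, A changes before B; under the common cause, A and B both change only after C.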
Statistical causal inferences and their applications in public health research
Wu, Pan; Chen, Ding-Geng
2016-01-01
This book compiles and presents new developments in statistical causal inference. The accompanying data and computer programs are publicly available so readers may replicate the model development and data analysis presented in each chapter. In this way, methodology is taught so that readers may implement it directly. The book brings together experts engaged in causal inference research to present and discuss recent issues in causal inference methodological development. This is also a timely look at causal inference applied to scenarios that range from clinical trials to mediation and public health research more broadly. In an academic setting, this book will serve as a reference and guide to a course in causal inference at the graduate level (Master's or Doctorate). It is particularly relevant for students pursuing degrees in Statistics, Biostatistics and Computational Biology. Researchers and data analysts in public health and biomedical research will also find this book to be an important reference.
Causal localizations in relativistic quantum mechanics
Castrigiano, Domenico P. L.; Leiseifer, Andreas D.
2015-07-01
Causal localizations describe the position of quantum systems moving no faster than light. They are constructed for systems with finite spinor dimension. At the center of interest are the massive relativistic systems. For every positive mass there is a sequence of Dirac tensor-localizations, which provides a complete set of inequivalent irreducible causal localizations. They obey the principle of special relativity and are fully Poincaré covariant. The boosters are determined by the causal position operator and the other Poincaré generators. The localization with minimal spinor dimension is the Dirac localization. Thus, the Dirac equation is derived here as a mere consequence of the principle of causality. Moreover, the higher tensor-localizations, not known so far, follow from Dirac's localization by a simple construction. The probability of localization for positive energy states turns out to be described by causal positive operator valued (PO-) localizations, which are the traces of the causal localizations on the subspaces of positive energy. These causal Poincaré covariant PO-localizations, one for every irreducible massive relativistic system, were likewise not known before. They are shown to be separated. Hence, positive energy systems can be localized within every open region, by a suitable preparation, as accurately as desired. Finally, an attempt is made to provide an interpretation of the PO-localization operators within the frame of conventional quantum mechanics, attributing an important role to the negative energy states.
Klose, Christian D.
2013-01-01
A global catalog of small- to large-sized earthquakes was systematically analyzed to identify causal relationships and correlations between human-made mass shifts in the upper Earth's crust and the occurrence of earthquakes. The mass shifts, ranging between 1 kt and 1 Tt, result from large-scale geoengineering operations, including mining, water reservoirs, hydrocarbon production, fluid injection/extraction, deep geothermal energy production and coastal management. This article shows evidence that statistically significant geomechanical relationships exist between (a) the seismic moment magnitudes M of observed earthquakes, (b) the lateral distances of the earthquake hypocenters to the geoengineering "operation points" and (c) mass removals or accumulations on the Earth's crust. The statistical findings depend on uncertainties, in particular in the source parameter estimates of seismic events before instrumental recording. Statistical observations indicate, however, that every second seismic event tends to occur after a decade: the chance of an earthquake nucleating near an area with a significant mass shift is 25% after 2 years and 75% after 20 years, respectively. Moreover, causative effects of seismic activities depend strongly on the tectonic stress regime in which the operations take place (i.e., extensive, transverse or compressive). The results are summarized as follows: First, seismic moment magnitudes increase the more mass is locally shifted on the Earth's crust. Second, seismic moment magnitudes increase the larger the area of the crust that is geomechanically polluted. Third, reverse faults tend to be more trigger-sensitive than normal faults, due to a stronger alteration of the minimum vertical principal stress component. Pure strike-slip faults seem to rupture randomly and independently of the magnitude of the mass changes. Finally, mainly due to high estimation uncertainties of source parameters and, in particular, of shallow seismic events (events (>M6) seem to be triggered. The rupture
Whose statistical reasoning is facilitated by a causal structure intervention?
McNair, Simon; Feeney, Aidan
2015-02-01
People often struggle when making Bayesian probabilistic estimates on the basis of competing sources of statistical evidence. Recently, Krynski and Tenenbaum (Journal of Experimental Psychology: General, 136, 430-450, 2007) proposed that a causal Bayesian framework accounts for people's errors in Bayesian reasoning and showed that, by clarifying the causal relations among the pieces of evidence, judgments on a classic statistical reasoning problem could be significantly improved. We aimed to understand whose statistical reasoning is facilitated by the causal structure intervention. In Experiment 1, although we observed causal facilitation effects overall, the effect was confined to participants high in numeracy. We did not find an overall facilitation effect in Experiment 2 but did replicate the earlier interaction between numerical ability and the presence or absence of causal content. This effect held when we controlled for general cognitive ability and thinking disposition. Our results suggest that clarifying causal structure facilitates Bayesian judgments, but only for participants with sufficient understanding of basic concepts in probability and statistics.
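The "classic statistical reasoning problem" is a base-rate task: combining a low prior with an imperfect test. A minimal sketch of the normative Bayesian computation (the numbers are standard textbook illustrations, not the study's own materials):

```python
def posterior(prior, hit_rate, false_alarm_rate):
    """P(hypothesis | positive evidence) via Bayes' rule for a binary hypothesis."""
    p_positive = prior * hit_rate + (1 - prior) * false_alarm_rate
    return prior * hit_rate / p_positive

# Illustrative figures: 1% base rate, 80% hit rate, 9.6% false-alarm rate.
p = posterior(0.01, 0.80, 0.096)
print(round(p, 3))  # 0.078: most positive results are false alarms
```

Krynski and Tenenbaum's manipulation reframes the false-alarm rate as the effect of an explicit alternative cause (e.g. a benign condition), which is the causal structure clarification at issue here.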
Causal Inference for Statistics, Social, and Biomedical Sciences: An Introduction
Imbens, Guido W.; Rubin, Donald B.
2015-01-01
Most questions in social and biomedical sciences are causal in nature: what would happen to individuals, or to groups, if part of their environment were changed? In this groundbreaking text, two world-renowned experts present statistical methods for studying such questions. This book starts with the notion of potential outcomes, each corresponding…
Pearl, Judea
2000-03-01
Written by one of the pre-eminent researchers in the field, this book provides a comprehensive exposition of modern analysis of causation. It shows how causality has grown from a nebulous concept into a mathematical theory with significant applications in the fields of statistics, artificial intelligence, philosophy, cognitive science, and the health and social sciences. Pearl presents a unified account of the probabilistic, manipulative, counterfactual and structural approaches to causation, and devises simple mathematical tools for analyzing the relationships between causal connections, statistical associations, actions and observations. The book will open the way for including causal analysis in the standard curriculum of statistics, artificial intelligence, business, epidemiology, social science and economics. Students in these areas will find natural models, simple identification procedures, and precise mathematical definitions of causal concepts that traditional texts have tended to evade or make unduly complicated. This book will be of interest to professionals and students in a wide variety of fields. Anyone who wishes to elucidate meaningful relationships from data, predict effects of actions and policies, assess explanations of reported events, or form theories of causal understanding and causal speech will find this book stimulating and invaluable.
Equilibrium statistical mechanics
Jackson, E Atlee
2000-01-01
Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t
Lectures on statistical mechanics
Bowler, M G
1982-01-01
Anyone dissatisfied with the almost ritual dullness of many 'standard' texts in statistical mechanics will be grateful for the lucid explanation and generally reassuring tone. Aimed at securing firm foundations for equilibrium statistical mechanics, topics of great subtlety are presented transparently and enthusiastically. Very little mathematical preparation is required beyond elementary calculus and prerequisites in physics are limited to some elementary classical thermodynamics. Suitable as a basis for a first course in statistical mechanics, the book is an ideal supplement to more convent
Equilibrium statistical mechanics
Mayer, J E
1968-01-01
The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanics is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t
De Broglie's causal interpretations of quantum mechanics
International Nuclear Information System (INIS)
Ben-Dov, Y.
1989-01-01
In this article we trace the history of de Broglie's two causal interpretations of quantum mechanics, namely the double solution and the pilot wave theories, at the two periods in which he developed them: 1924-27 and 1952 onwards. Examining the reasons for which he always preferred the first theory to the second, reasons that are mainly concerned with the question of the physical nature of the quantum wave function, we try to show the continuity and the coherence of his underlying vision.
International Nuclear Information System (INIS)
Tonchev, N.; Shumovskij, A.S.
1986-01-01
The history of investigations conducted at the JINR in the field of statistical mechanics is presented, beginning with the fundamental works by N.N. Bogolyubov on the microscopic theory of superconductivity. The ideas introduced in these works, and the methods developed in them, have largely determined the development of statistical mechanics at the JINR, and the Hartree-Fock-Bogolyubov variational principle has become an important method of modern nuclear theory. A brief review is given of the main achievements connected with the development of statistical mechanics methods and their application in different fields of physical science.
Statistical mechanics of superconductivity
Kita, Takafumi
2015-01-01
This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensable for understanding it. They include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau’s Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...
Playing at Statistical Mechanics
Clark, Paul M.; And Others
1974-01-01
Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
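The counting behind these distributions can be sketched directly (a minimal example; the particle and state numbers are arbitrary): classical distinguishable particles give g^n arrangements of n particles over g states, bosons give the "multichoose" count, and fermions the ordinary binomial count.

```python
from math import comb

def maxwell_boltzmann(n, g):
    # Distinguishable particles: each of n particles independently picks one of g states.
    return g ** n

def bose_einstein(n, g):
    # Indistinguishable bosons, any occupancy allowed: C(n + g - 1, n).
    return comb(n + g - 1, n)

def fermi_dirac(n, g):
    # Indistinguishable fermions, at most one per state: C(g, n).
    return comb(g, n)

# Two particles distributed over three single-particle states:
print(maxwell_boltzmann(2, 3), bose_einstein(2, 3), fermi_dirac(2, 3))  # 9 6 3
```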
Statistical mechanics rigorous results
Ruelle, David
1999-01-01
This classic book marks the beginning of an era of vigorous mathematical progress in equilibrium statistical mechanics. Its treatment of the infinite system limit has not been superseded, and the discussion of thermodynamic functions and states remains basic for more recent work. The conceptual foundation provided by the Rigorous Results remains invaluable for the study of the spectacular developments of statistical mechanics in the second half of the 20th century.
Introducing mechanics by tapping core causal knowledge
International Nuclear Information System (INIS)
Klaassen, Kees; Westra, Axel; Emmett, Katrina; Eijkelhof, Harrie; Lijnse, Piet
2008-01-01
This article concerns an outline of an introductory mechanics course. It is based on the argument that various uses of the concept of force (e.g. from Kepler, Newton and everyday life) share an explanatory strategy based on core causal knowledge. The strategy consists of (a) the idea that a force causes a deviation from how an object would move of its own accord (i.e. its force-free motion), and (b) an incentive to search, where the motion deviates from the assumed force-free motion, for recurring configurations with which such deviations can be correlated (interaction theory). Various assumptions can be made concerning both the force-free motion and the interaction theory, thus giving rise to a variety of specific explanations. Kepler's semi-implicit intuition about the force-free motion is rest, Newton's explicit assumption is uniform rectilinear motion, while in everyday explanations a diversity of pragmatic suggestions can be recognized. The idea is that the explanatory strategy, once made explicit by drawing on students' intuitive causal knowledge, can be made to function for students as an advance organizer, in the sense of a general scheme that they recognize but do not yet know how to detail for scientific purposes
Statistical mechanics of anyons
International Nuclear Information System (INIS)
Arovas, D.P.
1985-01-01
We study the statistical mechanics of a two-dimensional gas of free anyons - particles which interpolate between Bose-Einstein and Fermi-Dirac character. Thermodynamic quantities are discussed in the low-density regime. In particular, the second virial coefficient is evaluated by two different methods and is found to exhibit a simple, periodic, but nonanalytic behavior as a function of the statistics determining parameter. (orig.)
Einstein's statistical mechanics
Energy Technology Data Exchange (ETDEWEB)
Baracca, A; Rechtman S, R
1985-08-01
The foundations of equilibrium classical statistical mechanics were laid down in 1902 independently by Gibbs and Einstein. The latter's contribution, developed in three papers published between 1902 and 1904, is usually forgotten and, when not, rapidly dismissed as equivalent to Gibbs's. We review in detail Einstein's ideas on the foundations of statistical mechanics and show that they constitute the beginning of a research program that led Einstein to quantum theory. We also show how these ideas may be used as a starting point for an introductory course on the subject.
Infinite Random Graphs as Statistical Mechanical Models
DEFF Research Database (Denmark)
Durhuus, Bergfinnur Jøgvan; Napolitano, George Maria
2011-01-01
We discuss two examples of infinite random graphs obtained as limits of finite statistical mechanical systems: a model of two-dimensional discretized quantum gravity defined in terms of causal triangulated surfaces, and the Ising model on generic random trees. For the former model we describe a ...
Statistical mechanics of solitons
International Nuclear Information System (INIS)
Bishop, A.
1980-01-01
The status of statistical mechanics theory (classical and quantum, statics and dynamics) is reviewed for 1-D soliton or solitary-wave-bearing systems. Primary attention is given to (i) perspective for existing results with evaluation and representative literature guide; (ii) motivation and status report for remaining problems; (iii) discussion of connections with other 1-D topics
Wave Mechanics or Wave Statistical Mechanics
International Nuclear Information System (INIS)
Qian Shangwu; Xu Laizi
2007-01-01
By comparing the equations of motion of geometrical optics with those of classical statistical mechanics, this paper finds that the analogy should be drawn between geometrical optics and classical statistical mechanics, rather than between geometrical optics and classical mechanics. Furthermore, by comparing the classical limit of quantum mechanics with classical statistical mechanics, it finds that the classical limit of quantum mechanics is classical statistical mechanics, not classical mechanics; hence quantum mechanics is a natural generalization of classical statistical mechanics rather than of classical mechanics. In its true appearance, then, quantum mechanics is a wave statistical mechanics rather than a wave mechanics.
Journey Through Statistical Mechanics
Yang, C. N.
2013-05-01
My first involvement with statistical mechanics and the many body problem was when I was a student at The National Southwest Associated University in Kunming during the war. At that time Professor Wang Zhu-Xi had just come back from Cambridge, England, where he was a student of Fowler, and his thesis was on phase transitions, a hot topic at that time, and still a very hot topic today...
Graphene Statistical Mechanics
Bowick, Mark; Kosmrlj, Andrej; Nelson, David; Sknepnek, Rastko
2015-03-01
Graphene provides an ideal system to test the statistical mechanics of thermally fluctuating elastic membranes. The high Young's modulus of graphene means that thermal fluctuations over even small length scales significantly stiffen the renormalized bending rigidity. We study the effect of thermal fluctuations on graphene ribbons of width W and length L, pinned at one end, via coarse-grained Molecular Dynamics simulations and compare with analytic predictions of the scaling of width-averaged root-mean-squared height fluctuations as a function of distance along the ribbon. Scaling collapse as a function of W and L also allows us to extract the scaling exponent eta governing the long-wavelength stiffening of the bending rigidity. A full understanding of the geometry-dependent mechanical properties of graphene, including arrays of cuts, may allow the design of a variety of modular elements with desired mechanical properties starting from pure graphene alone. Supported by NSF grant DMR-1435794
International Nuclear Information System (INIS)
Dienes, J.K.
1993-01-01
Although it is possible to simulate the ground blast from a single explosive shot with a simple computer algorithm and appropriate constants, the most commonly used modelling methods do not account for major changes in geology or shot energy because mechanical features such as tectonic stresses, fault structure, microcracking, brittle-ductile transition, and water content are not represented in significant detail. An alternative approach for modelling called Statistical Crack Mechanics is presented in this paper. This method, developed in the seventies as a part of the oil shale program, accounts for crack opening, shear, growth, and coalescence. Numerous photographs and micrographs show that shocked materials tend to involve arrays of planar cracks. The approach described here provides a way to account for microstructure and give a representation of the physical behavior of a material at the microscopic level that can account for phenomena such as permeability, fragmentation, shear banding, and hot-spot formation in explosives
Semiclassical statistical mechanics
International Nuclear Information System (INIS)
Stratt, R.M.
1979-04-01
On the basis of an approach devised by Miller, a formalism is developed which allows the nonperturbative incorporation of quantum effects into equilibrium classical statistical mechanics. The resulting expressions bear a close similarity to classical phase space integrals and, therefore, are easily molded into forms suitable for examining a wide variety of problems. As a demonstration of this, three such problems are briefly considered: the simple harmonic oscillator, the vibrational state distribution of HCl, and the density-independent radial distribution function of ⁴He. A more detailed study is then made of two more general applications involving the statistical mechanics of nonanalytic potentials and of fluids. The former, which is a particularly difficult problem for perturbative schemes, is treated with only limited success by restricting phase space and by adding an effective potential. The problem of fluids, however, is readily found to yield to a semiclassical pairwise interaction approximation, which in turn permits any classical many-body model to be expressed in a convenient form. The remainder of the discussion concentrates on some ramifications of having a phase space version of quantum mechanics. To test the breadth of the formulation, the task of constructing quantal ensemble averages of phase space functions is undertaken, and in the process several limitations of the formalism are revealed. A rather different approach is also pursued. The concept of quantum mechanical ergodicity is examined through the use of numerically evaluated eigenstates of the Barbanis potential, and the existence of this quantal ergodicity - normally associated with classical phase space - is verified. 21 figures, 4 tables
Child Care Subsidy Use and Child Development: Potential Causal Mechanisms
Hawkinson, Laura E.
2011-01-01
Research using an experimental design is needed to provide firm causal evidence on the impacts of child care subsidy use on child development, and on underlying causal mechanisms since subsidies can affect child development only indirectly via changes they cause in children's early experiences. However, before costly experimental research is…
To a causal formulation of quantum mechanics
International Nuclear Information System (INIS)
Brody, T.A.; Cetto, A.M.; Pena, L. de la
1979-01-01
This paper consists of two parts. In the first one we analyze the elements that a theory of quantum mechanics (QM) must contain in order to provide a physical explanation of the most notable quantum features (random behaviour, wave-particle duality, discrete spectra). We conclude that the theory that possesses the qualitative elements required is stochastic electrodynamics (SED), according to which the quantum behaviour of the electron arises from its interaction with the stochastic electromagnetic background field associated with the zero-point energy. In the second part we show that the postulates of SED are suitable for the construction of a theory of the motion of the electron from which QM may be derived as an approximate description; hence, the mathematical formalism of QM too is justified by SED. Thus, the present theory generalizes QM and, moreover, provides an objective statistical interpretation of it. (author)
Topics in statistical mechanics
International Nuclear Information System (INIS)
Elser, V.
1984-05-01
This thesis deals with four independent topics in statistical mechanics: (1) the dimer problem is solved exactly for a hexagonal lattice with general boundary using a known generating function from the theory of partitions. It is shown that the leading term in the entropy depends on the shape of the boundary; (2) continuum models of percolation and self-avoiding walks are introduced with the property that their series expansions are sums over linear graphs with intrinsic combinatorial weights and explicit dimension dependence; (3) a constrained SOS model is used to describe the edge of a simple cubic crystal. Low and high temperature results are derived as well as the detailed behavior near the crystal facet; (4) the microscopic model of the lambda-transition involving atomic permutation cycles is reexamined. In particular, a new derivation of the two-component field theory model of the critical behavior is presented. Results for a lattice model originally proposed by Kikuchi are extended with a high temperature series expansion and Monte Carlo simulation. 30 references
Statistical mechanics of nonequilibrium liquids
Evans, Denis J; Craig, D P; McWeeny, R
1990-01-01
Statistical Mechanics of Nonequilibrium Liquids deals with theoretical rheology. The book discusses nonlinear response of systems and outlines the statistical mechanical theory. In discussing the framework of nonequilibrium statistical mechanics, the book explains the derivation of a nonequilibrium analogue of the Gibbsian basis for equilibrium statistical mechanics. The book reviews the linear irreversible thermodynamics, the Liouville equation, and the Irving-Kirkwood procedure. The text then explains the Green-Kubo relations used in linear transport coefficients, the linear response theory,
Quantum causality conceptual issues in the causal theory of quantum mechanics
Riggs, Peter J; French, Steven RD
2009-01-01
This is a treatise devoted to the foundations of quantum physics and the role that causality plays in the microscopic world governed by the laws of quantum mechanics. The book is controversial and will engender some lively debate on the various issues raised.
Special Relativity, Causality and Quantum Mechanics - 1
Indian Academy of Sciences (India)
postulate of the special theory of relativity (STR) stipulating the ... STR may be a more general principle to organize our ... keep the laws of mechanics invariant in all inertial frames ... according to a different set of transformation equations.
Statistical fluid mechanics: mechanics of turbulence
Monin, A S
2007-01-01
"If ever a field needed a definitive book, it is the study of turbulence; if ever a book on turbulence could be called definitive, it is this book." - Science. Written by two of Russia's most eminent and productive scientists in turbulence, oceanography, and atmospheric physics, this two-volume survey is renowned for its clarity as well as its comprehensive treatment. The first volume begins with an outline of laminar and turbulent flow. The remainder of the book treats a variety of aspects of turbulence: its statistical and Lagrangian descriptions, shear flows near surfaces and free turbulence
Mechanism-Based Causal Reasoning in Young Children
Buchanan, David W.; Sobel, David M.
2011-01-01
The hypothesis that children develop an understanding of causal mechanisms was tested across 3 experiments. In Experiment 1 (N = 48), preschoolers had to choose as efficacious either a cause that had worked in the past, but was now disconnected from its effect, or a cause that had failed to work previously, but was now connected. Four-year-olds…
Statistical mechanics in a nutshell
Peliti, Luca
2011-01-01
Statistical mechanics is one of the most exciting areas of physics today, and it also has applications to subjects as diverse as economics, social behavior, algorithmic theory, and evolutionary biology. Statistical Mechanics in a Nutshell offers the most concise, self-contained introduction to this rapidly developing field. Requiring only a background in elementary calculus and elementary mechanics, this book starts with the basics, introduces the most important developments in classical statistical mechanics over the last thirty years, and guides readers to the very threshold of today
Statistical Evaluation of Causal Factors Associated with Astronaut Shoulder Injury in Space Suits.
Anderson, Allison P; Newman, Dava J; Welsch, Roy E
2015-07-01
Shoulder injuries due to working inside the space suit are some of the most serious and debilitating injuries astronauts encounter. Space suit injuries occur primarily in the Neutral Buoyancy Laboratory (NBL) underwater training facility due to accumulated musculoskeletal stress. We quantitatively explored the underlying causal mechanisms of injury. Logistic regression was used to identify relevant space suit components, training environment variables, and anthropometric dimensions related to an increased propensity for space-suited injury. Two groups of subjects were analyzed: those whose reported shoulder incident is attributable to the NBL or working in the space suit, and those whose shoulder incident began in active duty, meaning working in the suit could be a contributing factor. For both groups, percent of training performed in the space suit planar hard upper torso (HUT) was the most important predictor variable for injury. Frequency of training and recovery between training sessions were also significant metrics. The most relevant anthropometric dimensions were bideltoid breadth, expanded chest depth, and shoulder circumference. Finally, record of previous injury was found to be a relevant predictor for subsequent injury. The first statistical model correctly identifies 39% of injured subjects, while the second model correctly identifies 68% of injured subjects. A review of the literature suggests this is the first work to quantitatively evaluate the hypothesized causal mechanisms of all space-suited shoulder injuries. Although limited in predictive capability, each of the identified variables can be monitored and modified operationally to reduce future impacts on an astronaut's health.
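The logistic-regression step described in this abstract can be sketched as follows. The predictor names and coefficient values here are hypothetical placeholders chosen for illustration, not the fitted values reported in the study.

```python
import math

# Hypothetical coefficients for a logistic model of shoulder-injury risk.
# These are illustrative placeholders, NOT the values estimated in the study.
COEF = {
    "pct_planar_hut": 0.04,        # % of training done in the planar HUT
    "trainings_per_month": 0.15,   # training frequency
    "bideltoid_breadth_cm": 0.05,  # anthropometric dimension
    "prior_injury": 1.2,           # 0/1 record of previous injury
}
INTERCEPT = -6.0

def injury_probability(subject):
    """P(injury) = 1 / (1 + exp(-(intercept + x . beta)))."""
    z = INTERCEPT + sum(COEF[k] * v for k, v in subject.items())
    return 1.0 / (1.0 + math.exp(-z))

subject = {"pct_planar_hut": 60, "trainings_per_month": 4,
           "bideltoid_breadth_cm": 50, "prior_injury": 1}
p = injury_probability(subject)
print(f"predicted injury probability: {p:.3f}")
```

In practice the coefficients themselves would be estimated by maximum likelihood over the subject data with a standard statistics package, and model quality assessed by the fraction of injured subjects correctly identified, as in the abstract.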
Introduction to quantum statistical mechanics
International Nuclear Information System (INIS)
Bogolyubov, N.N.; Bogolyubov, N.N.
1980-01-01
In this set of lectures, delivered at the Physics Department of Moscow State University as a special course for students, some basic ideas of quantum statistical mechanics are presented. Considered in particular are the Liouville equations in classical and quantum mechanics, the canonical distribution and thermodynamical functions, two-time correlation functions, and Green's functions in the theory of thermal equilibrium
Quantum mechanics without statistical postulates
International Nuclear Information System (INIS)
Geiger, G.
2000-01-01
The Bohmian formulation of quantum mechanics describes the measurement process in an intuitive way, without a reduction postulate. Owing to the chaotic motion of the hidden classical particle, all statistical features of quantum mechanics during a sequence of repeated measurements can be derived in the framework of a deterministic single-system theory
Renyi statistics in equilibrium statistical mechanics
International Nuclear Information System (INIS)
Parvan, A.S.; Biro, T.S.
2010-01-01
Renyi statistics in the canonical and microcanonical ensembles is examined, both in general and in particular for the ideal gas. In the microcanonical ensemble, Renyi statistics is equivalent to Boltzmann-Gibbs statistics. Using exact analytical results for the ideal gas, it is shown that in the canonical ensemble, in the thermodynamic limit, Renyi statistics is likewise equivalent to Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e. the thermodynamical potential of the statistical ensemble is a homogeneous function of first degree in its extensive variables of state. We conclude that Renyi statistics arrives at the same thermodynamical relations as those stemming from Boltzmann-Gibbs statistics in this limit.
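A minimal numerical sketch of the equivalence discussed above: for the uniform (microcanonical) distribution over W states the Renyi entropy equals ln W for every order q, and for q → 1 it reduces to the Boltzmann-Gibbs (Shannon) entropy. The distributions below are illustrative.

```python
import math

def renyi_entropy(p, q):
    """Renyi entropy S_q = ln(sum_i p_i^q) / (1 - q), for q != 1."""
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

def gibbs_shannon_entropy(p):
    """Boltzmann-Gibbs (Shannon) entropy S = -sum_i p_i ln p_i."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Equal a-priori probabilities (microcanonical ensemble with W states):
W = 16
uniform = [1.0 / W] * W

# For the uniform distribution S_q = ln W for every q, so the Renyi and
# Boltzmann-Gibbs results coincide -- the equivalence noted above.
for q in (0.5, 2.0, 5.0):
    assert abs(renyi_entropy(uniform, q) - math.log(W)) < 1e-12

# For a non-uniform distribution, S_q approaches S as q -> 1:
p2 = [0.5, 0.25, 0.125, 0.125]
assert abs(renyi_entropy(p2, 1.0 + 1e-8) - gibbs_shannon_entropy(p2)) < 1e-5
print("Renyi entropy checks passed")
```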
Quantum mechanics from classical statistics
International Nuclear Information System (INIS)
Wetterich, C.
2010-01-01
Quantum mechanics can emerge from classical statistics. A typical quantum system describes an isolated subsystem of a classical statistical ensemble with infinitely many classical states. The state of this subsystem can be characterized by only a few probabilistic observables. Their expectation values define a density matrix if they obey a 'purity constraint'. Then all the usual laws of quantum mechanics follow, including Heisenberg's uncertainty relation, entanglement and a violation of Bell's inequalities. No concepts beyond classical statistics are needed for quantum physics - the differences are only apparent and result from the particularities of those classical statistical systems which admit a quantum mechanical description. Born's rule for quantum mechanical probabilities follows from the probability concept for a classical statistical ensemble. In particular, we show how the non-commuting properties of quantum operators are associated to the use of conditional probabilities within the classical system, and how a unitary time evolution reflects the isolation of the subsystem. As an illustration, we discuss a classical statistical implementation of a quantum computer.
Modern Thermodynamics with Statistical Mechanics
Helrich, Carl S
2009-01-01
With the aim of presenting thermodynamics in as simple and as unified a form as possible, this textbook starts with an introduction to the first and second laws and then promptly addresses the complete set of the potentials in a subsequent chapter and as a central theme throughout. Before discussing modern laboratory measurements, the book shows that the fundamental quantities sought in the laboratory are those which are required for determining the potentials. Since the subjects of thermodynamics and statistical mechanics are a seamless whole, statistical mechanics is treated as an integral part of the text. Other key topics such as irreversibility, the ideas of Ilya Prigogine, chemical reaction rates, equilibrium of heterogeneous systems, and transition-state theory serve to round out this modern treatment. An additional chapter covers quantum statistical mechanics, owing to active current research in Bose-Einstein condensation. End-of-chapter exercises, chapter summaries, and an appendix reviewing fundamental pr...
A possible realization of Einstein's causal theory underlying quantum mechanics
International Nuclear Information System (INIS)
Yussouff, M.
1979-06-01
It is shown that a new microscopic mechanics formulated earlier can be looked upon as a possible causal theory underlying quantum mechanics, one which removes Einstein's famous objections against quantum theory. This approach is free from the objections raised against Bohm's hidden-variable theory and leads to a clear physical picture in terms of familiar concepts, if self-interactions are held responsible for deviations from classical behaviour. The new level of physics unfolded by this approach may reveal novel frontiers in high-energy physics. (author)
Global and local aspects of causality in quantum mechanics
Directory of Open Access Journals (Sweden)
Stoica Cristinel
2013-09-01
Quantum mechanics forces us to reconsider certain aspects of classical causality. The ‘central mystery’ of quantum mechanics manifests in different ways, depending on the interpretation. This mystery can be formulated as the possibility of selecting part of the initial conditions of the Universe ‘retroactively’. This talk aims to show that there is a global, timeless, ‘bird’s view’ of the spacetime, which makes this mystery more reasonable. We will review some well-known quantum effects from the perspective of global consistency.
Statistical ensembles in quantum mechanics
International Nuclear Information System (INIS)
Blokhintsev, D.
1976-01-01
The interpretation of quantum mechanics presented in this paper is based on the concept of quantum ensembles. This concept differs essentially from the canonical one in that the intervention of the observer in the state of a microscopic system is of no greater importance than in any other field of physics. Owing to this fact, the laws established by quantum mechanics are no less objective in character than the laws governing classical statistical mechanics. The paradoxical nature of some statements of quantum mechanics, which results from interpreting the wave function as the observer's notebook, greatly stimulated the development of the idea presented. (Auth.)
Some speculations on a causal unification of relativity, gravitation, and quantum mechanics
Energy Technology Data Exchange (ETDEWEB)
Buonomano, V; Engel, A [Universidade Estadual de Campinas (Brazil). Instituto de Matematica
1976-03-01
Some speculations on a causal model that could provide a common conceptual foundation for relativity, gravitation, and quantum mechanics are presented. The present approach is a unification of three theories. The first is the repulsive theory of gravitational forces proposed by Lesage, who attempted to explain gravitational forces from the principle of conservation of momentum of hypothetical particles, gravitons. The second is the Brownian-motion theory of quantum mechanics, or stochastic mechanics, which treats the nondeterministic nature of quantum mechanics as being due to a Brownian motion of all objects, this Brownian motion being caused by the statistical variation in the graviton flux. These two theories are unified in this article with the causal theory of special relativity. The Big Bang theory of the creation of the Universe is assumed. An experimental test is proposed.
Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.
Gopnik, Alison; Wellman, Henry M
2012-11-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.
Statistical mechanics of multipartite entanglement
Facchi, P.; Florio, G.; Marzolino, U.; Parisi, G.; Pascazio, S.
2009-02-01
We characterize the multipartite entanglement of a system of n qubits in terms of the distribution function of the bipartite purity over all balanced bipartitions. We search for those (maximally multipartite entangled) states whose purity is minimum for all bipartitions and recast this optimization problem into a problem of statistical mechanics.
Statistical mechanics of multipartite entanglement
Energy Technology Data Exchange (ETDEWEB)
Facchi, P [Dipartimento di Matematica, Universita di Bari, I-70125 Bari (Italy); Florio, G; Pascazio, S [Istituto Nazionale di Fisica Nucleare, Sezione di Bari, I-70126 Bari (Italy); Marzolino, U [Dipartimento di Fisica Teorica, Universita di Trieste, Strada Costiera 11, 34014 Trieste (Italy); Parisi, G [Dipartimento di Fisica, Universita di Roma 'La Sapienza', Piazzale Aldo Moro 2, 00185 Roma, Italy; Centre for Statistical Mechanics and Complexity (SMC), CNR-INFM, 00185 Roma (Italy)]
2009-02-06
We characterize the multipartite entanglement of a system of n qubits in terms of the distribution function of the bipartite purity over all balanced bipartitions. We search for those (maximally multipartite entangled) states whose purity is minimum for all bipartitions and recast this optimization problem into a problem of statistical mechanics.
Statistical mechanics of multipartite entanglement
International Nuclear Information System (INIS)
Facchi, P; Florio, G; Pascazio, S; Marzolino, U; Parisi, G
2009-01-01
We characterize the multipartite entanglement of a system of n qubits in terms of the distribution function of the bipartite purity over all balanced bipartitions. We search for those (maximally multipartite entangled) states whose purity is minimum for all bipartitions and recast this optimization problem into a problem of statistical mechanics
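The quantity these three records minimize, the bipartite purity Tr ρ_A² evaluated over balanced bipartitions, can be computed by brute force for a few qubits. A pure-Python sketch (exponential in n, for illustration only):

```python
import itertools

def reduced_purity(psi, part_a, n):
    """Brute-force Tr(rho_A^2) for the qubits listed in part_a.

    psi: list of 2**n amplitudes; bit q of the basis index holds the
    state of qubit q.
    """
    part_b = [q for q in range(n) if q not in part_a]
    da, db = 2 ** len(part_a), 2 ** len(part_b)

    # Reshape the state vector into a da x db matrix M; the reduced
    # density matrix is then rho_A = M M^dagger.
    M = [[0j] * db for _ in range(da)]
    for idx, amp in enumerate(psi):
        a = sum(((idx >> q) & 1) << i for i, q in enumerate(part_a))
        b = sum(((idx >> q) & 1) << i for i, q in enumerate(part_b))
        M[a][b] = amp

    rho = [[sum(M[i][k] * M[j][k].conjugate() for k in range(db))
            for j in range(da)] for i in range(da)]
    # For Hermitian rho_A, Tr(rho_A^2) = sum_ij |rho_ij|^2.
    return sum(abs(x) ** 2 for row in rho for x in row)

# 4-qubit GHZ state (|0000> + |1111>)/sqrt(2): every balanced bipartition
# has reduced purity 1/2 (up to floating-point rounding).
n = 4
psi = [0.0] * 2 ** n
psi[0] = psi[-1] = 2 ** -0.5
purities = [reduced_purity(psi, list(pair), n)
            for pair in itertools.combinations(range(n), 2)]
print(purities)
```

A maximally multipartite entangled state in the sense of the abstract would minimize this purity simultaneously over all balanced bipartitions; the optimization the authors recast as a statistical mechanics problem runs over such states.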
Statistical Mechanics of Prion Diseases
International Nuclear Information System (INIS)
Slepoy, A.; Singh, R. R. P.; Pazmandi, F.; Kulkarni, R. V.; Cox, D. L.
2001-01-01
We present a two-dimensional, lattice-based, protein-level statistical mechanical model for prion diseases (e.g., mad cow disease) with concomitant prion protein misfolding and aggregation. Our studies lead us to the hypothesis that the broad incubation-time distribution observed in epidemiological data reflects fluctuation-dominated growth seeded by a few nanometer-scale aggregates, while the much narrower incubation-time distributions for inoculated lab animals arise from statistical self-averaging. We model 'species barriers' to prion infection and assess a related treatment protocol
Statistical mechanics of black holes
International Nuclear Information System (INIS)
Harms, B.; Leblanc, Y.
1992-01-01
We analyze the statistical mechanics of a gas of neutral and charged black holes. The microcanonical ensemble is the only possible approach to this system, and the equilibrium configuration is the one for which most of the energy is carried by a single black hole. Schwarzschild black holes are found to obey the statistical bootstrap condition. In all cases, the microcanonical temperature is identical to the Hawking temperature of the most massive black hole in the gas. U(1) charges in general break the bootstrap property. The problems of black-hole decay and of quantum coherence are also addressed
Nonequilibrium statistical mechanics ensemble method
Eu, Byung Chan
1998-01-01
In this monograph, nonequilibrium statistical mechanics is developed by means of ensemble methods on the basis of the Boltzmann equation, the generic Boltzmann equations for classical and quantum dilute gases, and a generalised Boltzmann equation for dense simple fluids. The theories are developed in forms parallel with the equilibrium Gibbs ensemble theory, in a way fully consistent with the laws of thermodynamics. The generalised hydrodynamics equations are an integral part of the theory and describe the evolution of macroscopic processes in accordance with the laws of thermodynamics of systems far removed from equilibrium. Audience: This book will be of interest to researchers in the fields of statistical mechanics, condensed matter physics, gas dynamics, fluid dynamics, rheology, irreversible thermodynamics and nonequilibrium phenomena
Statistical mechanics of complex networks
Rubi, Miguel; Diaz-Guilera, Albert
2003-01-01
Networks provide a useful model and graphic image for the description of a wide variety of web-like structures in the physical and man-made realms, e.g. protein networks, food webs and the Internet. The contributions gathered in the present volume provide both an introduction to, and an overview of, the multifaceted phenomenology of complex networks. Statistical Mechanics of Complex Networks also provides a state-of-the-art picture of current theoretical methods and approaches.
Integrating functional data to prioritize causal variants in statistical fine-mapping studies.
Directory of Open Access Journals (Sweden)
Gleb Kichaev
2014-10-01
Standard statistical approaches for prioritization of variants for functional testing in fine-mapping studies either use marginal association statistics or estimate posterior probabilities for variants to be causal under simplifying assumptions. Here, we present a probabilistic framework that integrates association strength with functional genomic annotation data to improve accuracy in selecting plausible causal variants for functional validation. A key feature of our approach is that it empirically estimates the contribution of each functional annotation to the trait of interest directly from summary association statistics while allowing for multiple causal variants at any risk locus. We devise efficient algorithms that estimate the parameters of our model across all risk loci to further increase performance. Using simulations starting from the 1000 Genomes data, we find that our framework consistently outperforms the current state-of-the-art fine-mapping methods, reducing the number of variants that need to be selected to capture 90% of the causal variants from an average of 13.3 to 10.4 SNPs per locus (as compared to the next-best performing strategy). Furthermore, we introduce a cost-to-benefit optimization framework for determining the number of variants to be followed up in functional assays and assess its performance using real and simulation data. We validate our findings using a large scale meta-analysis of four blood lipids traits and find that the relative probability for causality is increased for variants in exons and transcription start sites and decreased in repressed genomic regions at the risk loci of these traits. Using these highly predictive, trait-specific functional annotations, we estimate causality probabilities across all traits and variants, reducing the size of the 90% confidence set from an average of 17.5 to 13.5 variants per locus in this data.
Statistical mechanics of economics I
Energy Technology Data Exchange (ETDEWEB)
Kusmartsev, F.V., E-mail: F.Kusmartsev@lboro.ac.u [Department of Physics, Loughborough University, Leicestershire, LE11 3TU (United Kingdom)
2011-02-07
We show that statistical mechanics is useful in the description of financial crises and economics. Taking a large number of instant snapshots of a market over an interval of time, we construct their ensembles and study their statistical inference. This results in a probabilistic description of the market and gives the capital, money, income, wealth and debt distributions, which in most cases take the form of the Bose-Einstein distribution. In addition, statistical mechanics provides the main market equations and laws which govern the correlations between the amount of money, debt, product, prices and number of retailers. We applied the relations found to a study of the evolution of the US economy between 1996 and 2008 and observe that over that time the income of the majority of the population is well described by the Bose-Einstein distribution, whose parameters differ from year to year. Each financial crisis corresponds to a peak in the absolute activity coefficient. The analysis correctly indicates the past crises and predicts the future one.
Statistical mechanics of economics I
International Nuclear Information System (INIS)
Kusmartsev, F.V.
2011-01-01
We show that statistical mechanics is useful in the description of financial crises and economics. Taking a large number of instant snapshots of a market over an interval of time, we construct their ensembles and study their statistical inference. This results in a probabilistic description of the market and gives the capital, money, income, wealth and debt distributions, which in most cases take the form of the Bose-Einstein distribution. In addition, statistical mechanics provides the main market equations and laws which govern the correlations between the amount of money, debt, product, prices and number of retailers. We applied the relations found to a study of the evolution of the US economy between 1996 and 2008 and observe that over that time the income of the majority of the population is well described by the Bose-Einstein distribution, whose parameters differ from year to year. Each financial crisis corresponds to a peak in the absolute activity coefficient. The analysis correctly indicates the past crises and predicts the future one.
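The central fit in this abstract uses the Bose-Einstein occupation law. A sketch of evaluating it, with illustrative parameters (not the values fitted in the paper) standing in for the paper's income variables:

```python
import math

def bose_einstein(eps, mu, temperature):
    """Mean occupation n(eps) = 1 / (exp((eps - mu)/T) - 1), for eps > mu."""
    return 1.0 / (math.exp((eps - mu) / temperature) - 1.0)

# Illustrative parameters only -- NOT the values fitted in the paper.
mu, T = 0.0, 20.0  # 'chemical potential' and 'temperature' of the market
income_levels = [10, 20, 40, 80, 160]
occupation = [bose_einstein(x, mu, T) for x in income_levels]

# Occupation falls monotonically with income, approaching the Boltzmann
# tail exp(-eps/T) at high income.
assert all(a > b for a, b in zip(occupation, occupation[1:]))
for x, n in zip(income_levels, occupation):
    print(f"income {x:4d}: occupation {n:.4f}")
```

Fitting mu and T year by year, as the authors describe, would amount to a least-squares or maximum-likelihood fit of this curve to the observed income histogram.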
Statistical mechanics and Lorentz violation
International Nuclear Information System (INIS)
Colladay, Don; McDonald, Patrick
2004-01-01
The theory of statistical mechanics is studied in the presence of Lorentz-violating background fields. The analysis is performed using the Standard-Model Extension (SME) together with a Jaynesian formulation of statistical inference. Conventional laws of thermodynamics are obtained in the presence of a perturbed Hamiltonian that contains the Lorentz-violating terms. As an example, properties of the nonrelativistic ideal gas are calculated in detail. To lowest order in Lorentz violation, the scalar thermodynamic variables are only corrected by a rotationally invariant combination of parameters that mimics a (frame-dependent) effective mass. Spin couplings can induce a temperature-independent polarization in the classical gas that is not present in the conventional case. Precision measurements of the residual expectation values of the magnetic moment of Fermi gases in the limit of high temperature may provide interesting limits on these parameters
Statistical mechanics of cellular automata
International Nuclear Information System (INIS)
Wolfram, S.
1983-01-01
Cellular automata are used as simple mathematical models to investigate self-organization in statistical mechanics. A detailed analysis is given of 'elementary' cellular automata consisting of a sequence of sites with values 0 or 1 on a line, with each site evolving deterministically in discrete time steps according to definite rules involving the values of its nearest neighbors. With simple initial configurations, the cellular automata either tend to homogeneous states or generate self-similar patterns with fractal dimensions of approximately 1.59 or 1.69. With 'random' initial configurations, the irreversible character of the cellular automaton evolution leads to several self-organization phenomena. Statistical properties of the structures generated are found to lie in two universality classes, independent of the details of the initial state or the cellular automaton rules. More complicated cellular automata are briefly considered, and connections with dynamical systems theory and the formal theory of computation are discussed
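The 'elementary' automata described above are easy to reproduce. A minimal sketch: rule 90 (each new site is the XOR of its two neighbours), grown from a single seed, yields the self-similar Sierpinski pattern whose fractal dimension is log 3 / log 2 ≈ 1.585, one of the two values quoted.

```python
def step(cells, rule):
    """One synchronous update of an elementary cellular automaton with
    periodic boundary: the new value of each site is the bit of `rule`
    indexed by the 3-bit neighbourhood (left, centre, right)."""
    n = len(cells)
    return [(rule >> (4 * cells[(i - 1) % n]
                      + 2 * cells[i]
                      + cells[(i + 1) % n])) & 1
            for i in range(n)]

# Rule 90 from a single seed generates the self-similar Sierpinski
# pattern with fractal dimension log 3 / log 2.
width, steps = 31, 15
row = [0] * width
row[width // 2] = 1
history = [row]
for _ in range(steps):
    row = step(row, 90)
    history.append(row)
for r in history:
    print("".join("#" if c else "." for c in r))
```

The same `step` function, run from a random initial row, exhibits the irreversible self-organizing behaviour the abstract describes.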
Statistical mechanics and field theory
International Nuclear Information System (INIS)
Samuel, S.A.
1979-05-01
Field theory methods are applied to statistical mechanics. Statistical systems are related to fermionic-like field theories through a path integral representation. Considered are the Ising model, the free-fermion model, and close-packed dimer problems on various lattices. Graphical calculational techniques are developed. They are powerful and yield a simple procedure to compute the vacuum expectation value of an arbitrary product of Ising spin variables. From a field theorist's point of view, this is the simplest most logical derivation of the Ising model partition function and correlation functions. This work promises to open a new area of physics research when the methods are used to approximate unsolved problems. By the above methods a new model named the 128 pseudo-free vertex model is solved. Statistical mechanics intuition is applied to field theories. It is shown that certain relativistic field theories are equivalent to classical interacting gases. Using this analogy many results are obtained, particularly for the Sine-Gordon field theory. Quark confinement is considered. Although not a proof of confinement, a logical, esthetic, and simple picture is presented of how confinement works. A key ingredient is the insight gained by using an analog statistical system consisting of a gas of macromolecules. This analogy allows the computation of Wilson loops in the presence of topological vortices and when symmetry breakdown occurs in the topological quantum number. Topological symmetry breakdown calculations are placed on approximately the same level of rigor as instanton calculations. The picture of confinement that emerges is similar to the dual Meissner type advocated by Mandelstam. Before topological symmetry breakdown, QCD has monopoles bound linearly together by three topological strings. Topological symmetry breakdown corresponds to a new phase where these monopoles are liberated. It is these liberated monopoles that confine quarks. 64 references
Introduction to quantum statistical mechanics
Bogolyubov, N N
2010-01-01
Introduction to Quantum Statistical Mechanics (Second Edition) may be used as an advanced textbook by graduate students, and even by ambitious undergraduates in physics. It is also suitable for non-experts in physics who wish to have an overview of some of the classic and fundamental quantum models in the subject. The explanation in the book is detailed enough to capture the interest of the reader, and complete enough to provide the necessary background material needed to delve further into the subject and explore the research literature.
Annotations to quantum statistical mechanics
Kim, In-Gee
2018-01-01
This book is a rewritten and annotated version of Leo P. Kadanoff and Gordon Baym's lectures, presented in the book Quantum Statistical Mechanics: Green's Function Methods in Equilibrium and Nonequilibrium Problems. The lectures were devoted to a discussion of the use of thermodynamic Green's functions in describing the properties of many-particle systems. The functions provided a method for discussing finite-temperature problems with no more conceptual difficulty than ground-state problems, and the method was equally applicable to boson and fermion systems and to equilibrium and nonequilibrium problems. The lectures also explained nonequilibrium statistical physics in a systematic way and contained essential concepts of statistical physics in terms of Green's functions, with sufficient and rigorous detail. In-Gee Kim thoroughly studied the lectures during one of his research projects but found that the unspecialized method used to present them in the form of a book reduced their readability. He st...
Statistical mechanics of violent relaxation
International Nuclear Information System (INIS)
Shu, F.H.
1978-01-01
We reexamine the foundations of Lynden-Bell's statistical mechanical discussion of violent relaxation in collisionless stellar systems. We argue that Lynden-Bell's formulation in terms of a continuum description introduces unnecessary complications, and we consider a more conventional formulation in terms of particles. We then find the exclusion principle discovered by Lynden-Bell to be quantitatively important only at phase densities where two-body encounters are no longer negligible. Since the dynamical basis for the exclusion principle vanishes in such cases anyway, Lynden-Bell statistics always reduces in practice to Maxwell-Boltzmann statistics when applied to stellar systems. Lynden-Bell also found the equilibrium distribution function generally to be a sum of Maxwellians with velocity dispersions dependent on the phase density at star formation. We show that this difficulty vanishes in the particulate description for an encounterless stellar system as long as stars of different masses are initially well mixed in phase space. Our methods also demonstrate the equivalence between Gibbs's formalism, which uses the microcanonical ensemble, and Boltzmann's formalism, which uses a coarse-grained continuum description. In addition, we clarify the concept of irreversible behavior on a macroscopic scale for an encounterless stellar system. Finally, we comment on the use of unusual macroscopic constraints to simulate the effects of incomplete relaxation
Statistical mechanics of program systems
International Nuclear Information System (INIS)
Neirotti, Juan P; Caticha, Nestor
2006-01-01
We discuss the collective behaviour of a set of operators and variables that constitute a program and the emergence of meaningful computational properties in the language of statistical mechanics. This is done by appropriately modifying available Monte Carlo methods to deal with hierarchical structures. The study suggests, in analogy with simulated annealing, a method to automatically design programs. Reasonable solutions can be found, at low temperatures, when the method is applied to simple toy problems such as finding an algorithm that determines the roots of a function or one that makes a nonlinear regression. Peaks in the specific heat are interpreted as signalling phase transitions which separate regions where different algorithmic strategies are used to solve the problem
Statistical Mechanics of Turbulent Flows
International Nuclear Information System (INIS)
Cambon, C
2004-01-01
This is a handbook for a computational approach to reacting flows, including background material on statistical mechanics. In this sense, the title is somewhat misleading with respect to other books dedicated to the statistical theory of turbulence (e.g. Monin and Yaglom). In the present book, emphasis is placed on modelling (engineering closures) for computational fluid dynamics. The probabilistic (pdf) approach is applied to the local scalar field, motivated first by the nonlinearity of chemical source terms which appear in the transport equations of reacting species. The probabilistic and stochastic approaches are also used for the velocity field and particle position; nevertheless they are essentially limited to Lagrangian models for a local vector, with only single-point statistics, as for the scalar. Accordingly, conventional techniques, such as single-point closures for RANS (Reynolds-averaged Navier-Stokes) and subgrid-scale models for LES (large-eddy simulations), are described and in some cases reformulated using underlying Langevin models and filtered pdfs. Even if the theoretical approach to turbulence is not discussed in general, the essentials of probabilistic and stochastic-processes methods are described, with a useful reminder concerning statistics at the molecular level. The book comprises 7 chapters. Chapter 1 briefly states the goals and contents, with a very clear synoptic scheme on page 2. Chapter 2 presents definitions and examples of pdfs and related statistical moments. Chapter 3 deals with stochastic processes, pdf transport equations, from Kramers-Moyal to Fokker-Planck (for Markov processes), and moment equations. Stochastic differential equations are introduced and their relationship to pdfs described. This chapter ends with a discussion of stochastic modelling. The equations of fluid mechanics and thermodynamics are addressed in chapter 4. Classical conservation equations (mass, velocity, internal energy) are derived from their
Statistical Mechanics of Turbulent Dynamos
Shebalin, John V.
2014-01-01
Incompressible magnetohydrodynamic (MHD) turbulence and magnetic dynamos, which occur in magnetofluids with large fluid and magnetic Reynolds numbers, will be discussed. When Reynolds numbers are large and energy decays slowly, the distribution of energy with respect to length scale becomes quasi-stationary and MHD turbulence can be described statistically. In the limit of infinite Reynolds numbers, viscosity and resistivity become zero and if these values are used in the MHD equations ab initio, a model system called ideal MHD turbulence results. This model system is typically confined in simple geometries with some form of homogeneous boundary conditions, allowing for velocity and magnetic field to be represented by orthogonal function expansions. One advantage to this is that the coefficients of the expansions form a set of nonlinearly interacting variables whose behavior can be described by equilibrium statistical mechanics, i.e., by a canonical ensemble theory based on the global invariants (energy, cross helicity and magnetic helicity) of ideal MHD turbulence. Another advantage is that truncated expansions provide a finite dynamical system whose time evolution can be numerically simulated to test the predictions of the associated statistical mechanics. If ensemble predictions are the same as time averages, then the system is said to be ergodic; if not, the system is nonergodic. Although it had been implicitly assumed in the early days of ideal MHD statistical theory development that these finite dynamical systems were ergodic, numerical simulations provided sufficient evidence that they were, in fact, nonergodic. Specifically, while canonical ensemble theory predicted that expansion coefficients would be (i) zero-mean random variables with (ii) energy that decreased with length scale, it was found that although (ii) was correct, (i) was not and the expected ergodicity was broken. The exact cause of this broken ergodicity was explained, after much
A Causal Mechanism of Policy Innovation: The Reform of Colombia’s Oil-Rents Management System
Directory of Open Access Journals (Sweden)
Bayron Paz
2018-01-01
This article analyzes policy innovation in Colombia, through the adoption of a new centralized oil-rent management system in 2011, after 20 years of decentralized policies. Using a policy-design framework, we identify a causal mechanism linking the opening of a policy window to policy change as a combination of the emergence of a new policy network, the adoption of a new policy paradigm and the selection of a new instrument mix. Drawing on Bayesian statistics, the 11 tests conducted on the causal mechanism show the importance of State resources of information, authority, treasury and organization to assess the outcome of policy change.
Perceptual learning shapes multisensory causal inference via two distinct mechanisms.
McGovern, David P; Roudaia, Eugenie; Newell, Fiona N; Roach, Neil W
2016-04-19
To accurately represent the environment, our brains must integrate sensory signals from a common source while segregating those from independent sources. A reasonable strategy for performing this task is to restrict integration to cues that coincide in space and time. However, because multisensory signals are subject to differential transmission and processing delays, the brain must retain a degree of tolerance for temporal discrepancies. Recent research suggests that the width of this 'temporal binding window' can be reduced through perceptual learning; however, little is known about the mechanisms underlying these experience-dependent effects. Here, in separate experiments, we measure the temporal and spatial binding windows of human participants before and after training on an audiovisual temporal discrimination task. We show that training leads to two distinct effects on multisensory integration in the form of (i) a specific narrowing of the temporal binding window that does not transfer to spatial binding and (ii) a general reduction in the magnitude of crossmodal interactions across all spatiotemporal disparities. These effects arise naturally from a Bayesian model of causal inference in which learning improves the precision of audiovisual timing estimation, whilst concomitantly decreasing the prior expectation that stimuli emanate from a common source.
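The flavour of such a Bayesian causal-inference model can be sketched in a few lines. The Gaussian likelihoods and every parameter value below are illustrative assumptions, not the paper's fitted model; training is modelled as (i) a smaller sensory noise `sigma_noise` and (ii) a lower prior on a common source:

```python
import math

def normal_pdf(x, sigma):
    return math.exp(-0.5 * (x / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def p_common(disparity, sigma_noise, prior_common):
    """Posterior probability that audio and visual signals share one cause.

    Illustrative assumption: a common source predicts small temporal
    disparities (tight Gaussian), independent sources allow large ones
    (broad Gaussian, here arbitrarily 10x wider).
    """
    like_common = normal_pdf(disparity, sigma_noise)
    like_indep = normal_pdf(disparity, 10.0 * sigma_noise)
    num = prior_common * like_common
    return num / (num + (1.0 - prior_common) * like_indep)

# Effect (i): better timing precision (smaller sigma) narrows the binding window
assert p_common(disparity=200.0, sigma_noise=50.0, prior_common=0.5) < \
       p_common(disparity=200.0, sigma_noise=100.0, prior_common=0.5)
# Effect (ii): a lower common-source prior reduces integration at every disparity
for d in [0.0, 100.0, 300.0]:
    assert p_common(d, 100.0, 0.3) < p_common(d, 100.0, 0.5)
```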
International Nuclear Information System (INIS)
Buonomano, V.; Engel, A.
1974-10-01
Some speculations on a causal model that seems to provide a common conceptual foundation for Relativity, Gravitation and Quantum Mechanics are presented. The present approach is a unification of three theories. The first is the repulsive theory of gravitational forces first proposed by Lesage in the eighteenth century. The second is the Brownian Motion Theory of Quantum Mechanics, or Stochastic Mechanics, which treats the non-deterministic nature of Quantum Mechanics as being due to a Brownian motion of all objects, this Brownian motion being caused by the statistical variation in the graviton flux. These two theories are unified with the Causal Theory of Special Relativity. Within the present context, the time dilations (and other effects) of Relativity are explained by assuming that the rate of a clock is a function of the total number or intensity of gravitons and the average frequency or energy of the gravitons that the clock receives. The Special Theory would then be the special case of the General Theory where the intensity is constant but the average frequency varies. In all of the above it is necessary to assume a particular model of the creation of the universe, namely the Big Bang Theory. This assumption gives us the existence of a preferred reference frame, the frame in which the Big Bang explosion was at rest. The above concepts of graviton distribution and real time dilations become meaningful by assuming the Big Bang Theory along with this preferred frame. An experimental test is proposed.
Zenil, Hector
2018-02-18
To extract and learn representations leading to generative mechanisms from data, especially without making arbitrary decisions and biased assumptions, is a central challenge in most areas of scientific research, particularly in connection to current major limitations of influential topics and methods of machine and deep learning, which have often lost sight of the model component. Complex data are usually produced by interacting sources with different mechanisms. Here we introduce a parameter-free model-based approach, based upon the seminal concept of Algorithmic Probability, that decomposes an observation or signal into its most likely algorithmic generative mechanisms. Our methods use a causal calculus to infer model representations. We demonstrate the method's ability to distinguish interacting mechanisms and deconvolve them, regardless of whether the objects produce strings, space-time evolution diagrams, images or networks. We numerically test and evaluate our method and find that it can disentangle observations from discrete dynamical systems, random and complex networks. We think that these causal inference techniques can contribute key pieces of information for estimations of probability distributions, complementing other more statistically oriented techniques that otherwise lack model inference capabilities.
DEFF Research Database (Denmark)
Huber, Martin; Lechner, Michael; Mellace, Giovanni
Previous research found that less accommodating caseworkers are more successful in placing unemployed workers into employment. This paper tries to shed more light on the causal mechanisms behind this result using semiparametric mediation analysis. Analysing very informative linked jobseeker...
Quantum mechanics as applied mathematical statistics
International Nuclear Information System (INIS)
Skala, L.; Cizek, J.; Kapsa, V.
2011-01-01
The basic mathematical apparatus of quantum mechanics (the wave function, probability density, probability density current, coordinate and momentum operators, the corresponding commutation relation, the Schroedinger equation, kinetic energy, uncertainty relations and the continuity equation) is discussed from the point of view of mathematical statistics. It is shown that the basic structure of quantum mechanics can be understood as a generalization of classical mechanics in which the statistical character of results of measurement of the coordinate and momentum is taken into account and the most important general properties of statistical theories are correctly respected.
Statistical mechanics principles and selected applications
Hill, Terrell L
1956-01-01
"Excellent … a welcome addition to the literature on the subject." - Science. Before the publication of this standard, oft-cited book, there were few if any statistical-mechanics texts that incorporated reviews of both fundamental principles and recent developments in the field. In this volume, Professor Hill offers just such a dual presentation - a useful account of basic theory and of its applications, made accessible in a comprehensive format. The book opens with concise, unusually clear introductory chapters on classical statistical mechanics, quantum statistical mechanics and the relatio
mediation: R package for causal mediation analysis
Tingley, Dustin; Yamamoto, Teppei; Hirose, Kentaro; Keele, Luke; Imai, Kosuke
2012-01-01
In this paper, we describe the R package mediation for conducting causal mediation analysis in applied empirical research. In many scientific disciplines, the goal of researchers is not only estimating causal effects of a treatment but also understanding the process in which the treatment causally affects the outcome. Causal mediation analysis is frequently used to assess potential causal mechanisms. The mediation package implements a comprehensive suite of statistical tools for conducting su...
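For intuition, the product-of-coefficients logic behind linear mediation analysis can be sketched on simulated data. This is a Python caricature under linear-model assumptions; the mediation package itself implements a more general simulation-based approach, and the data-generating numbers here are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
# Hypothetical data-generating process: treatment -> mediator -> outcome
t = rng.integers(0, 2, n).astype(float)
m = 0.8 * t + rng.normal(0, 1, n)             # mediator model, effect a = 0.8
y = 0.5 * m + 0.3 * t + rng.normal(0, 1, n)   # outcome model, b = 0.5, direct = 0.3

def ols(X, target):
    """Least-squares coefficients for target ~ X."""
    return np.linalg.lstsq(X, target, rcond=None)[0]

ones = np.ones(n)
a = ols(np.column_stack([ones, t]), m)[1]              # effect of T on M
b, direct = ols(np.column_stack([ones, m, t]), y)[1:]  # effects of M and T on Y
acme = a * b            # average causal mediation effect (indirect effect)
total = acme + direct   # total effect decomposes into indirect + direct

assert abs(acme - 0.4) < 0.1 and abs(total - 0.7) < 0.1
```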
Thermalized solutions, statistical mechanics and turbulence
Indian Academy of Sciences (India)
2015-02-20
In this study, we examine the intriguing connection between turbulence and equilibrium statistical mechanics. There are several recent works which emphasize this connection. Thus in the last ...
Science Academies' Refresher Course in Statistical Mechanics
Indian Academy of Sciences (India)
2018-02-27
Post Graduate and Research Department of Physics, Bishop Moore ... The Course will cover the basic and advanced topics of Statistical Mechanics ... Courses of good standing for promotion, vide notification F3-1/2009 ...
Is there a statistical mechanics of turbulence?
International Nuclear Information System (INIS)
Kraichnan, R.H.; Chen, S.Y.
1988-09-01
The statistical-mechanical treatment of turbulence is made questionable by strong nonlinearity and strong disequilibrium that result in the creation of ordered structures imbedded in disorder. Model systems are described which may provide some hope that a compact, yet faithful, statistical description of turbulence nevertheless is possible. Some essential dynamic features of the models are captured by low-order statistical approximations despite strongly non-Gaussian behavior. 31 refs., 5 figs
Projection operator techniques in nonequilibrium statistical mechanics
International Nuclear Information System (INIS)
Grabert, H.
1982-01-01
This book is an introduction to the application of the projection operator technique to the statistical mechanics of irreversible processes. After a general introduction to the projection operator technique and statistical thermodynamics the Fokker-Planck and the master equation approach are described together with the response theory. Then, as applications the damped harmonic oscillator, simple fluids, and the spin relaxation are considered. (HSI)
Dediu, D.
2008-01-01
The causal correlations between human genetic variants and linguistic (typological) features could represent the mechanism required for gradual, accretionary models of language evolution. The causal link is mediated by the process of cultural transmission of language across generations in a population of genetically biased individuals. The particular case of Tone, ASPM and Microcephalin is discussed as an illustration. It is proposed that this type of genetically-influenced linguistic bias, c...
Rohrlich, Daniel
Y. Aharonov and A. Shimony both conjectured that two axioms - relativistic causality ("no superluminal signalling") and nonlocality - so nearly contradict each other that only quantum mechanics reconciles them. Can we indeed derive quantum mechanics, at least in part, from these two axioms? No: "PR-box" correlations show that quantum correlations are not the most nonlocal correlations consistent with relativistic causality. Here we replace "nonlocality" with "retrocausality" and supplement the axioms of relativistic causality and retrocausality with a natural and minimal third axiom: the existence of a classical limit, in which macroscopic observables commute. That is, just as quantum mechanics has a classical limit, so must any generalization of quantum mechanics. In this limit, PR-box correlations violate relativistic causality. Generalized to all stronger-than-quantum bipartite correlations, this result is a derivation of Tsirelson's bound (a theorem of quantum mechanics) from the three axioms of relativistic causality, retrocausality and the existence of a classical limit. Although the derivation does not assume quantum mechanics, it points to the Hilbert space structure that underlies quantum correlations. I thank the John Templeton Foundation (Project ID 43297) and the Israel Science Foundation (Grant No. 1190/13) for support.
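The three bounds at stake (local hidden variables at 2, Tsirelson's quantum bound at 2√2, and the PR box at 4) can be checked directly. A sketch with the standard CHSH settings; the cos(a - b) correlators are one conventional quantum realization:

```python
import math

def chsh(E):
    """CHSH combination S = E(0,0) + E(0,1) + E(1,0) - E(1,1)."""
    return E(0, 0) + E(0, 1) + E(1, 0) - E(1, 1)

# Deterministic local model: both parties always output +1, so every correlator is +1.
local = lambda x, y: 1.0

# Quantum correlations at the standard optimal measurement angles.
angles_a = [0.0, math.pi / 2]
angles_b = [math.pi / 4, -math.pi / 4]
quantum = lambda x, y: math.cos(angles_a[x] - angles_b[y])

# PR box: outputs satisfy a XOR b = x AND y, so E(x,y) = 1 except E(1,1) = -1.
pr_box = lambda x, y: -1.0 if (x, y) == (1, 1) else 1.0

assert abs(chsh(local)) <= 2.0                         # classical (local) bound
assert abs(chsh(quantum) - 2 * math.sqrt(2)) < 1e-9    # Tsirelson's bound
assert chsh(pr_box) == 4.0                             # maximal no-signalling value
```

The PR box respects relativistic causality (each party's marginal outputs are unbiased) yet exceeds the quantum value, which is exactly the gap the three-axiom derivation closes.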
The large deviation approach to statistical mechanics
International Nuclear Information System (INIS)
Touchette, Hugo
2009-01-01
The theory of large deviations is concerned with the exponential decay of probabilities of large fluctuations in random systems. These probabilities are important in many fields of study, including statistics, finance, and engineering, as they often yield valuable information about the large fluctuations of a random system around its most probable state or trajectory. In the context of equilibrium statistical mechanics, the theory of large deviations provides exponential-order estimates of probabilities that refine and generalize Einstein's theory of fluctuations. This review explores this and other connections between large deviation theory and statistical mechanics, in an effort to show that the mathematical language of statistical mechanics is the language of large deviation theory. The first part of the review presents the basics of large deviation theory, and works out many of its classical applications related to sums of random variables and Markov processes. The second part goes through many problems and results of statistical mechanics, and shows how these can be formulated and derived within the context of large deviation theory. The problems and results treated cover a wide range of physical systems, including equilibrium many-particle systems, noise-perturbed dynamics, nonequilibrium systems, as well as multifractals, disordered systems, and chaotic systems. This review also covers many fundamental aspects of statistical mechanics, such as the derivation of variational principles characterizing equilibrium and nonequilibrium states, the breaking of the Legendre transform for nonconcave entropies, and the characterization of nonequilibrium fluctuations through fluctuation relations.
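Cramér's theorem for coin flips is the simplest instance of the exponential decay described here: P(mean >= a) decays like exp(-n I(a)). The sketch below compares the exact binomial tail exponent with the rate function; the level a = 0.7 and sample size n = 400 are arbitrary illustrative choices:

```python
import math

def rate_bernoulli(a, p=0.5):
    """Cramer rate function I(a) for the sample mean of Bernoulli(p) variables."""
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

def log_tail_exact(n, k0, p=0.5):
    """log P(at least k0 successes in n trials), exactly, via log-sum-exp."""
    log_terms = [math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
                 + k * math.log(p) + (n - k) * math.log(1 - p)
                 for k in range(k0, n + 1)]
    m = max(log_terms)
    return m + math.log(sum(math.exp(t - m) for t in log_terms))

# The decay exponent -log P(mean >= a) / n approaches I(a) as n grows
n, a = 400, 0.7
estimate = -log_tail_exact(n, int(a * n)) / n
assert abs(estimate - rate_bernoulli(a)) < 0.02
```

The residual discrepancy is the subexponential prefactor (of order log(n)/n), which large deviation theory deliberately ignores.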
The Dirac equation in classical statistical mechanics
International Nuclear Information System (INIS)
Ord, G.N.
2002-01-01
The Dirac equation, usually obtained by 'quantizing' a classical stochastic model is here obtained directly within classical statistical mechanics. The special underlying space-time geometry of the random walk replaces the missing analytic continuation, making the model 'self-quantizing'. This provides a new context for the Dirac equation, distinct from its usual context in relativistic quantum mechanics
Emergence of quantum mechanics from classical statistics
International Nuclear Information System (INIS)
Wetterich, C
2009-01-01
The conceptual setting of quantum mechanics is subject to an ongoing debate from its beginnings until now. The consequences of the apparent differences between quantum statistics and classical statistics range from philosophical interpretations to practical issues such as quantum computing. In this note we demonstrate how quantum mechanics can emerge from classical statistical systems. We discuss conditions and circumstances for this to happen. Quantum systems describe isolated subsystems of classical statistical systems with infinitely many states. While infinitely many classical observables 'measure' properties of the subsystem and its environment, the state of the subsystem can be characterized by the expectation values of only a few probabilistic observables. They define a density matrix, and all the usual laws of quantum mechanics follow. No concepts beyond classical statistics are needed for quantum physics - the differences are only apparent and result from the particularities of those classical statistical systems which admit a quantum mechanical description. In particular, we show how the non-commuting properties of quantum operators are associated with the use of conditional probabilities within the classical system, and how a unitary time evolution reflects the isolation of the subsystem.
An introduction to thermodynamics and statistical mechanics
Saxena, A K
2016-01-01
An Introduction to Thermodynamics and Statistical Mechanics aims to serve as a textbook for undergraduate (Hons.) and postgraduate students of physics. The book covers the First Law of Thermodynamics, Entropy and the Second Law of Thermodynamics, Thermodynamic Relations, the Statistical Basis of Thermodynamics, the Microcanonical Ensemble, Classical Statistics and the Canonical Distribution, the Grand Canonical Ensemble, Quantum Statistical Mechanics, Phase Transitions, Fluctuations, and Irreversible Processes and Transport Phenomena (Diffusion). SALIENT FEATURES: offers students a conceptual development of the subject; review questions at the end of chapters. NEW TO THE SECOND EDITION: PVT Surfaces; Real Heat Engines; Van der Waals Models (Qualitative Considerations); Cluster Expansion; Brownian Motion (Einstein's Theory).
Learning Predictive Statistics: Strategies and Brain Mechanisms.
Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe
2017-08-30
When immersed in a new environment, we are challenged to decipher initially incomprehensible streams of sensory information. However, quite rapidly, the brain finds structure and meaning in these incoming signals, helping us to predict and prepare ourselves for future actions. This skill relies on extracting the statistics of event streams in the environment that contain regularities of variable complexity from simple repetitive patterns to complex probabilistic combinations. Here, we test the brain mechanisms that mediate our ability to adapt to the environment's statistics and predict upcoming events. By combining behavioral training and multisession fMRI in human participants (male and female), we track the corticostriatal mechanisms that mediate learning of temporal sequences as they change in structure complexity. We show that learning of predictive structures relates to individual decision strategy; that is, selecting the most probable outcome in a given context (maximizing) versus matching the exact sequence statistics. These strategies engage distinct human brain regions: maximizing engages dorsolateral prefrontal, cingulate, sensory-motor regions, and basal ganglia (dorsal caudate, putamen), whereas matching engages occipitotemporal regions (including the hippocampus) and basal ganglia (ventral caudate). Our findings provide evidence for distinct corticostriatal mechanisms that facilitate our ability to extract behaviorally relevant statistics to make predictions. SIGNIFICANCE STATEMENT Making predictions about future events relies on interpreting streams of information that may initially appear incomprehensible. Past work has studied how humans identify repetitive patterns and associative pairings. However, the natural environment contains regularities that vary in complexity from simple repetition to complex probabilistic combinations. Here, we combine behavior and multisession fMRI to track the brain mechanisms that mediate our ability to adapt to
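The accuracy gap between the two decision strategies follows from elementary probability: maximizing earns accuracy p, while matching earns only p² + (1-p)². A simulation sketch with an invented stream statistic p = 0.8:

```python
import random

rng = random.Random(42)
p = 0.8                                   # P(outcome == 1) in the event stream
stream = [1 if rng.random() < p else 0 for _ in range(100000)]

# Maximizing: always predict the most probable outcome -> expected accuracy p
acc_max = sum(s == 1 for s in stream) / len(stream)

# Matching: reproduce the outcome statistics -> expected accuracy p^2 + (1-p)^2
guesses = [1 if rng.random() < p else 0 for _ in stream]
acc_match = sum(g == s for g, s in zip(guesses, stream)) / len(stream)

assert abs(acc_max - 0.8) < 0.01
assert abs(acc_match - 0.68) < 0.01       # matching is reliably less accurate
assert acc_max > acc_match
```

That matching is nonetheless a common human strategy is part of what makes the distinct neural signatures reported here interesting.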
Statistical mechanics and applications in condensed matter
Di Castro, Carlo
2015-01-01
This innovative and modular textbook combines classical topics in thermodynamics, statistical mechanics and many-body theory with the latest developments in condensed matter physics research. Written by internationally renowned experts and logically structured to cater for undergraduate and postgraduate students and researchers, it covers the underlying theoretical principles and includes numerous problems and worked examples to put this knowledge into practice. Three main streams provide a framework for the book; beginning with thermodynamics and classical statistical mechanics, including mean field approximation, fluctuations and the renormalization group approach to critical phenomena. The authors then examine quantum statistical mechanics, covering key topics such as normal Fermi and Luttinger liquids, superfluidity and superconductivity. Finally, they explore classical and quantum kinetics, Anderson localization and quantum interference, and disordered Fermi liquids. Unique in providing a bridge between ...
Gross, Kevin; Rosenheim, Jay A
2011-10-01
Secondary pest outbreaks occur when the use of a pesticide to reduce densities of an unwanted target pest species triggers subsequent outbreaks of other pest species. Although secondary pest outbreaks are thought to be familiar in agriculture, their rigorous documentation is made difficult by the challenges of performing randomized experiments at suitable scales. Here, we quantify the frequency and monetary cost of secondary pest outbreaks elicited by early-season applications of broad-spectrum insecticides to control the plant bug Lygus spp. (primarily L. hesperus) in cotton grown in the San Joaquin Valley, California, USA. We do so by analyzing pest-control management practices for 969 cotton fields spanning nine years and 11 private ranches. Our analysis uses statistical methods to draw formal causal inferences from nonexperimental data that have become popular in public health and economics, but that are not yet widely known in ecology or agriculture. We find that, in fields that received an early-season broad-spectrum insecticide treatment for Lygus, 20.2% +/- 4.4% (mean +/- SE) of late-season pesticide costs were attributable to secondary pest outbreaks elicited by the early-season insecticide application for Lygus. In 2010 U.S. dollars, this equates to an additional $6.00 +/- $1.30 (mean +/- SE) per acre in management costs. To the extent that secondary pest outbreaks may be driven by eliminating pests' natural enemies, these figures place a lower bound on the monetary value of ecosystem services provided by native communities of arthropod predators and parasitoids in this agricultural system.
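The style of causal inference from nonexperimental data that the authors borrow from economics can be caricatured with a regression-adjustment sketch. The confounding structure and all numbers below are invented for illustration (the true effect is set to $6/acre only to echo the paper's estimate); the paper's actual methods are more elaborate:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 20000
# Hypothetical confounder: early-season pest pressure drives both the decision
# to spray and, independently, late-season management costs.
pressure = rng.normal(0, 1, n)
treated = (pressure + rng.normal(0, 1, n) > 0).astype(float)
cost = 30.0 + 5.0 * pressure + 6.0 * treated + rng.normal(0, 5, n)

# Naive treated-vs-untreated contrast is confounded (biased upward) ...
naive = cost[treated == 1].mean() - cost[treated == 0].mean()

# ... while regression adjustment on the confounder recovers the true $6 effect
X = np.column_stack([np.ones(n), treated, pressure])
effect = np.linalg.lstsq(X, cost, rcond=None)[0][1]

assert naive > 8.0                  # confounding inflates the naive contrast
assert abs(effect - 6.0) < 0.5
```

With real observational data the hard part is arguing that all confounders are measured, which is why the authors lean on the formal causal-inference machinery developed in public health and economics.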
Causal evidence in risk and policy perceptions: Applying the covariation/mechanism framework.
Baucum, Matt; John, Richard
2018-05-01
Today's information-rich society demands constant evaluation of cause-effect relationships; behaviors and attitudes ranging from medical choices to voting decisions to policy preferences typically entail some form of causal inference ("Will this policy reduce crime?", "Will this activity improve my health?"). Cause-effect relationships such as these can be thought of as depending on two qualitatively distinct forms of evidence: covariation-based evidence (e.g., "states with this policy have fewer homicides") or mechanism-based (e.g., "this policy will reduce crime by discouraging repeat offenses"). Some psychological work has examined how people process these two forms of causal evidence in instances of "everyday" causality (e.g., assessing why a car will not start), but it is not known how these two forms of evidence contribute to causal judgments in matters of public risk or policy. Three studies (n = 715) investigated whether judgments of risk and policy scenarios would be affected by covariation and mechanism evidence and whether the evidence types interacted with one another (as suggested by past studies). Results showed that causal judgments varied linearly with mechanism strength and logarithmically with covariation strength, and that the evidence types produced only additive effects (but no interaction). We discuss the results' implications for risk communication and policy information dissemination. Copyright © 2018 Elsevier B.V. All rights reserved.
Fluctuations of physical values in statistical mechanics
International Nuclear Information System (INIS)
Zaripov, R.G.
1999-01-01
The new matrix inequalities for the boundary of measurement accuracy of physical values in the ensemble of quantum systems were obtained. The accuracy of multidimensional thermodynamical parameter measurement is estimated. The matrix inequalities obtained are quantum analogs of the Cramér-Rao information inequalities in mathematical statistics. The quantity of information in quantum mechanical measurement, connected with the boundaries of jointly measurable values in one macroscopic experiment, was determined. The lower boundary of the variance of estimation of a multidimensional quantum mechanical parameter was found. (author)
The fuzzy cube and causal efficacy: representation of concomitant mechanisms in stroke.
Jobe, Thomas H.; Helgason, Cathy M.
1998-04-01
Twentieth century medical science has embraced nineteenth century Boolean probability theory based upon two-valued Aristotelian logic. With the later addition of bit-based, von Neumann structured computational architectures, an epistemology based on randomness has led to a bivalent epidemiological methodology that dominates medical decision making. In contrast, fuzzy logic, based on twentieth century multi-valued logic, and computational structures that are content addressed and adaptively modified, has advanced a new scientific paradigm for the twenty-first century. Diseases such as stroke involve multiple concomitant causal factors that are difficult to represent using conventional statistical methods. We tested which paradigm better represented this complex, multi-causal clinical phenomenon: stroke. We show that the fuzzy logic paradigm better represented clinical complexity in cerebrovascular disease than current probability theory based methodology. We believe this finding is generalizable to all of clinical science, since multiple concomitant causal factors are involved in nearly all known pathological processes.
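The contrast between bivalent and fuzzy representations of concomitant causes can be made concrete with Zadeh's min/max operators. The risk factors and membership degrees below are invented for illustration:

```python
def fuzzy_and(*degrees):
    """Zadeh t-norm: the joint degree of concomitant factors is their minimum."""
    return min(degrees)

def fuzzy_or(*degrees):
    """Zadeh t-conorm: the degree of 'any factor present' is the maximum."""
    return max(degrees)

# Hypothetical stroke patient: each causal factor is present to a degree in [0, 1]
hypertension, diabetes, stenosis = 0.9, 0.4, 0.7

# Crisp (bivalent) logic forces each factor to 0 or 1 and loses the grading
crisp_all_present = all(d >= 0.5 for d in (hypertension, diabetes, stenosis))
assert crisp_all_present is False        # diabetes at 0.4 rounds down to "absent"

# The fuzzy representation keeps every factor's partial contribution
assert fuzzy_and(hypertension, diabetes, stenosis) == 0.4
assert fuzzy_or(hypertension, diabetes, stenosis) == 0.9
```

The bivalent encoding discards the 0.4 diabetes grading entirely, which is the kind of information loss the authors argue against.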
Causal mechanisms of masked hypertension: socio-psychological aspects.
Ogedegbe, Gbenga
2010-04-01
The contribution of Dr Thomas Pickering's work to the measurement of blood pressure (BP) is the defining aspect of his academic career and achievement, narrowly defined. In this regard, two important areas characterized his work as it relates to masked hypertension. First, he introduced the term masked hypertension to replace the rather inappropriate terms 'reverse white-coat hypertension' and 'white-coat normotension', thus drawing attention to the fact that these patients are genuinely hypertensive by ambulatory BP but are missed by normal office BP. More importantly, he rightly maintained that masked hypertension is a true continuum of sustained hypertension rather than an aberrant measurement artifact. Second is his pivotal work on the important role of psychosocial factors as a potential mechanism for the development of masked hypertension. In this regard, he explained masked hypertension as a conditioned response to anxiety in office settings, and highlighted the role that diagnostic labeling plays in its development. His view of masked hypertension is that of a continuum from prehypertension (based on office BP measurement) to masked hypertension (based on ambulatory BP) and finally to sustained hypertension (based on both office and ambulatory BP). He strongly believed that it is prehypertensive patients who progress to masked hypertension. Subsequently, patients who are prehypertensive should be screened for masked hypertension and treated. In this manuscript, we summarize his work as it relates to the definition of masked hypertension, its psychosocial characteristics, mechanisms and clinical relevance.
(AJST) Statistical mechanics model for orientational motion of two-dimensional rigid rotator
African Journals Online (AJOL)
Science and Engineering Series Vol. 6, No. 2, pp. 94-101. Statistical Mechanics Model for Orientational Motion of Two-Dimensional Rigid Rotator. Malo, J.O. ... there is no translational motion and that they are well separated so ... constant and I is the moment of inertia of a linear rotator. Thus, the ...
Statistical-mechanical formulation of Lyapunov exponents
International Nuclear Information System (INIS)
Tanase-Nicola, Sorin; Kurchan, Jorge
2003-01-01
We show how the Lyapunov exponents of a dynamical system can, in general, be expressed in terms of the free energy of a (non-Hermitian) quantum many-body problem. This casts their study as a problem of statistical mechanics, whose intuitive concepts and approximation techniques can hence be borrowed.
Multiparticle quantum mechanics obeying fractional statistics
International Nuclear Information System (INIS)
Wu, Y.
1984-01-01
We obtain the rule governing many-body wave functions for particles obeying fractional statistics in two (space) dimensions. It generalizes and continuously interpolates the usual symmetrization and antisymmetrization. Quantum mechanics of more than two particles is discussed and some new features are found
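The continuous interpolation described here has a compact statement for two particles; as a hedged sketch (the notation is illustrative, not quoted from the paper), exchanging two identical particles in two space dimensions multiplies the wave function by a phase:

```latex
\psi(\mathbf{r}_2, \mathbf{r}_1) = e^{i\theta}\,\psi(\mathbf{r}_1, \mathbf{r}_2),
\qquad \theta = 0 \;\text{(bosons)}, \qquad \theta = \pi \;\text{(fermions)},
```

with intermediate values 0 < θ < π describing anyons, which interpolate continuously between the usual symmetrization and antisymmetrization.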
Statistical mechanics and the foundations of thermodynamics
International Nuclear Information System (INIS)
Loef, A.M.
1979-01-01
An introduction to classical statistical mechanics and its relation to thermodynamics is presented. Emphasis is put on getting a detailed and logical presentation of the foundations of thermodynamics based on the maximum entropy principles which govern the values taken by macroscopic variables according to the laws of large numbers
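The maximum entropy principle invoked in this abstract can be made concrete with a toy computation (a hedged sketch; the four-level spectrum and the target mean energy are invented for illustration): maximizing the entropy at fixed mean energy yields Boltzmann weights p_i ∝ exp(-βE_i), with the Lagrange multiplier β fixed by the constraint, here solved by bisection.

```python
import math

def boltzmann(energies, beta):
    # Maximum-entropy distribution at fixed mean energy: p_i ∝ exp(-beta * E_i)
    w = [math.exp(-beta * e) for e in energies]
    z = sum(w)                      # partition function
    return [x / z for x in w]

def mean_energy(energies, beta):
    return sum(p * e for p, e in zip(boltzmann(energies, beta), energies))

def solve_beta(energies, target, lo=-50.0, hi=50.0):
    # mean energy decreases monotonically with beta, so bisection works
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_energy(energies, mid) > target:
            lo = mid                # beta too small: mean energy still too high
        else:
            hi = mid
    return 0.5 * (lo + hi)

energies = [0.0, 1.0, 2.0, 3.0]    # invented four-level spectrum
beta = solve_beta(energies, target=1.0)
p = boltzmann(energies, beta)
```

Since the target mean lies below the uniform average of the spectrum, the resulting β is positive and the weights decrease with energy, as expected.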
Statistical mechanics of systems of unbounded spins
Energy Technology Data Exchange (ETDEWEB)
Lebowitz, J L [Yeshiva Univ., New York (USA). Belfer Graduate School of Science; Presutti, E [L' Aquila Univ. (Italy). Istituto di Matematica
1976-11-01
We develop the statistical mechanics of unbounded n-component spin systems interacting via potentials which are superstable and strongly tempered. The uniqueness of the equilibrium state is then proven for one component ferromagnetic spins whose free energy is differentiable with respect to the magnetic field.
Stability and equilibrium in quantum statistical mechanics
International Nuclear Information System (INIS)
Kastler, Daniel.
1975-01-01
A derivation of the Gibbs ansatz, the basis of equilibrium statistical mechanics, is provided from a stability requirement, in technical connection with the harmonic analysis of non-commutative dynamical systems. By the same token, a relation is established between stability and the positivity of the Hamiltonian in the zero-temperature case [fr]
Causal mechanisms of subjective cognitive dysfunction in schizophrenic and depressed patients
van den Bosch, RJ; Rombouts, RP
We examined causal mechanisms of subjective cognitive (dis)abilities in schizophrenic and depressed patients, and in patient and normal control groups. This exploratory study included objective cognitive performance (Continuous Performance Task) as well as mood and mental effort ratings. Self-report
Directory of Open Access Journals (Sweden)
Stephen R Palmquist
2013-05-01
Full Text Available Quantum indeterminism seems incompatible with Kant’s defense of causality in his Second Analogy. The Copenhagen interpretation also takes quantum theory as evidence for anti-realism. This first article of a two-part series argues that the law of causality, as transcendental, applies only to the world as observable, not to hypothetical (unobservable) objects such as quarks, detectable only by high-energy accelerators. Taking Planck’s constant and the speed of light as the lower and upper bounds of observability provides a way of interpreting the observables of quantum mechanics as empirically real even though they are transcendentally (i.e., pre-observationally) ideal.
Bayesian approach to inverse statistical mechanics
Habeck, Michael
2014-05-01
Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.
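The Bayesian viewpoint described here can be illustrated at toy scale (a hedged sketch, not the paper's sequential Monte Carlo scheme; the three-level system, the "observed" samples, and the grid are all invented). For a tiny discrete system the partition function is tractable, so the posterior over an inverse temperature β can be computed directly on a grid under a flat prior:

```python
import math

ENERGIES = [0.0, 1.0, 2.0]            # invented three-level system

def log_likelihood(beta, samples):
    # log p(samples | beta) for a Boltzmann distribution on ENERGIES;
    # the partition function of this tiny system is computed exactly
    log_z = math.log(sum(math.exp(-beta * e) for e in ENERGIES))
    return sum(-beta * e - log_z for e in samples)

def posterior_on_grid(samples, grid):
    # flat prior on the grid, so the posterior is the normalized likelihood
    logs = [log_likelihood(b, samples) for b in grid]
    m = max(logs)
    w = [math.exp(x - m) for x in logs]
    z = sum(w)
    return [x / z for x in w]

samples = [0.0] * 6 + [1.0] * 3 + [2.0]         # pretend energy observations
grid = [0.01 * k for k in range(301)]           # beta in [0, 3]
post = posterior_on_grid(samples, grid)
beta_map = grid[post.index(max(post))]
```

For an intractable partition function, as in the paper, the normalizer would itself have to be estimated alongside the interactions; here it is exact by construction.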
Applying Statistical Mechanics to pixel detectors
International Nuclear Information System (INIS)
Pindo, Massimiliano
2002-01-01
Pixel detectors, being made of a large number of active cells of the same kind, can be considered as significant sets to which Statistical Mechanics variables and methods can be applied. By properly redefining well known statistical parameters in order to let them match the ones that actually characterize pixel detectors, an analysis of the way they work can be performed in a totally new perspective. A deeper understanding of pixel detectors is attained, helping in the evaluation and comparison of their intrinsic characteristics and performance
Nonextensive statistical mechanics of ionic solutions
International Nuclear Information System (INIS)
Varela, L.M.; Carrete, J.; Munoz-Sola, R.; Rodriguez, J.R.; Gallego, J.
2007-01-01
Classical mean-field Poisson-Boltzmann theory of ionic solutions is revisited in the theoretical framework of nonextensive Tsallis statistics. The nonextensive equivalent of the Poisson-Boltzmann equation is formulated by revisiting the statistical mechanics of liquids, and the Debye-Hueckel framework is shown to be valid for highly diluted solutions even under circumstances where nonextensive thermostatistics must be applied. The lowest-order corrections associated with nonadditive effects are identified for both symmetric and asymmetric electrolytes, and the behavior of the average electrostatic potential in a homogeneous system is analytically and numerically analyzed for various values of the nonextensive complexity parameter q.
An introduction to statistical mechanics and thermodynamics
Swendsen, Robert H
2012-01-01
This text presents the two complementary aspects of thermal physics as an integrated theory of the properties of matter. Conceptual understanding is promoted by thorough development of basic concepts. In contrast to many texts, statistical mechanics, including discussion of the required probability theory, is presented first. This provides a statistical foundation for the concept of entropy, which is central to thermal physics. A unique feature of the book is the development of entropy based on Boltzmann's 1877 definition; this avoids contradictions or ad hoc corrections found in other texts.
Quantum mechanics as a natural generalization of classical statistical mechanics
International Nuclear Information System (INIS)
Xu Laizi; Qian Shangwu
1994-01-01
By comparison between the equations of motion of geometrical optics (GO) and those of classical statistical mechanics (CSM), it is found that there should be an analogy between GO and CSM rather than between GO and classical mechanics (CM). Furthermore, by comparison between the classical limit (CL) of quantum mechanics (QM) and CSM, the authors find that the CL of QM is CSM, not CM; hence they demonstrate that QM is a natural generalization of CSM rather than of CM.
Analogies between classical statistical mechanics and quantum mechanics
International Nuclear Information System (INIS)
Uehara, M.
1986-01-01
Some analogies between nonequilibrium classical statistical mechanics and quantum mechanics, at the level of the Liouville equation and at the kinetic level, are commented on. A theorem, related to the Vlasov equation applied to a plasma, is proved. The theorem presents an analogy with Ehrenfest's theorem of quantum mechanics. An analogy between the plasma kinetic theory and Bohm's quantum theory with 'hidden variables' is also shown. (Author) [pt
Cellular automata and statistical mechanical models
International Nuclear Information System (INIS)
Rujan, P.
1987-01-01
The authors elaborate on the analogy between the transfer matrix of usual lattice models and the master equation describing the time development of cellular automata. Transient and stationary properties of probabilistic automata are linked to surface and bulk properties, respectively, of restricted statistical mechanical systems. It is demonstrated that methods of statistical physics can be successfully used to describe the dynamic and the stationary behavior of such automata. Some exact results are derived, including duality transformations, exact mappings, disorder, and linear solutions. Many examples are worked out in detail to demonstrate how to use statistical physics in order to construct cellular automata with desired properties. This approach is considered to be a first step toward the design of fully parallel, probabilistic systems whose computational abilities rely on the cooperative behavior of their components
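The transfer-matrix and master-equation analogy described here can be made concrete at toy scale (a hedged sketch; the local rule, ring size, and iteration count are invented). For a ring of four cells updated in parallel by a probabilistic rule, the master equation is just a 16 x 16 stochastic matrix, and power iteration toward its leading eigenvector gives the stationary state:

```python
import itertools

N = 4                                 # ring of four cells; 2^4 = 16 configurations

def p_one(left, c, right):
    # invented probabilistic local rule: base noise plus neighbourhood bias
    return 0.1 + 0.8 * (left + c + right) / 3.0

def transition(old, new):
    # cells update independently in parallel, so the master-equation matrix
    # element is a product of per-cell update probabilities
    prob = 1.0
    for i in range(N):
        p1 = p_one(old[i - 1], old[i], old[(i + 1) % N])
        prob *= p1 if new[i] else 1.0 - p1
    return prob

states = list(itertools.product((0, 1), repeat=N))
T = [[transition(s, t) for s in states] for t in states]

# power iteration toward the stationary distribution (leading eigenvector)
pi = [1.0 / len(states)] * len(states)
for _ in range(500):
    pi = [sum(T[t][s] * pi[s] for s in range(len(states))) for t in range(len(states))]

density = sum(sum(state) * w for state, w in zip(states, pi)) / N
```

The stationary density comes out near the mean-field fixed point d = 0.1 + 0.8 d, i.e. around one half, with small corrections from spatial correlations.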
Mathematical methods in quantum and statistical mechanics
International Nuclear Information System (INIS)
Fishman, L.
1977-01-01
The mathematical structure and closed-form solutions pertaining to several physical problems in quantum and statistical mechanics are examined in some detail. The J-matrix method, introduced previously for s-wave scattering and based upon well-established Hilbert space theory and related generalized integral transformation techniques, is extended to treat the l-th partial wave kinetic energy and Coulomb Hamiltonians within the context of square-integrable (L²), Laguerre (Slater), and oscillator (Gaussian) basis sets. The theory of relaxation in statistical mechanics is examined within the context of the theory of linear integro-differential equations of the Master Equation type and their corresponding Markov processes. Several topics of a mathematical nature concerning various computational aspects of the L² approach to quantum scattering theory are discussed.
Equilibrium statistical mechanics of lattice models
Lavis, David A
2015-01-01
Most interesting and difficult problems in equilibrium statistical mechanics concern models which exhibit phase transitions. For graduate students and more experienced researchers this book provides an invaluable reference source of approximate and exact solutions for a comprehensive range of such models. Part I contains background material on classical thermodynamics and statistical mechanics, together with a classification and survey of lattice models. The geometry of phase transitions is described and scaling theory is used to introduce critical exponents and scaling laws. An introduction is given to finite-size scaling, conformal invariance and Schramm-Loewner evolution. Part II contains accounts of classical mean-field methods. The parallels between Landau expansions and catastrophe theory are discussed and Ginzburg-Landau theory is introduced. The extension of mean-field theory to higher orders is explored using the Kikuchi-Hijmans-De Boer hierarchy of approximations. In Part III the use of alge...
Statistical mechanics and the physics of fluids
Tosi, Mario
This volume collects the lecture notes of a course on statistical mechanics, held at Scuola Normale Superiore di Pisa for third-to-fifth year students in physics and chemistry. Three main themes are covered in the book. The first part gives a compact presentation of the foundations of statistical mechanics and their connections with thermodynamics. Applications to ideal gases of material particles and of excitation quanta are followed by a brief introduction to a real classical gas and to a weakly coupled classical plasma, and by a broad overview on the three states of matter. The second part is devoted to fluctuations around equilibrium and their correlations. Coverage of liquid structure and critical phenomena is followed by a discussion of irreversible processes as exemplified by diffusive motions and by the dynamics of density and heat fluctuations. Finally, the third part is an introduction to some advanced themes: supercooling and the glassy state, non-Newtonian fluids including polymers and liquid cryst...
Nonextensive statistical mechanics and high energy physics
Directory of Open Access Journals (Sweden)
Tsallis Constantino
2014-04-01
Full Text Available The use of the celebrated Boltzmann-Gibbs entropy and statistical mechanics is justified for ergodic-like systems. In contrast, complex systems typically require more powerful theories. We will provide a brief introduction to nonadditive entropies (characterized by indices like q which, in the q → 1 limit, recover the standard Boltzmann-Gibbs entropy) and the associated nonextensive statistical mechanics. We then present some recent applications to systems such as high-energy collisions, black holes and others. In addition to that, we clarify and illustrate the neat distinction that exists between Lévy distributions and q-exponential ones, a point which occasionally causes some confusion in the literature, very particularly in the LHC literature.
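The q → 1 limit mentioned in this abstract is easy to check numerically. A minimal sketch (the probability distribution is invented for illustration): the Tsallis entropy S_q = (1 - Σ p_i^q)/(q - 1) approaches the Boltzmann-Gibbs entropy -Σ p_i ln p_i as q → 1.

```python
import math

def tsallis_entropy(p, q):
    # S_q = (1 - sum_i p_i^q) / (q - 1); nonadditive for q != 1
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def bg_entropy(p):
    # Boltzmann-Gibbs (Shannon) entropy, recovered from S_q as q -> 1
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]       # invented illustrative distribution
print(bg_entropy(p))                 # exactly 1.75 * ln 2 ≈ 1.2130
print(tsallis_entropy(p, 1.001))     # already close to the q -> 1 limit
```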
Zeno dynamics in quantum statistical mechanics
International Nuclear Information System (INIS)
Schmidt, Andreas U
2003-01-01
We study the quantum Zeno effect in quantum statistical mechanics within the operator algebraic framework. We formulate a condition for the appearance of the effect in W*-dynamical systems, in terms of the short-time behaviour of the dynamics. Examples of quantum spin systems show that this condition can be effectively applied to quantum statistical mechanical models. Furthermore, we derive an explicit form of the Zeno generator, and use it to construct Gibbs equilibrium states for the Zeno dynamics. As a concrete example, we consider the X-Y model, for which we show that a frequent measurement at a microscopic level, e.g. a single lattice site, can produce a macroscopic effect in changing the global equilibrium
Introductory statistical mechanics for electron storage rings
International Nuclear Information System (INIS)
Jowett, J.M.
1986-07-01
These lectures introduce the beam dynamics of electron-positron storage rings with particular emphasis on the effects due to synchrotron radiation. They differ from most other introductions in their systematic use of the physical principles and mathematical techniques of the non-equilibrium statistical mechanics of fluctuating dynamical systems. A self-contained exposition of the necessary topics from this field is included. Throughout the development, a Hamiltonian description of the effects of the externally applied fields is maintained in order to preserve the links with other lectures on beam dynamics and to show clearly the extent to which electron dynamics is non-Hamiltonian. The statistical mechanical framework is extended to a discussion of the conceptual foundations of the treatment of collective effects through the Vlasov equation
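The fluctuating-dynamics framework referred to here can be illustrated with the simplest possible model (a hedged toy; the damping factor and noise variance are invented, not storage-ring formulas): a linearly damped variable kicked once per turn by noise of variance q. Its variance obeys a deterministic recursion and relaxes to a finite equilibrium value, the discrete analogue of the balance between damping and stochastic excitation.

```python
# Discrete damping-plus-noise toy model (illustrative values, not ring parameters):
# each turn, x -> a*x + kick, with kick variance q, so the variance obeys
# var -> a^2 * var + q and relaxes to the equilibrium value q / (1 - a^2).
a, q = 0.99, 1e-4

var = 0.0
for turn in range(5000):
    var = a * a * var + q

equilibrium = q / (1 - a * a)        # fixed point of the variance recursion
```

After a few damping times the iterated variance is indistinguishable from the fixed point.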
Statistical algebraic approach to quantum mechanics
International Nuclear Information System (INIS)
Slavnov, D.A.
2001-01-01
A scheme for constructing quantum theory using the statistical algebraic approach is proposed. The primary constituents are elements of a noncommutative algebra (the observables) and nonlinear functionals on this algebra (the physical states), the latter being associated with the results of single measurements. Certain groups of physical states are proposed as the quantum states of standard quantum mechanics. It is shown that the mathematical apparatus of standard quantum mechanics can be reproduced in full within such a scheme [ru]
Exactly soluble problems in statistical mechanics
International Nuclear Information System (INIS)
Yang, C.N.
1983-01-01
In the last few years, a number of two-dimensional classical and one-dimensional quantum mechanical problems in statistical mechanics have been exactly solved. Although these problems range over models of diverse physical interest, their solutions were obtained using very similar mathematical methods. In these lectures, the main points of the methods are discussed. In this introductory lecture, an overall survey of all these problems without going into the detailed method of solution is given. In later lectures, they shall concentrate on one particular problem: the delta function interaction in one dimension, and go into the details of that problem
Classical statistical mechanics approach to multipartite entanglement
Energy Technology Data Exchange (ETDEWEB)
Facchi, P [Dipartimento di Matematica, Universita di Bari, I-70125 Bari (Italy); Florio, G; Pascazio, S [Istituto Nazionale di Fisica Nucleare, Sezione di Bari, I-70126 Bari (Italy); Marzolino, U [Dipartimento di Fisica, Universita di Trieste, and Istituto Nazionale di Fisica Nucleare, Sezione di Trieste, I-34014 Trieste (Italy); Parisi, G [Dipartimento di Fisica, Universita di Roma ' La Sapienza' , Piazzale Aldo Moro 2, Centre for Statistical Mechanics and Complexity (SMC), CNR-INFM (Italy)
2010-06-04
We characterize the multipartite entanglement of a system of n qubits in terms of the distribution function of the bipartite purity over balanced bipartitions. We search for maximally multipartite entangled states, whose average purity is minimal, and recast this optimization problem into a problem of statistical mechanics, by introducing a cost function, a fictitious temperature and a partition function. By investigating the high-temperature expansion, we obtain the first three moments of the distribution. We find that the problem exhibits frustration.
Classical statistical mechanics approach to multipartite entanglement
Facchi, P.; Florio, G.; Marzolino, U.; Parisi, G.; Pascazio, S.
2010-06-01
We characterize the multipartite entanglement of a system of n qubits in terms of the distribution function of the bipartite purity over balanced bipartitions. We search for maximally multipartite entangled states, whose average purity is minimal, and recast this optimization problem into a problem of statistical mechanics, by introducing a cost function, a fictitious temperature and a partition function. By investigating the high-temperature expansion, we obtain the first three moments of the distribution. We find that the problem exhibits frustration.
Classical statistical mechanics approach to multipartite entanglement
International Nuclear Information System (INIS)
Facchi, P; Florio, G; Pascazio, S; Marzolino, U; Parisi, G
2010-01-01
We characterize the multipartite entanglement of a system of n qubits in terms of the distribution function of the bipartite purity over balanced bipartitions. We search for maximally multipartite entangled states, whose average purity is minimal, and recast this optimization problem into a problem of statistical mechanics, by introducing a cost function, a fictitious temperature and a partition function. By investigating the high-temperature expansion, we obtain the first three moments of the distribution. We find that the problem exhibits frustration.
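The purity-based cost function described in the records above can be checked on small examples (a hedged numpy illustration, not the authors' code; the test states are standard). For an n-qubit pure state, the purity of a bipartition is Tr ρ_A², obtainable from the singular values of the amplitude vector reshaped into a matrix; a product state gives purity 1 on every cut, while a GHZ state gives 1/2.

```python
import itertools
import numpy as np

def bipartite_purity(state, n, subset):
    # Purity Tr(rho_A^2) of the reduced state on the qubits in `subset`:
    # reshape the amplitude vector into a (subset) x (complement) matrix;
    # the purity is the sum of fourth powers of its singular values.
    perm = list(subset) + [q for q in range(n) if q not in subset]
    m = state.reshape([2] * n).transpose(perm).reshape(2 ** len(subset), -1)
    s = np.linalg.svd(m, compute_uv=False)
    return float(np.sum(s ** 4))

def average_balanced_purity(state, n):
    # the cost function: purity averaged over all balanced bipartitions
    cuts = list(itertools.combinations(range(n), n // 2))
    return sum(bipartite_purity(state, n, c) for c in cuts) / len(cuts)

n = 4
product = np.zeros(2 ** n); product[0] = 1.0            # |0000>: purity 1 on every cut
ghz = np.zeros(2 ** n); ghz[0] = ghz[-1] = 2 ** -0.5    # GHZ: purity 1/2 on every cut
```

Minimizing this average purity over states is the optimization that the papers recast as a statistical mechanics problem with a fictitious temperature.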
Statistical Mechanics and Black Hole Thermodynamics
Carlip, Steven
1997-01-01
Black holes are thermodynamic objects, but despite recent progress, the ultimate statistical mechanical origin of black hole temperature and entropy remains mysterious. Here I summarize an approach in which the entropy is viewed as arising from ``would-be pure gauge'' degrees of freedom that become dynamical at the horizon. For the (2+1)-dimensional black hole, these degrees of freedom can be counted, and yield the correct Bekenstein-Hawking entropy; the corresponding problem in 3+1 dimension...
Statistical mechanics of budget-constrained auctions
Altarelli, F.; Braunstein, A.; Realpe-Gomez, J.; Zecchina, R.
2009-01-01
Finding the optimal assignment in budget-constrained auctions is a combinatorial optimization problem with many important applications, a notable example being the sale of advertisement space by search engines (in this context the problem is often referred to as the off-line AdWords problem). Based on the cavity method of statistical mechanics, we introduce a message passing algorithm that is capable of solving efficiently random instances of the problem extracted from a natural distribution,...
The statistical mechanics of learning a rule
International Nuclear Information System (INIS)
Watkin, T.L.H.; Rau, A.; Biehl, M.
1993-01-01
A summary is presented of the statistical mechanical theory of learning a rule with a neural network, a rapidly advancing area which is closely related to other inverse problems frequently encountered by physicists. By emphasizing the relationship between neural networks and strongly interacting physical systems, such as spin glasses, the authors show how learning theory has provided a workshop in which to develop new, exact analytical techniques
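The teacher-student setting behind this theory can be sketched in a few lines (a hedged toy: Hebbian learning of a random teacher perceptron; the sizes and seed are arbitrary). The generalization error is the probability that student and teacher disagree on a fresh random input, which for perceptrons is arccos(R)/π in terms of the weight overlap R.

```python
import math
import random

random.seed(0)
N, P = 200, 400                       # input dimension and number of examples (arbitrary)

teacher = [random.gauss(0, 1) for _ in range(N)]   # random teacher perceptron

def label(w, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else -1

# Hebbian student: accumulate (teacher label) * input over the training set
student = [0.0] * N
for _ in range(P):
    x = [random.choice((-1, 1)) for _ in range(N)]
    t = label(teacher, x)
    student = [s + t * xi for s, xi in zip(student, x)]

dot = sum(s * w for s, w in zip(student, teacher))
R = dot / math.sqrt(sum(s * s for s in student) * sum(w * w for w in teacher))
eps = math.acos(R) / math.pi          # perceptron generalization error for overlap R
```

With P/N = 2 the overlap is already substantial, so the student generalizes much better than chance.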
Statistical mechanics of the majority game
International Nuclear Information System (INIS)
Kozlowski, P; Marsili, M
2003-01-01
The majority game, modelling a system of heterogeneous agents trying to behave in a similar way, is introduced and studied using methods of statistical mechanics. The stationary states of the game are given by the (local) minima of a particular Hopfield-like Hamiltonian. On the basis of replica symmetric calculations, we draw the phase diagram, which contains the analogue of a retrieval phase. The number of metastable states is estimated using the annealed approximation. The results are confronted with extensive numerical simulations
Metastability in Field Theory and Statistical Mechanics
International Nuclear Information System (INIS)
Carvalho, C.A. de.
1984-01-01
After an analysis of the phase transitions that can occur in the framework of a scalar field theory at finite temperature and in the presence of an external field, possible metastable situations are studied, as well as their relationship with the transitions. In both cases a semiclassical approximation to the theory is used which, in Statistical Mechanics, corresponds to the droplet-bubble model. (L.C.) [pt]
Statistical mechanics for a class of quantum statistics
International Nuclear Information System (INIS)
Isakov, S.B.
1994-01-01
Generalized statistical distributions for identical particles are introduced for the case where filling a single-particle quantum state by particles depends on filling states of different momenta. The system of one-dimensional bosons with a two-body potential that can be solved by means of the thermodynamic Bethe ansatz is shown to be equivalent thermodynamically to a system of free particles obeying statistical distributions of the above class. The quantum statistics arising in this way are completely determined by the two-particle scattering phases of the corresponding interacting systems. An equation determining the statistical distributions for these statistics is derived
Directory of Open Access Journals (Sweden)
Moura LMVR
2016-12-01
Full Text Available Lidia MVR Moura,1,2 M Brandon Westover,1,2 David Kwasnik,1 Andrew J Cole,1,2 John Hsu3–5 1Massachusetts General Hospital, Department of Neurology, Epilepsy Service, Boston, MA, USA; 2Harvard Medical School, Boston, MA, USA; 3Massachusetts General Hospital, Mongan Institute, Boston, MA, USA; 4Harvard Medical School, Department of Medicine, Boston, MA, USA; 5Harvard Medical School, Department of Health Care Policy, Boston, MA, USA Abstract: The elderly population faces an increasing number of cases of chronic neurological conditions, such as epilepsy and Alzheimer’s disease. Because the elderly with epilepsy are commonly excluded from randomized controlled clinical trials, there are few rigorous studies to guide clinical practice. When the elderly are eligible for trials, they either rarely participate or frequently have poor adherence to therapy, thus limiting both generalizability and validity. In contrast, large observational data sets are increasingly available, but are susceptible to bias when using common analytic approaches. Recent developments in causal inference-analytic approaches also introduce the possibility of emulating randomized controlled trials to yield valid estimates. We provide a practical example of the application of the principles of causal inference to a large observational data set of patients with epilepsy. This review also provides a framework for comparative-effectiveness research in chronic neurological conditions. Keywords: epilepsy, epidemiology, neurostatistics, causal inference
Reininghaus, Ulrich; Depp, Colin A; Myin-Germeys, Inez
2016-03-01
Integrated models of psychotic disorders have posited a number of putative psychological mechanisms that may contribute to the development of psychotic symptoms, but it is only recently that a modest amount of experience sampling research has provided evidence on their role in daily life, outside the research laboratory. A number of methodological challenges remain in evaluating specificity of potential causal links between a given psychological mechanism and psychosis outcomes in a systematic fashion, capitalizing on longitudinal data to investigate temporal ordering. In this article, we argue for testing ecological interventionist causal models that draw on real world and real-time delivered, ecological momentary interventions for generating evidence on several causal criteria (association, time order, and direction/sole plausibility) under real-world conditions, while maximizing generalizability to social contexts and experiences in heterogeneous populations. Specifically, this approach tests whether ecological momentary interventions can (1) modify a putative mechanism and (2) produce changes in the mechanism that lead to sustainable changes in intended psychosis outcomes in individuals' daily lives. Future research using this approach will provide translational evidence on the active ingredients of mobile health and in-person interventions that promote sustained effectiveness of ecological momentary interventions and, thereby, contribute to ongoing efforts that seek to enhance effectiveness of psychological interventions under real-world conditions.
Reininghaus, Ulrich; Depp, Colin A.; Myin-Germeys, Inez
2016-01-01
Integrated models of psychotic disorders have posited a number of putative psychological mechanisms that may contribute to the development of psychotic symptoms, but it is only recently that a modest amount of experience sampling research has provided evidence on their role in daily life, outside the research laboratory. A number of methodological challenges remain in evaluating specificity of potential causal links between a given psychological mechanism and psychosis outcomes in a systematic fashion, capitalizing on longitudinal data to investigate temporal ordering. In this article, we argue for testing ecological interventionist causal models that draw on real world and real-time delivered, ecological momentary interventions for generating evidence on several causal criteria (association, time order, and direction/sole plausibility) under real-world conditions, while maximizing generalizability to social contexts and experiences in heterogeneous populations. Specifically, this approach tests whether ecological momentary interventions can (1) modify a putative mechanism and (2) produce changes in the mechanism that lead to sustainable changes in intended psychosis outcomes in individuals’ daily lives. Future research using this approach will provide translational evidence on the active ingredients of mobile health and in-person interventions that promote sustained effectiveness of ecological momentary interventions and, thereby, contribute to ongoing efforts that seek to enhance effectiveness of psychological interventions under real-world conditions. PMID:26707864
Statistical mechanics of lattice Boson field theory
International Nuclear Information System (INIS)
1976-01-01
A lattice approximation to Euclidean, boson quantum field theory is expressed in terms of the thermodynamic properties of a classical statistical mechanical system near its critical point in a sufficiently general way to permit the inclusion of an anomalous dimension of the vacuum. Using the thermodynamic properties of the Ising model, one can begin to construct nontrivial (containing scattering) field theories in 2, 3 and 4 dimensions. It is argued that, depending on the choice of the bare coupling constant, there are three types of behavior to be expected: the perturbation theory region, the renormalization group fixed point region, and the Ising model region
Quantum field theory and statistical mechanics
International Nuclear Information System (INIS)
Jegerlehner, F.
1975-01-01
At first a heuristic understanding is given of how the relation between quantum field theory and statistical mechanics near phase transitions comes about. A long-range scale-invariant theory is constructed, critical indices are calculated and the relations among them are proved, field-theoretical Kadanoff scale transformations are formulated and scaling corrections calculated. A precise meaning is given to many of Kadanoff's considerations, and a model matching Wegner's phenomenological scheme is presented. It is shown that soft parametrization is most transparent for the discussion of scaling behaviour. (BJ) [de]
Statistical mechanics of spatial evolutionary games
International Nuclear Information System (INIS)
Miekisz, Jacek
2004-01-01
We discuss the long-run behaviour of stochastic dynamics of many interacting players in spatial evolutionary games. In particular, we investigate the effect of the number of players and the noise level on the stochastic stability of Nash equilibria. We discuss similarities and differences between systems of interacting players maximizing their individual payoffs and particles minimizing their interaction energy. We use concepts and techniques of statistical mechanics to study game-theoretic models. In order to obtain results in the case of the so-called potential games, we analyse the thermodynamic limit of the appropriate models of interacting particles
Statistical mechanics out of equilibrium the irreversibility
International Nuclear Information System (INIS)
Alvarez Estrada, R. F.
2001-01-01
A Round Table on the issue of irreversibility and related matters took place during the last (20th) Statistical Mechanics Conference, held in Paris (July 1998). This article tries to provide a view (necessarily limited, and hence incomplete) of some approaches to the subject: the one based upon deterministic chaos (which is currently giving rise to very active research) and the classical interpretation due to Boltzmann. An attempt has been made to write this article in a self-contained way, and to avoid a technical presentation wherever possible. (Author) 29 refs
Principles of thermodynamics and statistical mechanics
Lawden, D F
2005-01-01
A thorough exploration of the universal principles of thermodynamics and statistical mechanics, this volume explains the applications of these essential rules to a multitude of situations arising in physics and engineering. It develops their use in a variety of circumstances-including those involving gases, crystals, and magnets-in order to illustrate general methods of analysis and to provide readers with all the necessary background to continue in greater depth with specific topics.Author D. F. Lawden has considerable experience in teaching this subject to university students of varied abili
Thermodynamics and statistical mechanics an integrated approach
Hardy, Robert J
2014-01-01
This textbook brings together the fundamentals of the macroscopic and microscopic aspects of thermal physics by presenting thermodynamics and statistical mechanics as complementary theories based on small numbers of postulates. The book is designed to give the instructor flexibility in structuring courses for advanced undergraduates and/or beginning graduate students and is written on the principle that a good text should also be a good reference. The presentation of thermodynamics follows the logic of Clausius and Kelvin while relating the concepts involved to familiar phenomena and the mod
Statistical mechanics of lattice boson field theory
International Nuclear Information System (INIS)
Baker, G.A. Jr.
1977-01-01
A lattice approximation to Euclidean, boson quantum field theory is expressed in terms of the thermodynamic properties of a classical statistical mechanical system near its critical point in a sufficiently general way to permit the inclusion of an anomalous dimension of the vacuum. Using the thermodynamic properties of the Ising model, one can begin to construct nontrivial (containing scattering) field theories in 2, 3, and 4 dimensions. It is argued that, depending on the choice of the bare coupling constant, there are three types of behavior to be expected: the perturbation theory region, the renormalization group fixed point region, and the Ising model region. 24 references
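The Ising model whose thermodynamic properties the abstract builds on can be sampled with a few lines of Metropolis Monte Carlo. This is a hedged sketch of that classical statistical-mechanical system only, not of the field-theory construction itself; lattice size, temperature (in units of J/k_B), and sweep count are illustrative.

```python
import math
import random

def metropolis_ising(L=16, T=1.5, sweeps=200, seed=3):
    """Metropolis sampling of the 2-D Ising model; returns |magnetization|."""
    rng = random.Random(seed)
    spin = [[1] * L for _ in range(L)]       # cold (fully ordered) start
    for _ in range(sweeps * L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        nb = (spin[(i + 1) % L][j] + spin[(i - 1) % L][j]
              + spin[i][(j + 1) % L] + spin[i][(j - 1) % L])
        dE = 2 * spin[i][j] * nb             # energy cost of flipping (i, j)
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            spin[i][j] *= -1
    return abs(sum(map(sum, spin))) / (L * L)

# Well below the critical temperature T_c ~ 2.269 the magnetization stays
# close to 1; near T_c it drops towards zero.
print(metropolis_ising())
```

Tuning T towards the critical point is what puts the system in the "renormalization group fixed point region" the abstract distinguishes.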
Early years of Computational Statistical Mechanics
Mareschal, Michel
2018-05-01
Evidence that a model of hard spheres exhibits a first-order solid-fluid phase transition was provided in the late fifties by two new numerical techniques known as Monte Carlo and Molecular Dynamics. This result can be considered the starting point of computational statistical mechanics: at the time, it was a confirmation of a counter-intuitive (and controversial) theoretical prediction by J. Kirkwood. It necessitated an intensive collaboration between the Los Alamos team, with Bill Wood developing the Monte Carlo approach, and the Livermore group, where Berni Alder was inventing Molecular Dynamics. This article tells how it happened.
Statistical mechanics and the foundations of thermodynamics
International Nuclear Information System (INIS)
Martin-Loef, A.
1979-01-01
These lectures are designed as an introduction to classical statistical mechanics and its relation to thermodynamics. They are intended to bridge the gap between the treatment of the subject in physics text books and the modern presentations of mathematically rigorous results. We shall first introduce the probability distributions (ensembles) appropriate for describing systems in equilibrium and consider some of their basic physical applications. We also discuss the problem of approach to equilibrium and how irreversibility comes into the dynamics. We then give a detailed description of how the law of large numbers for macrovariables in equilibrium is derived from the fact that entropy is an extensive quantity in the thermodynamic limit. We show in a natural way how to split the energy changes in a thermodynamic process into work and heat, leading to a derivation of the first and second laws of thermodynamics from the rules of thermodynamical equilibrium. We have elaborated this part in detail because we feel it is quite satisfactory that the establishment of the limit of thermodynamic functions, as achieved in the modern development of the mathematical aspects of statistical mechanics, allows a more general and logically clearer presentation of the bases of thermodynamics. We close these lectures by presenting the basic facts about fluctuation theory. The treatment aims to be reasonably self-contained both concerning the physics and the mathematics needed. No knowledge of quantum mechanics is presupposed. Since we spend a large part on mathematical proofs and give many technical facts, these lectures are probably most digestible for the mathematically inclined reader who wants to understand the physics of the subject. (HJ)
International Nuclear Information System (INIS)
Tadaki, Kohtaro
2010-01-01
The statistical mechanical interpretation of algorithmic information theory (AIT, for short) was introduced and developed by our former works [K. Tadaki, Local Proceedings of CiE 2008, pp. 425-434, 2008] and [K. Tadaki, Proceedings of LFCS'09, Springer's LNCS, vol. 5407, pp. 422-440, 2009], where we introduced the notion of thermodynamic quantities, such as partition function Z(T), free energy F(T), energy E(T), statistical mechanical entropy S(T), and specific heat C(T), into AIT. We then discovered that, in the interpretation, the temperature T equals the partial randomness of the values of all these thermodynamic quantities, where the notion of partial randomness is a stronger representation of the compression rate by means of program-size complexity. Furthermore, we showed that this situation holds for the temperature T itself, which is one of the most typical thermodynamic quantities. Namely, we showed that, for each of the thermodynamic quantities Z(T), F(T), E(T), and S(T) above, the computability of its value at temperature T gives a sufficient condition for the partial randomness of T ∈ (0,1) to equal T. In this paper, based on a physical argument on the same level of mathematical rigor as normal statistical mechanics in physics, we develop a total statistical mechanical interpretation of AIT which actualizes a perfect correspondence to normal statistical mechanics. We do this by identifying a microcanonical ensemble in the framework of AIT. As a result, we clarify the statistical mechanical meaning of the thermodynamic quantities of AIT.
Two statistical mechanics aspects of complex networks
Thurner, Stefan; Biely, Christoly
2006-12-01
By adopting an ensemble interpretation of non-growing rewiring networks, network theory can be reduced to a counting problem of possible network states and an identification of their associated probabilities. We present two scenarios of how different rewiring schemes can be used to control the state probabilities of the system. In particular, we review how, by generalizing the linking rules of random graphs in combination with superstatistics and quantum mechanical concepts, one can establish an exact relation between the degree distribution of any given network and the nodes’ linking probability distributions. In a second approach, we control state probabilities by a network Hamiltonian, whose characteristics are motivated by biological and socio-economical statistical systems. We demonstrate that a thermodynamics of networks becomes a fully consistent concept, allowing one to study e.g. ‘phase transitions’ and to compute entropies through thermodynamic relations.
Statistical mechanics of magnetized pair Fermi gas
International Nuclear Information System (INIS)
Daicic, J.; Frankel, N.E.; Kowalenko, V.
1993-01-01
Following previous work on the magnetized pair Bose gas, this contribution presents the statistical mechanics of the charged relativistic Fermi gas with pair creation in d spatial dimensions. Initially, the gas in the absence of external fields is studied. As a result, expansions for the various thermodynamic functions are obtained both in the μ/m→0 (neutrino) limit and about the point μ/m=1, where μ is the chemical potential. The thermodynamics of a gas of quantum-number-conserving massless fermions is also discussed. Then a complete study of the pair Fermi gas in a homogeneous magnetic field is presented, investigating the behavior of the magnetization over a wide range of field strengths. The inclusion of pairs leads to new results for the net magnetization due to the paramagnetic moment of the spins and the diamagnetic Landau orbits. 20 refs
Thermodynamics and statistical mechanics an integrated approach
Shell, M Scott
2015-01-01
Learn classical thermodynamics alongside statistical mechanics with this fresh approach to the subjects. Molecular and macroscopic principles are explained in an integrated, side-by-side manner to give students a deep, intuitive understanding of thermodynamics and equip them to tackle future research topics that focus on the nanoscale. Entropy is introduced from the get-go, providing a clear explanation of how the classical laws connect to the molecular principles, and closing the gap between the atomic world and thermodynamics. Notation is streamlined throughout, with a focus on general concepts and simple models, for building basic physical intuition and gaining confidence in problem analysis and model development. Well over 400 guided end-of-chapter problems are included, addressing conceptual, fundamental, and applied skill sets. Numerous worked examples are also provided together with handy shaded boxes to emphasize key concepts, making this the complete teaching package for students in chemical engineer...
Current algebra, statistical mechanics and quantum models
Vilela Mendes, R.
2017-11-01
Results obtained in the past for free boson systems at zero and nonzero temperatures are revisited to clarify the physical meaning of current algebra reducible functionals which are associated to systems with density fluctuations, leading to observable effects on phase transitions. To use current algebra as a tool for the formulation of quantum statistical mechanics amounts to the construction of unitary representations of diffeomorphism groups. Two mathematical equivalent procedures exist for this purpose. One searches for quasi-invariant measures on configuration spaces, the other for a cyclic vector in Hilbert space. Here, one argues that the second approach is closer to the physical intuition when modelling complex systems. An example of application of the current algebra methodology to the pairing phenomenon in two-dimensional fermion systems is discussed.
Statistical mechanics of budget-constrained auctions
International Nuclear Information System (INIS)
Altarelli, F; Braunstein, A; Realpe-Gomez, J; Zecchina, R
2009-01-01
Finding the optimal assignment in budget-constrained auctions is a combinatorial optimization problem with many important applications, a notable example being in the sale of advertisement space by search engines (in this context the problem is often referred to as the off-line AdWords problem). On the basis of the cavity method of statistical mechanics, we introduce a message-passing algorithm that is capable of solving efficiently random instances of the problem extracted from a natural distribution, and we derive from its properties the phase diagram of the problem. As the control parameter (average value of the budgets) is varied, we find two phase transitions delimiting a region in which long-range correlations arise
Generalized bond percolation and statistical mechanics
International Nuclear Information System (INIS)
Tsallis, C.
1978-05-01
A generalization of traditional bond percolation is performed, in the sense that bonds now have the possibility of partially transmitting the information (a fact which leads to the concept of 'fidelity' of the bond), and also in the sense that, besides the normal tendency to equiprobability, the bonds are allowed to substantially change the information. Furthermore, the fidelity is allowed to become a random variable, and the operational rules concerning the associated distribution laws are determined. Thermally quenched random bonds and the whole body of statistical mechanics become particular cases of this formalism, which is in general adapted to the treatment of all problems whose main characteristic is to preserve a part of the information through a long path or array (critical phenomena, regime changes, thermal random models, etc). Operationally it provides a quick method for the calculation of the equivalent probability of complex clusters within the traditional bond percolation problem.
Statistical mechanics of dense granular media
International Nuclear Information System (INIS)
Coniglio, A; Fierro, A; Nicodemi, M; Ciamarra, M Pica; Tarzia, M
2005-01-01
We discuss some recent results on the statistical mechanics approach to dense granular media. In particular, by analytical mean-field investigation we derive the phase diagram of monodisperse and bidisperse granular assemblies. We show that 'jamming' corresponds to a phase transition from a 'fluid' to a 'glassy' phase, observed when crystallization is avoided. The nature of such a 'glassy' phase turns out to be the same as found in mean-field models for glass formers. This gives quantitative evidence for the idea of a unified description of the 'jamming' transition in granular media and thermal systems, such as glasses. We also discuss mixing/segregation transitions in binary mixtures and their connections to phase separation and 'geometric' effects.
Statistical mechanics of driven diffusive systems
Schmittmann, B
1995-01-01
Far-from-equilibrium phenomena, while abundant in nature, are not nearly as well understood as their equilibrium counterparts. On the theoretical side, progress is slowed by the lack of a simple framework, such as the Boltzmann-Gibbs paradigm in the case of equilibrium thermodynamics. On the experimental side, the enormous structural complexity of real systems poses serious obstacles to comprehension. Similar difficulties have been overcome in equilibrium statistical mechanics by focusing on model systems. Even if they seem too simplistic for known physical systems, models give us considerable insight, provided they capture the essential physics. They serve as important theoretical testing grounds where the relationship between the generic physical behavior and the key ingredients of a successful theory can be identified and understood in detail. Within the vast realm of non-equilibrium physics, driven diffusive systems form a subset with particularly interesting properties. As a prototype model for these syst...
Directory of Open Access Journals (Sweden)
R. Eric Heidel
2016-01-01
Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.
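An a priori sample size calculation of the kind described above can be sketched for the common case of comparing two group means. The normal-approximation formula below is a standard textbook calculation, not the framework of the abstract; the effect size and thresholds are illustrative.

```python
import math
from statistics import NormalDist

def two_sample_n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate n per group for a two-sided two-sample comparison.

    effect_size is Cohen's d, the standardized mean difference."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)   # critical value for the two-sided test
    z_beta = z(power)            # quantile corresponding to desired power
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# A medium effect (d = 0.5) at alpha = 0.05 and 80% power:
print(two_sample_n_per_group(0.5))  # 63 per group (t-based methods give ~64)
```

The formula makes the abstract's point concrete: the required n is driven by the chosen error rates and, quadratically, by the anticipated effect size.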
BOOK REVIEW: Statistical Mechanics of Turbulent Flows
Cambon, C.
2004-10-01
This is a handbook for a computational approach to reacting flows, including background material on statistical mechanics. In this sense, the title is somewhat misleading with respect to other books dedicated to the statistical theory of turbulence (e.g. Monin and Yaglom). In the present book, emphasis is placed on modelling (engineering closures) for computational fluid dynamics. The probabilistic (pdf) approach is applied to the local scalar field, motivated first by the nonlinearity of chemical source terms which appear in the transport equations of reacting species. The probabilistic and stochastic approaches are also used for the velocity field and particle position; nevertheless they are essentially limited to Lagrangian models for a local vector, with only single-point statistics, as for the scalar. Accordingly, conventional techniques, such as single-point closures for RANS (Reynolds-averaged Navier-Stokes) and subgrid-scale models for LES (large-eddy simulations), are described and in some cases reformulated using underlying Langevin models and filtered pdfs. Even if the theoretical approach to turbulence is not discussed in general, the essentials of probabilistic and stochastic-processes methods are described, with a useful reminder concerning statistics at the molecular level. The book comprises 7 chapters. Chapter 1 briefly states the goals and contents, with a very clear synoptic scheme on page 2. Chapter 2 presents definitions and examples of pdfs and related statistical moments. Chapter 3 deals with stochastic processes, pdf transport equations, from Kramers-Moyal to Fokker-Planck (for Markov processes), and moments equations. Stochastic differential equations are introduced and their relationship to pdfs described. This chapter ends with a discussion of stochastic modelling. The equations of fluid mechanics and thermodynamics are addressed in chapter 4. Classical conservation equations (mass, velocity, internal energy) are derived from their
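The relationship between stochastic differential equations and their pdfs mentioned in the review can be illustrated with the simplest Langevin model. This is a hedged sketch, not material from the book: Euler-Maruyama integration of the Ornstein-Uhlenbeck equation dX = -theta*X dt + sigma dW, whose stationary pdf is Gaussian with variance sigma^2/(2*theta). All parameters are illustrative.

```python
import math
import random

def ou_stationary_variance(theta=1.0, sigma=1.0, dt=0.01,
                           steps=200000, seed=4):
    """Estimate the stationary variance of an Ornstein-Uhlenbeck process."""
    rng = random.Random(seed)
    x, samples = 0.0, []
    for t in range(steps):
        # Euler-Maruyama step: drift -theta*x plus Gaussian noise of
        # standard deviation sigma*sqrt(dt)
        x += -theta * x * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        if t > steps // 10:                  # discard the initial transient
            samples.append(x)
    mean = sum(samples) / len(samples)
    return sum((s - mean) ** 2 for s in samples) / len(samples)

# The estimate should sit near the Fokker-Planck prediction of
# sigma^2 / (2*theta) = 0.5 for these parameters.
print(round(ou_stationary_variance(), 2))
```

Matching the sampled variance against the Fokker-Planck prediction is exactly the SDE-to-pdf correspondence the chapter develops.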
Stagg, Camille L.; Schoolmaster, Donald; Krauss, Ken W.; Cormier, Nicole; Conner, William H.
2017-01-01
Coastal wetlands significantly contribute to global carbon storage potential. Sea-level rise and other climate change-induced disturbances threaten coastal wetland sustainability and carbon storage capacity. It is critical that we understand the mechanisms controlling wetland carbon loss so that we can predict and manage these resources in anticipation of climate change. However, our current understanding of the mechanisms that control soil organic matter decomposition, in particular the impacts of elevated salinity, are limited, and literature reports are contradictory. In an attempt to improve our understanding of these complex processes, we measured root and rhizome decomposition and developed a causal model to identify and quantify the mechanisms that influence soil organic matter decomposition in coastal wetlands that are impacted by sea-level rise. We identified three causal pathways: 1) a direct pathway representing the effects of flooding on soil moisture, 2) a direct pathway representing the effects of salinity on decomposer microbial communities and soil biogeochemistry, and 3) an indirect pathway representing the effects of salinity on litter quality through changes in plant community composition over time. We used this model to test the effects of alternate scenarios on the response of tidal freshwater forested wetlands and oligohaline marshes to short- and long-term climate-induced disturbances of flooding and salinity. In tidal freshwater forested wetlands, the model predicted less decomposition in response to drought, hurricane salinity pulsing, and long-term sea-level rise. In contrast, in the oligohaline marsh, the model predicted no change in response to sea-level rise, and increased decomposition following a drought or a hurricane salinity pulse. Our results show that it is critical to consider the temporal scale of disturbance and the magnitude of exposure when assessing the effects of salinity intrusion on carbon mineralization in coastal
Causal and causally separable processes
Oreshkov, Ognyan; Giarmatzi, Christina
2016-09-01
The idea that events are equipped with a partial causal order is central to our understanding of physics in the tested regimes: given two pointlike events A and B, either A is in the causal past of B, B is in the causal past of A, or A and B are space-like separated. Operationally, the meaning of these order relations corresponds to constraints on the possible correlations between experiments performed in the vicinities of the respective events: if A is in the causal past of B, an experimenter at A could signal to an experimenter at B but not the other way around, while if A and B are space-like separated, no signaling is possible in either direction. In the context of a concrete physical theory, the correlations compatible with a given causal configuration may obey further constraints. For instance, space-like correlations in quantum mechanics arise from local measurements on joint quantum states, while time-like correlations are established via quantum channels. Similarly to other variables, however, the causal order of a set of events could be random, and little is understood about the constraints that causality implies in this case. A main difficulty concerns the fact that the order of events can now generally depend on the operations performed at the locations of these events, since, for instance, an operation at A could influence the order in which B and C occur in A’s future. So far, no formal theory of causality compatible with such dynamical causal order has been developed. Apart from being of fundamental interest in the context of inferring causal relations, such a theory is imperative for understanding recent suggestions that the causal order of events in quantum mechanics can be indefinite. Here, we develop such a theory in the general multipartite case. Starting from a background-independent definition of causality, we derive an iteratively formulated canonical decomposition of multipartite causal correlations. For a fixed number of settings and
Aftershock Energy Distribution by Statistical Mechanics Approach
Daminelli, R.; Marcellini, A.
2015-12-01
The aim of our work is to find the most probable distribution of the energy of aftershocks. We started by applying one of the fundamental principles of statistical mechanics, which, in the case of aftershock sequences, can be expressed as: the greater the number of different ways in which the energy of aftershocks can be arranged among the energy cells in phase space, the more probable the distribution. We assume that each cell in phase space has the same probability of being occupied, and that more than one cell in the phase space can have the same energy. Since seismic energy is proportional to products of different parameters, a number of different combinations of parameters can produce the same energy (e.g., different combinations of stress drop and fault area can release the same seismic energy). Let us assume that there are gi cells in the aftershock phase space characterised by the same released energy ɛi. We can therefore assume that Maxwell-Boltzmann statistics applies to aftershock sequences, with the proviso that the judgment on the validity of this hypothesis is the agreement with the data. The aftershock energy distribution can therefore be written as follows: n(ɛ) = A g(ɛ) exp(-βɛ), where n(ɛ) is the number of aftershocks with energy ɛ, and A and β are constants. Under the above hypothesis, we can assume g(ɛ) is proportional to ɛ. We selected and analysed different aftershock sequences (data extracted from earthquake catalogues of SCEC, of INGV-CNT and other institutions) with a minimum retained magnitude ML=2 (in some cases ML=2.6) and a time window of 35 days. The results of our model are in agreement with the data, except in the very low energy band, where our model resulted in a moderate overestimation.
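With g(ɛ) proportional to ɛ, the abstract's ansatz n(ɛ) = A g(ɛ) exp(-βɛ) is, up to normalization, a Gamma distribution with shape 2 and scale 1/β, so its mean is 2/β. The sketch below is illustrative only: synthetic energies stand in for a real catalogue, and β is recovered by the method of moments (not the authors' fitting procedure).

```python
import random

def fit_beta(energies):
    """Method-of-moments estimate: mean of the shape-2 Gamma is 2/beta."""
    return 2.0 / (sum(energies) / len(energies))

# Synthetic "aftershock energies" drawn from the shape-2 Gamma ansatz
# with a known beta, then recovered from the sample mean.
rng = random.Random(1)
beta_true = 0.5
energies = [rng.gammavariate(2.0, 1.0 / beta_true) for _ in range(100000)]
beta_hat = fit_beta(energies)
print(round(beta_hat, 3))  # close to the true beta of 0.5
```

Comparing the fitted curve A ɛ exp(-β̂ɛ) against a binned catalogue is then a direct check of the Maxwell-Boltzmann hypothesis the abstract proposes.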
Petrasek MacDonald, Joanna; Ford, James D.; Cunsolo Willox, Ashlee; Ross, Nancy A.
2013-01-01
Objectives To review the protective factors and causal mechanisms which promote and enhance Indigenous youth mental health in the Circumpolar North. Study design A systematic literature review of peer-reviewed English-language research was conducted to examine the protective factors and causal mechanisms which promote and enhance Indigenous youth mental health in the Circumpolar North. Methods This review followed the Preferred Reporting Items for Systematic Reviews and Meta-An...
A quantum information approach to statistical mechanics
International Nuclear Information System (INIS)
Cuevas, G.
2011-01-01
The field of quantum information and computation harnesses and exploits the properties of quantum mechanics to perform tasks more efficiently than their classical counterparts, or that may uniquely be possible in the quantum world. Its findings and techniques have been applied to a number of fields, such as the study of entanglement in strongly correlated systems, new simulation techniques for many-body physics or, more generally, quantum optics. This thesis aims at broadening the scope of quantum information theory by applying it to problems in statistical mechanics. We focus on classical spin models, which are toy models used in a variety of systems, ranging from magnetism, neural networks, to quantum gravity. We tackle these models using quantum information tools from three different angles. First, we show how the partition function of a class of widely different classical spin models (models in different dimensions, different types of many-body interactions, different symmetries, etc) can be mapped to the partition function of a single model. We prove this by first establishing a relation between partition functions and quantum states, and then transforming the corresponding quantum states to each other. Second, we give efficient quantum algorithms to estimate the partition function of various classical spin models, such as the Ising or the Potts model. The proof is based on a relation between partition functions and quantum circuits, which allows us to determine the quantum computational complexity of the partition function by studying the corresponding quantum circuit. Finally, we outline the possibility of applying quantum information concepts and tools to certain models of discrete quantum gravity. The latter provide a natural route to generalize our results, insofar as the central quantity has the form of a partition function, and as classical spin models are used as toy models of matter. (author)
The Brandeis Dice Problem and Statistical Mechanics
van Enk, Steven J.
2014-11-01
Jaynes invented the Brandeis Dice Problem as a simple illustration of the MaxEnt (Maximum Entropy) procedure that he had demonstrated to work so well in Statistical Mechanics. I construct here two alternative solutions to his toy problem. One, like Jaynes' solution, uses MaxEnt and yields an analog of the canonical ensemble, but at a different level of description. The other uses Bayesian updating and yields an analog of the micro-canonical ensemble. Both, unlike Jaynes' solution, yield error bars, whose operational merits I discuss. These two alternative solutions are not equivalent for the original Brandeis Dice Problem, but become so in what must, therefore, count as the analog of the thermodynamic limit, M-sided dice with M → ∞. Whereas the mathematical analogies between the dice problem and Stat Mech are quite close, there are physical properties that the former lacks but that are crucial to the workings of the latter. Stat Mech is more than just MaxEnt.
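Jaynes' MaxEnt solution referred to above can be computed directly: among all distributions over the faces 1..6, the maximum-entropy one with mean 4.5 has the exponential form p_k ∝ exp(-λk), with λ fixed by the mean constraint. The sketch below finds λ by bisection; it reproduces the textbook calculation, not van Enk's alternative solutions.

```python
import math

def maxent_die(target_mean=4.5, faces=tuple(range(1, 7))):
    """MaxEnt distribution over die faces subject to a mean constraint."""
    def mean(lam):
        w = [math.exp(-lam * k) for k in faces]
        return sum(k * wk for k, wk in zip(faces, w)) / sum(w)
    lo, hi = -10.0, 10.0          # mean(lam) decreases as lam grows
    for _ in range(200):
        mid = (lo + hi) / 2
        if mean(mid) > target_mean:
            lo = mid              # mean still too high: need larger lam
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * k) for k in faces]
    z = sum(w)
    return [wk / z for wk in w]

p = maxent_die()
print([round(x, 4) for x in p])  # weights rise towards the high faces
```

Since the observed mean 4.5 exceeds the fair-die mean 3.5, λ comes out negative and the distribution tilts towards the high faces, which is the canonical-ensemble analogy the abstract draws.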
The Statistical Mechanics of Ideal MHD Turbulence
Shebalin, John V.
2003-01-01
Turbulence is a universal, nonlinear phenomenon found in all energetic fluid and plasma motion. In particular, understanding magnetohydrodynamic (MHD) turbulence and incorporating its effects in the computation and prediction of the flow of ionized gases in space, for example, are great challenges that must be met if such computations and predictions are to be meaningful. Although a general solution to the "problem of turbulence" does not exist in closed form, numerical integrations allow us to explore the phase space of solutions for both ideal and dissipative flows. For homogeneous, incompressible turbulence, Fourier methods are appropriate, and phase space is defined by the Fourier coefficients of the physical fields. In the case of ideal MHD flows, a fairly robust statistical mechanics has been developed, in which the symmetry and ergodic properties of phase space are understood. A discussion of these properties will illuminate our principal discovery: Coherent structure and randomness co-exist in ideal MHD turbulence. For dissipative flows, as opposed to ideal flows, progress beyond the dimensional analysis of Kolmogorov has been difficult. Here, some possible future directions that draw on the ideal results will also be discussed. Our conclusion will be that while ideal turbulence is now well understood, real turbulence still presents great challenges.
Plasma Soliton Turbulence and Statistical Mechanics
International Nuclear Information System (INIS)
Treumann, R.A.; Pottelette, R.
1999-01-01
Collisionless kinetic plasma turbulence is described approximately in terms of a superposition of non-interacting solitary waves. We discuss the relevance of such a description under astrophysical conditions. Several types of solitary waves may be of interest in this relation as generators of turbulence and turbulent transport. A consistent theory of turbulence can be given only in a few particular cases when the description can be reduced to the Korteweg-de Vries equation or some other simple equation like the Kadomtsev-Petviashvili equation. It turns out that the soliton turbulence is usually energetically harder than the ordinary weakly turbulent plasma description. This implies that interaction of particles with such kinds of turbulence can lead to stronger acceleration than in ordinary turbulence. However, the description in our model is only classical and non-relativistic. Transport in solitary turbulence is most important for drift wave turbulence. Such waves form solitary drift wave vortices which may provide cross-field transport. A more general discussion is given on transport. In a model of Levy flight trapping of particles in solitons (or solitary turbulence) one finds that the residence time of particles in the region of turbulence may be described by a generalized Lorentzian probability distribution. It is shown that under collisionless equilibrium conditions far away from thermal equilibrium such distributions are natural equilibrium distributions. A consistent thermodynamic description of such media can be given in terms of a generalized Lorentzian statistical mechanics and thermodynamics. (author)
Statistical mechanics, gravity, and Euclidean theory
International Nuclear Information System (INIS)
Fursaev, Dmitri V.
2002-01-01
A review of computations of free energy for Gibbs states on stationary but not static gravitational and gauge backgrounds is given. On these backgrounds, wave equations for free fields are reduced to eigenvalue problems which depend non-linearly on the spectral parameter. We present a method to deal with such problems. In particular, we demonstrate how some results of the spectral theory of second-order elliptic operators, such as heat kernel asymptotics, can be extended to a class of non-linear spectral problems. The method is used to trace the relation between the canonical definition of the free energy based on summation over the modes and the covariant definition given in Euclidean quantum gravity. As an application, high-temperature asymptotics of the free energy and of the thermal part of the stress-energy tensor in the presence of rotation are derived. We also discuss statistical mechanics in the presence of Killing horizons, where canonical and Euclidean theories are related in a non-trivial way.
Probabilistic cellular automata: Some statistical mechanical considerations
International Nuclear Information System (INIS)
Lebowitz, J.L.; Maes, C.; Speer, E.R.
1990-01-01
Spin systems evolving in continuous or discrete time under the action of stochastic dynamics are used to model phenomena as diverse as the structure of alloys and the functioning of neural networks. While in some cases the dynamics are secondary, designed to produce a specific stationary measure whose properties one is interested in studying, there are other cases in which the only available information is the dynamical rule. Prime examples of the former are computer simulations, via Glauber dynamics, of equilibrium Gibbs measures with a specified interaction potential. Examples of the latter include various types of majority rule dynamics used as models for pattern recognition and for error-tolerant computations. The present note discusses ways in which techniques found useful in equilibrium statistical mechanics can be applied to a particular class of models of the latter types. These are cellular automata with noise: systems in which the spins are updated stochastically at integer times, simultaneously at all sites of some regular lattice. These models were first investigated in detail in the Soviet literature of the late sixties and early seventies. They are now generally referred to as Stochastic or Probabilistic Cellular Automata (PCA), and may be considered to include deterministic automata (CA) as special limits. 16 refs., 3 figs
International Nuclear Information System (INIS)
Huang, Fuqun; Smidts, Carol
2017-01-01
Understanding cause-effect relations between concepts in software dependability engineering is fundamental to various research and industrial activities. Cognitive maps are traditionally used to elicit and represent such knowledge; however, they seem incapable of accurately representing complex causal mechanisms in dependability engineering. This paper proposes a new notation called Causal Mechanism Graph (CMG) to elicit and represent the cause-effect domain knowledge embedded in experts’ minds or described in the literature. CMG contains a new set of symbols elicited from domain experts to capture the recurring interaction mechanisms between multiple concepts in software dependability engineering. Furthermore, compared to major existing graphic methods, CMG is particularly robust and suitable for mental knowledge elicitation: it allows one to represent the full range of cause-effect knowledge, accurately or fuzzily as one sees fit depending on the depth of one's knowledge. This feature, combined with excellent reliability and validity, positions CMG as a promising method with the potential to be used in various areas, such as software dependability requirement elicitation, software dependability assessment and dependability risk control. - Highlights: • A new notation, CMG, for capturing cause-effect conceptual knowledge in software dependability. • CMG is particularly robust and suitable for mental knowledge representation. • CMG is a visual representation that bridges mental knowledge, natural and mathematical language. • CMG possesses excellent representation capability, validity and inter-coder reliability. • CMG is a fundamental method for various areas in dependability engineering.
A statistical mechanical model of economics
Lubbers, Nicholas Edward Williams
Statistical mechanics pursues low-dimensional descriptions of systems with a very large number of degrees of freedom. I explore this theme in two contexts. The main body of this dissertation explores and extends the Yard Sale Model (YSM) of economic transactions using a combination of simulations and theory. The YSM is a simple interacting model for wealth distributions which has the potential to explain the empirical observation of Pareto distributions of wealth. I develop the link between wealth condensation and the breakdown of ergodicity due to nonlinear diffusion effects which are analogous to the geometric random walk. Using this, I develop a deterministic effective theory of wealth transfer in the YSM that is useful for explaining many quantitative results. I introduce various forms of growth to the model, paying attention to the effect of growth on wealth condensation, inequality, and ergodicity. Arithmetic growth is found to partially break condensation, and geometric growth is found to completely break condensation. Further generalizations of geometric growth with growth inequality show that the system is divided into two phases by a tipping point in the inequality parameter. The tipping point marks the line between systems which are ergodic and systems which exhibit wealth condensation. I explore generalizations of the YSM transaction scheme to arbitrary betting functions to develop notions of universality in YSM-like models. I find that wealth condensation is universal to a large class of models which can be divided into two phases. The first exhibits slow, power-law condensation dynamics, and the second exhibits fast, finite-time condensation dynamics. I find that the YSM, which exhibits exponential dynamics, is the critical, self-similar model which marks the dividing line between the two phases. The final chapter develops a low-dimensional approach to materials microstructure quantification. Modern materials design harnesses complex
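The core Yard Sale transaction rule described in this abstract is simple enough to sketch directly. The following is a hypothetical minimal simulation, not the dissertation's code: the agent count, bet fraction `beta`, and step count are illustrative choices, and the Gini coefficient is used as a standard proxy for the condensation the author studies.

```python
import random

def gini(w):
    """Gini coefficient of a wealth list (0 = perfect equality, ->1 = condensation)."""
    s = sorted(w)
    n = len(s)
    weighted = sum(i * x for i, x in enumerate(s, start=1))
    return 2.0 * weighted / (n * sum(s)) - (n + 1.0) / n

def yard_sale(n_agents=100, beta=0.1, steps=50_000, seed=0):
    """Yard Sale Model sketch: each transaction moves a fraction beta of the
    *poorer* agent's wealth, won or lost on a fair coin flip, so total wealth
    is conserved while inequality drifts upward."""
    rng = random.Random(seed)
    w = [1.0] * n_agents
    for _ in range(steps):
        i, j = rng.sample(range(n_agents), 2)
        stake = beta * min(w[i], w[j])
        if rng.random() < 0.5:
            w[i] += stake; w[j] -= stake
        else:
            w[i] -= stake; w[j] += stake
    return w
```

Running this shows the signature behavior: total wealth stays fixed while the Gini coefficient grows from zero, the multiplicative (geometric-random-walk-like) nature of the stakes slowly concentrating wealth.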
A statistical mechanics approach to Granovetter theory
Barra, Adriano; Agliari, Elena
2012-05-01
In this paper we try to bridge breakthroughs in quantitative sociology/econometrics, pioneered during the last decades by McFadden, Brock-Durlauf, Granovetter and Watts-Strogatz, by introducing a minimal model able to reproduce essentially all the features of social behavior highlighted by these authors. Our model relies on a pairwise Hamiltonian for decision-maker interactions which naturally extends the multi-population approaches by shifting and biasing the pattern definitions of a Hopfield model of neural networks. Once introduced, the model is investigated through graph theory (to recover Granovetter and Watts-Strogatz results) and statistical mechanics (to recover McFadden and Brock-Durlauf results). Due to the internal symmetries of our model, the latter is obtained as the relaxation of a proper Markov process, allowing us even to study its out-of-equilibrium properties. The method used to solve its equilibrium is an adaptation of the Hamilton-Jacobi technique recently introduced by Guerra in the spin-glass scenario, and the picture obtained is the following: shifting the patterns from [-1,+1]→[0,+1] implies that the larger the amount of similarities among decision makers, the stronger their relative influence, and this is enough to explain both the different role of strong and weak ties in the social network as well as its small-world properties. As a result, imitative interaction strengths seem essentially a robust request (enough to break the gauge symmetry in the couplings); furthermore, this naturally leads to a discrete choice model when dealing with the external influences and to imitative behavior à la Curie-Weiss, like the one introduced by Brock and Durlauf.
Statistical Mechanics of Money, Income, and Wealth
Yakovenko, Victor
2006-03-01
In Ref. [1], we proposed an analogy between the exponential Boltzmann-Gibbs distribution of energy in physics and the equilibrium probability distribution of money in a closed economic system. Analogously to energy, money is locally conserved in interactions between economic agents, so the thermal Boltzmann-Gibbs distribution function is expected for money. Since then, many researchers have followed and expanded this idea [2]. Much work was done on the analysis of empirical data, mostly on income, for which a lot of tax and census data is available. We demonstrated [3] that income distribution in the USA has a well-defined two-class structure. The majority of the population (97-99%) belongs to the lower class characterized by the exponential Boltzmann-Gibbs (``thermal'') distribution. The upper class (1-3% of the population) has a Pareto power-law (``superthermal'') distribution, whose parameters change in time with the rise and fall of the stock market. We proposed a concept of equilibrium inequality in a society, based on the principle of maximal entropy, and quantitatively demonstrated that it applies to the majority of the population. Income distribution in other countries shows similar patterns. For more references, see http://www2.physics.umd.edu/˜yakovenk/econophysics.html. References: [1] A. A. Dragulescu and V. M. Yakovenko, ``Statistical mechanics of money'', Eur. Phys. J. B 17, 723 (2000). [2] ``Econophysics of Wealth Distributions'', edited by A. Chatterjee, S. Yarlagadda, and B. K. Chakrabarti, Springer, 2005. [3] A. C. Silva and V. M. Yakovenko, ``Temporal evolution of the `thermal' and `superthermal' income classes in the USA during 1983-2001'', Europhys. Lett. 69, 304 (2005).
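The conserved-money mechanism behind the Boltzmann-Gibbs result can be illustrated with a toy simulation. The pooled-random-split exchange rule below is one standard variant from the econophysics literature (not necessarily the exact rule of Ref. [1]), and all parameter values are illustrative; its stationary distribution is the exponential law P(m) ∝ exp(-m/T) with temperature T equal to the average money per agent.

```python
import random

def exchange_economy(n_agents=1000, steps=200_000, m0=1.0, seed=1):
    """Conserved-money toy economy: a randomly chosen pair pools its money
    and splits the pool at a uniformly random fraction.  Money stays
    non-negative and the total is conserved; the stationary single-agent
    distribution is the exponential Boltzmann-Gibbs law."""
    rng = random.Random(seed)
    m = [m0] * n_agents
    for _ in range(steps):
        i, j = rng.sample(range(n_agents), 2)
        eps = rng.random()
        total = m[i] + m[j]
        m[i], m[j] = eps * total, (1.0 - eps) * total
    return m
```

A quick check of exponentiality: for an exponential distribution with mean T, the fraction of agents holding less than the mean is 1 - 1/e ≈ 0.63, and the simulated economy settles close to that value.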
Statistical Mechanics and Applications in Condensed Matter
Di Castro, Carlo; Raimondi, Roberto
2015-08-01
Preface; 1. Thermodynamics: a brief overview; 2. Kinetics; 3. From Boltzmann to Gibbs; 4. More ensembles; 5. The thermodynamic limit and its thermodynamic stability; 6. Density matrix and quantum statistical mechanics; 7. The quantum gases; 8. Mean-field theories and critical phenomena; 9. Second quantization and Hartree-Fock approximation; 10. Linear response and fluctuation-dissipation theorem in quantum systems: equilibrium and small deviations; 11. Brownian motion and transport in disordered systems; 12. Fermi liquids; 13. The Landau theory of the second order phase transitions; 14. The Landau-Wilson model for critical phenomena; 15. Superfluidity and superconductivity; 16. The scaling theory; 17. The renormalization group approach; 18. Thermal Green functions; 19. The microscopic foundations of Fermi liquids; 20. The Luttinger liquid; 21. Quantum interference effects in disordered electron systems; Appendix A. The central limit theorem; Appendix B. Some useful properties of the Euler Gamma function; Appendix C. Proof of the second theorem of Yang and Lee; Appendix D. The most probable distribution for the quantum gases; Appendix E. Fermi-Dirac and Bose-Einstein integrals; Appendix F. The Fermi gas in a uniform magnetic field: Landau diamagnetism; Appendix G. Ising and gas-lattice models; Appendix H. Sum over discrete Matsubara frequencies; Appendix I. Hydrodynamics of the two-fluid model of superfluidity; Appendix J. The Cooper problem in the theory of superconductivity; Appendix K. Superconductive fluctuations phenomena; Appendix L. Diagrammatic aspects of the exact solution of the Tomonaga Luttinger model; Appendix M. Details on the theory of the disordered Fermi liquid; References; Author index; Index.
Statistical Mechanics of Temporal and Interacting Networks
Zhao, Kun
In the last ten years important breakthroughs in the understanding of the topology of complexity have been made in the framework of network science. Indeed it has been found that many networks belong to the universality classes called small-world networks or scale-free networks. Moreover it was found that the complex architecture of real world networks strongly affects the critical phenomena defined on these structures. Nevertheless the main focus of the research has been the characterization of single and static networks. Recently, temporal networks and interacting networks have attracted large interest. Indeed many networks are interacting or formed by a multilayer structure. Examples of such networks are found in social networks, where an individual might be at the same time part of different social networks, in economic and financial networks, in physiology or in infrastructure systems. Moreover, many networks are temporal, i.e. the links appear and disappear on fast time scales. Examples of these networks are social networks of contacts such as face-to-face interactions or mobile-phone communication, and time-dependent correlations in brain activity. Understanding the evolution of temporal and multilayer networks and characterizing critical phenomena in these systems is crucial if we want to describe, predict and control the dynamics of complex systems. In this thesis, we investigate several statistical mechanics models of temporal and interacting networks, to shed light on the dynamics of this new generation of complex networks. First, we investigate a model of temporal social networks aimed at characterizing human social interactions such as face-to-face interactions and phone-call communication. Indeed thanks to the availability of data on these interactions, we are now in the position to compare the proposed model to the real data, finding good agreement. Second, we investigate the entropy of temporal networks and growing networks, to provide
The scientific way of thinking in statistics, statistical physics and quantum mechanics
Săvoiu, Gheorghe
2008-01-01
This paper focuses on the way of thinking in both classical and modern Physics and Statistics, Statistical Mechanics or Statistical Physics and Quantum Mechanics. These different statistical ways of thinking and their specific methods have generated new fields for new activities and new scientific disciplines, like Econophysics (between Economics and Physics), Sociophysics (between Sociology and Physics), Mediaphysics (between all media and comunication sciences), etc. After describing some r...
The statistical mechanics of financial markets
Voit, Johannes
2003-01-01
From the reviews of the first edition - "Provides an excellent introduction for physicists interested in the statistical properties of financial markets. Appropriately early in the book the basic financial terms such as shorts, limit orders, puts, calls, and other terms are clearly defined. Examples, often with graphs, augment the reader’s understanding of what may be a plethora of new terms and ideas… [This is] an excellent starting point for the physicist interested in the subject. Some of the book’s strongest features are its careful definitions, its detailed examples, and the connection it establishes to physical systems." PHYSICS TODAY "This book is excellent at illustrating the similarities of financial markets with other non-equilibrium physical systems. [...] In summary, a very good book that offers more than just qualitative comparisons of physics and finance." (www.quantnotes.com) This highly-praised introductory treatment describes parallels between statistical physics and finance - both thos...
Statistical mechanics of microscopically thin thermalized shells
Kosmrlj, Andrej
The recent explosion in fabrication of microscopically thin free-standing structures made from graphene and other two-dimensional materials has led to a renewed interest in the mechanics of such structures in the presence of thermal fluctuations. Since the late 1980s it has been known that for flat solid sheets thermal fluctuations effectively increase the bending rigidity and reduce the bulk and shear moduli in a scale-dependent fashion. However, much is still unknown about the mechanics of thermalized flat sheets of complex geometries and about the mechanics of thermalized shells with non-zero background curvature. In this talk I will present recent developments in the mechanics of thermalized ribbons, spherical shells and cylindrical tubes. Long ribbons are found to behave like hybrids between flat sheets with renormalized elastic constants and semi-flexible polymers, and these results can be used to predict the mechanics of graphene kirigami structures. Contrary to the anticipated behavior for ribbons, the non-zero background curvature of shells leads to remarkable novel phenomena. In shells, thermal fluctuations effectively generate negative surface tension, which can significantly reduce the critical buckling pressure for spherical shells and the critical axial load for cylindrical tubes. For large shells this thermally generated load becomes large enough to spontaneously crush spherical shells and cylindrical tubes even in the absence of external loads. I will comment on the relevance for crushing of microscopic shells (viral capsids, bacteria, microcapsules) due to osmotic shocks and for crushing of nanotubes.
A statistical mechanical model for equilibrium ionization
International Nuclear Information System (INIS)
Macris, N.; Martin, P.A.; Pule, J.
1990-01-01
A quantum electron interacts with a classical gas of hard spheres and is in thermal equilibrium with it. The interaction is attractive and the electron can form a bound state with the classical particles. It is rigorously shown that in a well-defined low-density and low-temperature limit, the ionization probability for the electron tends to the value predicted by the Saha formula for thermal ionization. In this regime, the electron is found to be in a statistical mixture of a bound and a free state. (orig.)
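For orientation, the Saha formula that this rigorous limit recovers can be evaluated numerically. The sketch below is illustrative, not from the paper: it solves the textbook hydrogen-like Saha equation for the degree of ionization x, where x²/(1-x) = S(T)/n with S(T) the Saha function, and the default ionization energy is that of hydrogen.

```python
import math

K_B = 1.380649e-23       # Boltzmann constant, J/K
H   = 6.62607015e-34     # Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg

def saha_ionization_fraction(T, n_total, E_ion=13.6 * 1.602176634e-19):
    """Degree of ionization x in [0, 1] for a hydrogen-like gas at
    temperature T (K) and total number density n_total (m^-3), from the
    Saha equation x^2/(1-x) = S/n_total (statistical weights omitted
    for simplicity in this sketch)."""
    S = (2.0 * math.pi * M_E * K_B * T / H**2) ** 1.5 * math.exp(-E_ion / (K_B * T))
    r = S / n_total
    # Positive root of x^2 + r*x - r = 0
    return (-r + math.sqrt(r * r + 4.0 * r)) / 2.0
```

As expected physically, at fixed density the gas is essentially neutral at low temperature and almost fully ionized once kT becomes a non-negligible fraction of the ionization energy.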
Statistical mechanics of reacting dense plasmas
Energy Technology Data Exchange (ETDEWEB)
Rogers, F.J.
1978-11-22
A review of the quantum statistical theory of strongly coupled many-component plasmas is given. The theoretical development is shown to consist of six separate parts. Compensation between bound- and scattering-state contributions to the partition function and use of the shifted Debye energy levels are important aspects of the analysis. The results are valid when the electrons are moderately coupled to the heavy ions, i.e., λ_eα* < 1, but no restriction is placed on the coupling between heavy ions. Another restriction is that λ/λ_D < 1, i.e., the thermal de Broglie wavelength is less than the Debye length. Numerical calculations of PV/N₀kT and C_V are given for a rubidium plasma.
Reality, Causality, and Probability, from Quantum Mechanics to Quantum Field Theory
Plotnitsky, Arkady
2015-10-01
These three lectures consider the questions of reality, causality, and probability in quantum theory, from quantum mechanics to quantum field theory. They do so in part by exploring the ideas of the key founding figures of the theory, such as N. Bohr, W. Heisenberg, E. Schrödinger, or P. A. M. Dirac. However, while my discussion of these figures aims to be faithful to their thinking and writings, and while these lectures are motivated by my belief in the helpfulness of their thinking for understanding and advancing quantum theory, this project is not driven by loyalty to their ideas. In part for that reason, these lectures also present different and even conflicting ways of thinking in quantum theory, such as that of Bohr or Heisenberg vs. that of Schrödinger. The lectures, most especially the third one, also consider new physical, mathematical, and philosophical complexities brought in by quantum field theory vis-à-vis quantum mechanics. I close by briefly addressing some of the implications of the argument presented here for the current state of fundamental physics.
Statistical mechanics of human resource allocation
Inoue, Jun-Ichi; Chen, He
2014-03-01
We provide a mathematical platform to investigate the network topology of agents, say, university graduates who are looking for their positions in labor markets. The basic model is described by the so-called Potts spin glass, which is well known in the research field of statistical physics. In the model, each Potts spin (a tiny magnet at atomic length scales) represents the action of each student, and it takes a discrete variable corresponding to the company he/she applies for. We construct the energy to include three distinct effects on the students' behavior, namely, collective effect, market history and international ranking of companies. In this model system, the correlations (the adjacency matrix) between students are taken into account through the pairwise spin-spin interactions. We carry out computer simulations to examine the efficiency of the model. We also show that a chiral representation of the Potts spin enables us to obtain some analytical insights into our labor markets. This work was financially supported by Grant-in-Aid for Scientific Research (C) of Japan Society for the Promotion of Science No. 25330278.
Statistical Mechanics of Japanese Labor Markets
Chen, He
We introduce a probabilistic model to analyze job-matching processes of recent Japanese labor markets, in particular for university graduates, by means of statistical physics. To model the market efficiently, we take into account several hypotheses. Namely, each company fixes the (business-year-independent) number of opening positions for newcomers. The ability to gather newcomers depends on the result of the job-matching process in past business years: the company's ability weakens if it failed to fill its quota or if it gathered far more applicants than the quota. All university graduates who are looking for jobs can access public information about the ranking of companies. By assuming the above key points, we construct the local energy function of each company and describe the probability that an arbitrary company gets students in each business year by a Boltzmann-Gibbs distribution. We evaluate the relevant physical quantities such as the employment rate and Gini index. We discuss social inequalities in labor markets, and provide some ways to improve these situations, such as the informal job-offer rate and the job-worker mismatch between students and companies. Graduate School of Information Science and Technology.
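A heavily simplified, hypothetical sketch of the kind of Boltzmann-weighted job matching this abstract describes (it is not the authors' model): students apply to one company each with Gibbs probability set by the company ranking, companies hire up to fixed quotas, and the mismatch between popularity and quotas depresses the employment rate. All rankings, quotas and the inverse temperature `beta` are invented for illustration.

```python
import math
import random

def job_matching(n_students=500, rankings=(1.0, 0.8, 0.6, 0.4, 0.2),
                 quotas=(100, 100, 100, 100, 100), beta=1.0, seed=7):
    """One round of a toy job-matching market.  Each student applies to a
    single company with Boltzmann-Gibbs probability exp(beta * rank) / Z;
    a company hires at most its quota.  Returns the employment rate and
    the per-company applicant counts."""
    rng = random.Random(seed)
    weights = [math.exp(beta * r) for r in rankings]
    z = sum(weights)
    probs = [w / z for w in weights]
    applicants = [0] * len(rankings)
    for _ in range(n_students):
        u, acc, k = rng.random(), 0.0, 0
        for k, p in enumerate(probs):
            acc += p
            if u < acc:
                break
        applicants[k] += 1
    hired = sum(min(a, q) for a, q in zip(applicants, quotas))
    return hired / n_students, applicants
```

Even in this caricature the qualitative effect appears: higher-ranked companies attract applicants well beyond their quotas, so some students go unmatched although total quotas equal the number of students.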
Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…
Statistical mechanics of a cat's cradle
Shen, Tongye; Wolynes, Peter G.
2006-11-01
It is believed that, much like a cat's cradle, the cytoskeleton can be thought of as a network of strings under tension. We show that both regular and random bond-disordered networks having bonds that buckle upon compression exhibit a variety of phase transitions as a function of temperature and extension. The results of self-consistent phonon calculations for the regular networks agree very well with computer simulations at finite temperature. The analytic theory also yields a rigidity onset (mechanical percolation) and the fraction of extended bonds for random networks. There is very good agreement with the simulations by Delaney et al (2005 Europhys. Lett. 72 990). The mean field theory reveals a nontranslationally invariant phase with self-generated heterogeneity of tautness, representing 'antiferroelasticity'.
Statistical Mechanics of Thin Spherical Shells
Directory of Open Access Journals (Sweden)
Andrej Košmrlj
2017-01-01
We explore how thermal fluctuations affect the mechanics of thin amorphous spherical shells. In flat membranes with a shear modulus, thermal fluctuations increase the bending rigidity and reduce the in-plane elastic moduli in a scale-dependent fashion. This is still true for spherical shells. However, the additional coupling between the shell curvature, the local in-plane stretching modes, and the local out-of-plane undulations leads to novel phenomena. In spherical shells, thermal fluctuations produce a radius-dependent negative effective surface tension, equivalent to applying an inward external pressure. By adapting renormalization group calculations to allow for a spherical background curvature, we show that while small spherical shells are stable, sufficiently large shells are crushed by this thermally generated “pressure.” Such shells can be stabilized by an outward osmotic pressure, but the effective shell size grows nonlinearly with increasing outward pressure, with the same universal power-law exponent that characterizes the response of fluctuating flat membranes to a uniform tension.
Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but ...
Morris, Stephen P.; Edovald, Triin; Lloyd, Cheryl; Kiss, Zsolt
2016-01-01
Based on the experience of evaluating 2 cross-age peer-tutoring interventions, we argue that researchers need to pay greater attention to causal mechanisms within the context of school-based randomised controlled trials. Without studying mechanisms, researchers are less able to explain the underlying causal processes that give rise to results from…
A statistical mechanical approach to restricted integer partition functions
Zhou, Chi-Chun; Dai, Wu-Sheng
2018-05-01
The main aim of this paper is twofold: (1) suggesting a statistical mechanical approach to the calculation of the generating function of restricted integer partition functions, which count the number of partitions—a way of writing an integer as a sum of other integers under certain restrictions. In this approach, the generating function of restricted integer partition functions is constructed from the canonical partition functions of various quantum gases. (2) Introducing a new type of restricted integer partition function corresponding to a general statistics that generalizes Gentile statistics in statistical mechanics; many kinds of restricted integer partition functions are special cases of this restricted integer partition function. Moreover, with statistical mechanics as a bridge, we reveal a mathematical fact: the generating function of a restricted integer partition function is just a symmetric function, a class of functions invariant under the action of permutation groups. Using this approach, we provide some expressions of restricted integer partition functions as examples.
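The statistics-to-partitions correspondence can be made concrete with a small computation. The sketch below (illustrative, not the paper's method) expands the generating function ∏_k (1 + x^k + ... + x^{m·k}) by polynomial multiplication; a maximum repeat count m plays the role of the maximum occupation number in Gentile statistics, with m = 1 giving Fermi-like (distinct-part) partitions and m unbounded giving Bose-like (unrestricted) partitions.

```python
def restricted_partitions(n, max_repeat=None, parts=None):
    """Count partitions of n where each allowed part k appears at most
    max_repeat times (None = unrestricted), by expanding the generating
    function prod_k (1 + x^k + ... + x^{max_repeat*k}) up to degree n.
    `parts` restricts the allowed part sizes (default: 1..n)."""
    parts = parts if parts is not None else range(1, n + 1)
    coeff = [0] * (n + 1)
    coeff[0] = 1
    for k in parts:
        m = max_repeat if max_repeat is not None else n // k
        new = [0] * (n + 1)
        for j in range(m + 1):           # use part k exactly j times
            shift = j * k
            if shift > n:
                break
            for i in range(n + 1 - shift):
                new[i + shift] += coeff[i]
        coeff = new
    return coeff[n]
```

For example, n = 5 has 7 unrestricted partitions but only 3 into distinct parts (5, 4+1, 3+2), matching the Bose/Fermi special cases of the general statistics.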
Statistical mechanics for a system with imperfections: pt. 1
International Nuclear Information System (INIS)
Choh, S.T.; Kahng, W.H.; Um, C.I.
1982-01-01
Statistical mechanics is extended to treat a system where parts of the Hamiltonian are randomly varying. As the starting point of the theory, the statistical correlation among energy levels is neglected, allowing use of the central limit theorem of probability theory. (Author)
Kelcey, Benjamin; Dong, Nianbo; Spybrook, Jessaca; Cox, Kyle
2017-01-01
Designs that facilitate inferences concerning both the total and indirect effects of a treatment potentially offer a more holistic description of interventions because they can complement "what works" questions with the comprehensive study of the causal connections implied by substantive theories. Mapping the sensitivity of designs to…
Quantum mechanics and field theory with fractional spin and statistics
International Nuclear Information System (INIS)
Forte, S.
1992-01-01
Planar systems admit quantum states that are neither bosons nor fermions, i.e., whose angular momentum is neither integer nor half-integer. After a discussion of some examples of familiar models in which fractional spin may arise, the relevant (nonrelativistic) quantum mechanics is developed from first principles. The appropriate generalization of statistics is also discussed. Some physical effects of fractional spin and statistics are worked out explicitly. The group theory underlying relativistic models with fractional spin and statistics is then introduced and applied to relativistic particle mechanics and field theory. Field-theoretical models in 2+1 dimensions are presented which admit solitons that carry fractional statistics, and are discussed in a semiclassical approach, in the functional integral approach, and in the canonical approach. Finally, fundamental field theories whose Fock states carry fractional spin and statistics are discussed
Statistical-mechanical entropy by the thin-layer method
International Nuclear Information System (INIS)
Feng, He; Kim, Sung Won
2003-01-01
G. 't Hooft first studied the statistical-mechanical entropy of a scalar field in a Schwarzschild black hole background by the brick-wall method and hinted that the statistical-mechanical entropy is the statistical origin of the Bekenstein-Hawking entropy of the black hole. However, according to our viewpoint, the statistical-mechanical entropy is only a quantum correction to the Bekenstein-Hawking entropy of the black hole. The brick-wall method, based on thermal equilibrium at a large scale, cannot be applied to cases out of equilibrium such as a nonstationary black hole. The statistical-mechanical entropy of a scalar field in a nonstationary black hole background is calculated by the thin-layer method. The condition of local equilibrium near the horizon of the black hole is used as a working postulate and is maintained for a black hole which evaporates slowly enough and whose mass is far greater than the Planck mass. The statistical-mechanical entropy is also proportional to the area of the black hole horizon. The difference from the stationary black hole is that the result relies on a time-dependent cutoff.
Fracture mechanics and statistical mechanics of reinforced elastomeric blends
Heinrich, Gert; Kaliske, Michael; Klüppel, Manfred; Schneider, Konrad; Vilgis, Thomas
2013-01-01
Elastomers are found in many applications ranging from technology to daily life, for example in tires, drive systems, seals and print rollers. Dynamical operation conditions put extremely high demands on the performance and stability of these materials, whose elastic and flow properties can be easily adjusted by simple manipulation of their viscoelastic properties. However, the required service life often suffers from material damage as a result of wear processes such as abrasion and wear fatigue, mostly caused by crack formation and propagation. This book covers interdisciplinary research between physics, physical chemistry, material sciences and engineering of elastomers within the range from nanometres to millimetres and connects these aspects with the constitutive material properties. The different chapters describe reliable lifetime and durability predictions based on new fracture mechanical testing concepts and advanced material-theoretical methods which are finally implemented...
College Education and Social Trust: An Evidence-Based Study on the Causal Mechanisms
Huang, Jian; van den Brink, Henriette Maassen; Groot, Wim
2011-01-01
This paper examines the influence of college education on social trust at the individual level. Based on the literature of trust and social trust, we hypothesize that life experience/development since adulthood and perceptions of cultural/social structures are two primary channels in the causal linkage between college education and social trust.…
Brief introduction to Lie-admissible formulations in statistical mechanics
International Nuclear Information System (INIS)
Fronteau, J.
1981-01-01
The present article is a summary of the essential ideas and results published in previous articles, the aim here being to describe the situation schematically for the benefit of non-specialists. The non-truncated Liouville theorem and equation, the natural introduction of Lie-admissible formulations into statistical mechanics, the notion of a statistical quasi-particle, and the transition towards the notion of a fine thermodynamics are discussed.
The road to Maxwell's demon conceptual foundations of statistical mechanics
Hemmo, Meir
2012-01-01
Time asymmetric phenomena are successfully predicted by statistical mechanics. Yet the foundations of this theory are surprisingly shaky. Its explanation for the ease of mixing milk with coffee is incomplete, and even implies that un-mixing them should be just as easy. In this book the authors develop a new conceptual foundation for statistical mechanics that addresses this difficulty. Explaining the notions of macrostates, probability, measurement, memory, and the arrow of time in statistical mechanics, they reach the startling conclusion that Maxwell's Demon, the famous perpetuum mobile, is consistent with the fundamental physical laws. Mathematical treatments are avoided where possible, and instead the authors use novel diagrams to illustrate the text. This is a fascinating book for graduate students and researchers interested in the foundations and philosophy of physics.
Statistical mechanics of low-density parity-check codes
Energy Technology Data Exchange (ETDEWEB)
Kabashima, Yoshiyuki [Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama 2268502 (Japan); Saad, David [Neural Computing Research Group, Aston University, Birmingham B4 7ET (United Kingdom)
2004-02-13
We review recent theoretical progress on the statistical mechanics of error correcting codes, focusing on low-density parity-check (LDPC) codes in general, and on Gallager and MacKay-Neal codes in particular. By exploiting the relation between LDPC codes and Ising spin systems with multi-spin interactions, one can carry out a statistical mechanics based analysis that determines the practical and theoretical limitations of various code constructions, corresponding to dynamical and thermodynamical transitions, respectively, as well as the behaviour of error-exponents averaged over the corresponding code ensemble as a function of channel noise. We also contrast the results obtained using methods of statistical mechanics with those derived in the information theory literature, and show how these methods can be generalized to include other channel types and related communication problems. (topical review)
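The decoding behaviour analysed in the review can be illustrated, far more modestly, with the simplest decoder in the Gallager family: bit flipping against a toy parity-check matrix. The matrix below is purely illustrative (a tiny dense code, not a sparse LDPC ensemble from the review); it sketches how syndrome checks drive decoding. Every column of H touches two checks, so a single flipped bit is the unique bit with two unsatisfied checks:

```python
import numpy as np

# Toy parity-check matrix, n = 6 bits, 4 checks (illustrative, not a real LDPC design).
H = np.array([[1, 1, 1, 0, 0, 0],
              [1, 0, 0, 1, 1, 0],
              [0, 1, 0, 1, 0, 1],
              [0, 0, 1, 0, 1, 1]], dtype=int)

def syndrome(word):
    """Vector of unsatisfied parity checks (all zero for a codeword)."""
    return H @ word % 2

def bit_flip_decode(word, max_iter=10):
    """Gallager-style bit flipping: repeatedly flip the bit that participates
    in the largest number of unsatisfied checks until the syndrome vanishes."""
    w = word.copy()
    for _ in range(max_iter):
        s = syndrome(w)
        if not s.any():
            break
        votes = H.T @ s          # unsatisfied-check count per bit
        w[np.argmax(votes)] ^= 1
    return w

codeword = np.zeros(6, dtype=int)            # the all-zero word is always a codeword
received = codeword.copy()
received[3] ^= 1                             # one bit flipped by the channel
decoded = bit_flip_decode(received)          # recovers the all-zero codeword
```

In the statistical-mechanics picture of the review, each parity check corresponds to a multi-spin interaction and decoding to relaxation towards the ground state of the corresponding Ising system.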
mediation: R Package for Causal Mediation Analysis
Directory of Open Access Journals (Sweden)
Dustin Tingley
2014-09-01
In this paper, we describe the R package mediation for conducting causal mediation analysis in applied empirical research. In many scientific disciplines, the goal of researchers is not only to estimate causal effects of a treatment but also to understand the process by which the treatment causally affects the outcome. Causal mediation analysis is frequently used to assess potential causal mechanisms. The mediation package implements a comprehensive suite of statistical tools for conducting such an analysis. The package is organized into two distinct approaches. Using the model-based approach, researchers can estimate causal mediation effects and conduct sensitivity analysis under the standard research design. Furthermore, the design-based approach provides several analysis tools that are applicable under different experimental designs. This approach requires weaker assumptions than the model-based approach. We also implement a statistical method for dealing with multiple (causally dependent) mediators, which are often encountered in practice. Finally, the package also offers a methodology for assessing causal mediation in the presence of treatment noncompliance, a common problem in randomized trials.
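The decomposition the model-based approach estimates can be sketched numerically. In the special case of linear models with no treatment-mediator interaction, the average causal mediation effect (ACME) reduces to the product of the treatment-to-mediator and mediator-to-outcome coefficients. The sketch below uses plain least squares on simulated data (all coefficients and variable names are hypothetical; this is not the mediation package's API):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
t = rng.integers(0, 2, n)                       # randomized binary treatment
m = 0.5 * t + rng.normal(size=n)                # mediator model: a = 0.5
y = 0.8 * m + 0.3 * t + rng.normal(size=n)      # outcome model: b = 0.8, direct = 0.3

# Mediator regression m ~ 1 + t  (slope a is the t -> m path).
X_m = np.column_stack([np.ones(n), t])
a = np.linalg.lstsq(X_m, m, rcond=None)[0][1]

# Outcome regression y ~ 1 + t + m  (slope on m is the m -> y path b).
X_y = np.column_stack([np.ones(n), t, m])
coef_y = np.linalg.lstsq(X_y, y, rcond=None)[0]
direct, b = coef_y[1], coef_y[2]

acme = a * b             # average causal mediation effect (indirect effect)
total = direct + acme    # total effect = direct + indirect
```

With the simulated coefficients, acme should come out near 0.5 x 0.8 = 0.4 and the direct effect near 0.3; the package's simulation-based estimator generalizes this decomposition to nonlinear models.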
Limiting processes in non-equilibrium classical statistical mechanics
International Nuclear Information System (INIS)
Jancel, R.
1983-01-01
After recalling the basic principles of statistical mechanics, the results of ergodic theory, the passage to the thermodynamic limit and its link with transport theory near equilibrium are analyzed. The fundamental problems posed by the description of non-equilibrium macroscopic systems are investigated and the kinetic methods are presented. The problems of non-equilibrium statistical mechanics are analyzed: irreversibility and coarse-graining, macroscopic variables and kinetic description, autonomous reduced descriptions, limit processes, the BBGKY hierarchy, and limit theorems [fr]
Generalized statistical mechanics approaches to earthquakes and tectonics
Papadakis, Giorgos; Michas, Georgios
2016-01-01
Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract wide scientific interest and are incorporated in the probabilistic forecasting of seismicity on local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale for the macroscopic properties frequently observed. The scale-invariant properties, the (multi)fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes. PMID:28119548
Pearce, Marcus T
2018-05-11
Music perception depends on internal psychological models derived through exposure to a musical culture. It is hypothesized that this musical enculturation depends on two cognitive processes: (1) statistical learning, in which listeners acquire internal cognitive models of statistical regularities present in the music to which they are exposed; and (2) probabilistic prediction based on these learned models that enables listeners to organize and process their mental representations of music. To corroborate these hypotheses, I review research that uses a computational model of probabilistic prediction based on statistical learning (the information dynamics of music (IDyOM) model) to simulate data from empirical studies of human listeners. The results show that a broad range of psychological processes involved in music perception (expectation, emotion, memory, similarity, segmentation, and meter) can be understood in terms of a single, underlying process of probabilistic prediction using learned statistical models. Furthermore, IDyOM simulations of listeners from different musical cultures demonstrate that statistical learning can plausibly predict causal effects of differential cultural exposure to musical styles, providing a quantitative model of cultural distance. Understanding the neural basis of musical enculturation will benefit from close coordination between empirical neuroimaging and computational modeling of underlying mechanisms, as outlined here. © 2018 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.
College Education and Social Trust: An Evidence-Based Study on the Causal Mechanisms
Huang, Jian; van den Brink, Henriëtte Maassen; Groot, Wim
2010-01-01
This paper examines the influence of college education on social trust at the individual level. Based on the literature of trust and social trust, we hypothesize that life experience/development since adulthood and perceptions of cultural/social structures are two primary channels in the causal linkage between college education and social trust. In the first part of the empirical study econometric techniques are employed to tackle the omitted-variable problem and substantial evidence is found...
Statistical physics of black holes as quantum-mechanical systems
Giddings, Steven B.
2013-01-01
Some basic features of black-hole statistical mechanics are investigated, assuming that black holes respect the principles of quantum mechanics. Care is needed in defining an entropy S_bh corresponding to the number of microstates of a black hole, given that the black hole interacts with its surroundings. An open question is then the relationship between this entropy and the Bekenstein-Hawking entropy S_BH. For a wide class of models with interactions needed to ensure unitary quantum evolutio...
A statistical mechanics approach to mixing in stratified fluids
Venaille, Antoine; Gostiaux, Louis; Sommeria, Joël
2016-01-01
Accepted for the Journal of Fluid Mechanics; Predicting how much mixing occurs when a given amount of energy is injected into a Boussinesq fluid is a longstanding problem in stratified turbulence. The huge number of degrees of freedom involved in these processes renders extremely difficult a deterministic approach to the problem. Here we present a statistical mechanics approach yielding a prediction for a cumulative, global mixing efficiency as a function of a global Richardson number and th...
Molecular dynamics and Monte Carlo calculations in statistical mechanics
International Nuclear Information System (INIS)
Wood, W.W.; Erpenbeck, J.J.
1976-01-01
Monte Carlo and molecular dynamics calculations on statistical mechanical systems are reviewed, giving some of the more significant recent developments. It is noted that the term molecular dynamics refers to the time-averaging technique for hard-core and square-well interactions and for continuous force-law interactions. Ergodic questions, methodology, quantum mechanical, Lorentz, and one-dimensional, hard-core, and square and triangular-well systems, short-range soft potentials, and other systems are included. 268 references
SRB states and nonequilibrium statistical mechanics close to equilibrium
Gallavotti, Giovanni; Ruelle, David
1996-01-01
Nonequilibrium statistical mechanics close to equilibrium is studied using SRB states and a formula for their derivatives with respect to parameters. We write general expressions for the thermodynamic fluxes (or currents) and the transport coefficients, generalizing previous results. In this framework we give a general proof of the Onsager reciprocity relations.
Two-dimensional models in statistical mechanics and field theory
International Nuclear Information System (INIS)
Koberle, R.
1980-01-01
Several features of two-dimensional models in statistical mechanics and field theory, such as lattice quantum chromodynamics, Z(N), Gross-Neveu and CP^{N-1} models, are discussed. The problems of confinement and dynamical mass generation are also analyzed. (L.C.) [pt]
Mathematics of statistical mechanics and the chaos theory
International Nuclear Information System (INIS)
Llave, R. de la; Haro, A.
2000-01-01
Statistical mechanics requires a language that unifies probabilistic and deterministic description of physical systems. We describe briefly some of the mathematical ideas needed for this unification. These ideas have also proved important in the study of chaotic systems. (Author) 17 refs
Braid group, knot theory and statistical mechanics II
Yang Chen Ning
1994-01-01
The present volume is an updated version of the book edited by C N Yang and M L Ge on the topics of braid groups and knot theory, which are related to statistical mechanics. This book is based on the 1989 volume but has new material included and new contributors.
Statistical mechanics and the evolution of polygenic quantitative traits
Barton, N.H.; De Vladar, H.P.
The evolution of quantitative characters depends on the frequencies of the alleles involved, yet these frequencies cannot usually be measured. Previous groups have proposed an approximation to the dynamics of quantitative traits, based on an analogy with statistical mechanics. We present a modified
Grassmann methods in lattice field theory and statistical mechanics
International Nuclear Information System (INIS)
Bilgici, E.; Gattringer, C.; Huber, P.
2006-01-01
In two dimensions, models of loops can be represented as simple Grassmann integrals. In our work we explore the generalization of these techniques to lattice field theories and statistical mechanics systems in three and four dimensions. We discuss possible strategies and applications for representations of loop and surface models as Grassmann integrals. (author)
Geometry and topology in hamiltonian dynamics and statistical mechanics
Pettini, Marco
2007-01-01
Explores the foundations of hamiltonian dynamical systems and statistical mechanics, in particular phase transitions, from the point of view of geometry and topology. This book provides an overview of the research in the area. Using geometrical thinking to solve fundamental problems in these areas could be highly productive
Statistical mechanics in the context of special relativity.
Kaniadakis, G
2002-11-01
In Ref. [Physica A 296, 405 (2001)], starting from the one-parameter deformation of the exponential function exp_κ(x) = [√(1 + κ²x²) + κx]^(1/κ), a statistical mechanics has been constructed which reduces to ordinary Boltzmann-Gibbs statistical mechanics as the deformation parameter κ approaches zero. The distribution f = exp_κ(−βE + βμ) obtained within this statistical mechanics shows a power-law tail and depends on the unspecified parameter β, which contains all the information about the temperature of the system. On the other hand, the entropic form S_κ = ∫ d³p (c_κ f^(1+κ) + c_{−κ} f^(1−κ)), which after maximization produces the distribution f and reduces to the standard Boltzmann-Shannon entropy S_0 as κ → 0, contains the coefficient c_κ whose expression involves, besides the Boltzmann constant, another unspecified parameter α. In the present effort we show that S_κ is the unique existing entropy obtained by a continuous deformation of S_0 that preserves unaltered its fundamental properties of concavity, additivity, and extensivity. These properties of S_κ permit the unequivocal determination of the above-mentioned parameters β and α. Subsequently, we explain the origin of the deformation mechanism introduced by κ and show that this deformation emerges naturally within Einstein special relativity. Furthermore, we extend the theory in order to treat statistical systems in a time-dependent and relativistic context. We then show that it is possible to determine, in a self-consistent scheme within special relativity, the value of the free parameter κ, which turns out to depend on the light speed c and reduces to zero as c → ∞, recovering in this way ordinary statistical mechanics and thermodynamics. The statistical mechanics presented here does not contain free parameters, and preserves unaltered the mathematical and epistemological structure of
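As a numerical aside, the κ-deformed exponential defined in the abstract is easy to check: it reduces to the ordinary exponential as κ → 0, and for large negative arguments it decays as a power law rather than exponentially, which is the origin of the power-law tails of the distribution f. A minimal sketch:

```python
import math

def exp_kappa(x, kappa):
    """kappa-deformed exponential [sqrt(1 + k^2 x^2) + k x]^(1/k);
    the ordinary exponential is recovered at kappa = 0."""
    if kappa == 0:
        return math.exp(x)
    return (math.sqrt(1 + kappa**2 * x**2) + kappa * x) ** (1 / kappa)

# kappa -> 0 recovers exp(x); for kappa > 0 the tail exp_kappa(-x, kappa)
# decays like a power law, hence much more slowly than exp(-x).
```

For example, exp_kappa(-50, 0.5) is about 4e-4, many orders of magnitude larger than exp(-50) ≈ 2e-22, illustrating the heavy tail.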
Introductive remarks on causal inference
Directory of Open Access Journals (Sweden)
Silvana A. Romio
2013-05-01
One of the more challenging issues in epidemiological research is being able to provide an unbiased estimate of the causal exposure-disease effect, in order to assess the possible etiological mechanisms and the implications for public health. A major source of bias is confounding, which can spuriously create or mask the causal relationship. In the last ten years, methodological research has been developed to better define the concept of causation in epidemiology, and some important achievements have resulted in new statistical models. In this review, we aim to show how a technique well known to statisticians, i.e. standardization, can be seen as a method to estimate causal effects, equivalent under certain conditions to the inverse probability of treatment weighting (IPTW) procedure.
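The equivalence mentioned above, standardization versus inverse probability of treatment weighting, can be illustrated on simulated data with a single binary confounder. This is a sketch with known true propensities (in practice they would be estimated), and all coefficients are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
z = rng.integers(0, 2, n)                      # binary confounder
p_t = np.where(z == 1, 0.7, 0.3)               # confounded treatment assignment
t = rng.random(n) < p_t                        # treatment indicator
y = 2.0 * t + 1.5 * z + rng.normal(size=n)     # true causal effect of t is 2.0

# Naive contrast is biased because z affects both t and y.
naive = y[t].mean() - y[~t].mean()

# Standardization (g-formula): average stratum-specific contrasts over P(z).
effect_std = sum(
    (y[t & (z == k)].mean() - y[~t & (z == k)].mean()) * np.mean(z == k)
    for k in (0, 1)
)

# IPTW: weight each unit by 1 / P(observed treatment | z).
w = np.where(t, 1 / p_t, 1 / (1 - p_t))
effect_iptw = np.average(y, weights=w * t) - np.average(y, weights=w * ~t)
```

Both adjusted estimators recover an effect close to the true value 2.0 and agree with each other, while the naive contrast is inflated by confounding.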
Fundamental link between system theory and statistical mechanics
International Nuclear Information System (INIS)
Atmanspacher, H.; Scheingraber, H.
1987-01-01
A fundamental link between system theory and statistical mechanics has been found to be established by the Kolmogorov entropy K. By this quantity the temporal evolution of dynamical systems can be classified into regular, chaotic, and stochastic processes. Since K represents a measure for the internal information creation rate of dynamical systems, it provides an approach to irreversibility. The formal relationship to statistical mechanics is derived by means of an operator formalism originally introduced by Prigogine. For a Liouville operator L and an information operator M̃ acting on a distribution in phase space, it is shown that i[L, M̃] = KI (I = identity operator). As a first consequence of this equivalence, a relation is obtained between the chaotic correlation time of a system and Prigogine's concept of a finite duration of presence. Finally, the existence of chaos in quantum systems is discussed with respect to the existence of a quantum mechanical time operator
Statistical mechanics of two-dimensional and geophysical flows
International Nuclear Information System (INIS)
Bouchet, Freddy; Venaille, Antoine
2012-01-01
The theoretical study of the self-organization of two-dimensional and geophysical turbulent flows is addressed based on statistical mechanics methods. This review is a self-contained presentation of classical and recent works on this subject; from the statistical mechanics basis of the theory up to applications to Jupiter's troposphere and ocean vortices and jets. Emphasis has been placed on examples with available analytical treatment in order to favor better understanding of the physics and dynamics. After a brief presentation of the 2D Euler and quasi-geostrophic equations, the specificity of two-dimensional and geophysical turbulence is emphasized. The equilibrium microcanonical measure is built from the Liouville theorem. Important statistical mechanics concepts (large deviations and mean field approach) and thermodynamic concepts (ensemble inequivalence and negative heat capacity) are briefly explained and described. On this theoretical basis, we predict the output of the long time evolution of complex turbulent flows as statistical equilibria. This is applied to make quantitative models of two-dimensional turbulence, the Great Red Spot and other Jovian vortices, ocean jets like the Gulf Stream, and ocean vortices. A detailed comparison between these statistical equilibria and real flow observations is provided. We also present recent results for non-equilibrium situations, for the studies of either the relaxation towards equilibrium or non-equilibrium steady states. In this last case, forces and dissipation are in a statistical balance; fluxes of conserved quantities characterize the system and microcanonical or other equilibrium measures no longer describe the system.
Gallo, Eduardo F; Posner, Jonathan
2016-01-01
Attention-deficit hyperactivity disorder (ADHD) is a neurodevelopmental disorder characterised by developmentally inappropriate levels of inattention and hyperactivity or impulsivity. The heterogeneity of its clinical manifestations and the differential responses to treatment and varied prognoses have long suggested myriad underlying causes. Over the past decade, clinical and basic research efforts have uncovered many behavioural and neurobiological alterations associated with ADHD, from genes to higher order neural networks. Here, we review the neurobiology of ADHD by focusing on neural circuits implicated in the disorder and discuss how abnormalities in circuitry relate to symptom presentation and treatment. We summarise the literature on genetic variants that are potentially related to the development of ADHD, and how these, in turn, might affect circuit function and relevant behaviours. Whether these underlying neurobiological factors are causally related to symptom presentation remains unresolved. Therefore, we assess efforts aimed at disentangling issues of causality, and showcase the shifting research landscape towards endophenotype refinement in clinical and preclinical settings. Furthermore, we review approaches being developed to understand the neurobiological underpinnings of this complex disorder including the use of animal models, neuromodulation, and pharmaco-imaging studies. PMID:27183902
Statistical fracture mechanics approach to the strength of brittle rock
International Nuclear Information System (INIS)
Ratigan, J.L.
1981-06-01
Statistical fracture mechanics concepts used in the past for rock are critically reviewed, and modifications are proposed which are warranted by (1) increased understanding of fracture provided by modern fracture mechanics and (2) laboratory test data, both from the literature and from this research. Over 600 direct and indirect tension tests have been performed on three different rock types: Stripa Granite, Sierra White Granite and Carrara Marble. In several instances, assumptions which are common in the literature were found to be invalid. A three-parameter statistical fracture mechanics model with the Mode I critical strain energy release rate as the variate is presented. Methodologies for evaluating the parameters in this model, as well as the more commonly employed two-parameter models, are discussed. The experimental results and analysis of this research indicate that surficially distributed flaws, rather than volumetrically distributed flaws, are responsible for rupture in many testing situations. For several of the rock types tested, anisotropy (both in apparent tensile strength and size effect) precludes the use of contemporary statistical fracture mechanics models.
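The flavour of such statistical fracture models can be conveyed with a weakest-link sketch. The form below is a generic three-parameter Weibull law (threshold, scale, modulus, all values arbitrary), with applied stress standing in for the paper's strain-energy-release-rate variate; it is an illustration of the surface-flaw size effect, not the author's calibrated model:

```python
import math

def failure_prob(stress, area, s_u, s_0, m):
    """Weakest-link failure probability for a specimen of flawed surface `area`
    under a three-parameter Weibull law: threshold s_u (no failure below it),
    scale s_0, and Weibull modulus m. All parameters are illustrative."""
    if stress <= s_u:
        return 0.0  # below the threshold no flaw can become critical
    return 1.0 - math.exp(-area * ((stress - s_u) / s_0) ** m)

# Size effect: at the same stress, a larger flawed surface fails more often,
# because it is more likely to contain a critical flaw.
p_small = failure_prob(50.0, area=1.0, s_u=10.0, s_0=100.0, m=5.0)
p_large = failure_prob(50.0, area=2.0, s_u=10.0, s_0=100.0, m=5.0)
```

Whether the `area` factor ranges over surface or volume is exactly the surficial-versus-volumetric distinction the abstract raises.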
On the statistical-mechanical meaning of the Bousso bound
International Nuclear Information System (INIS)
Pesci, Alessandro
2008-01-01
The Bousso entropy bound, in its generalized form, is investigated for the case of perfect fluids at local thermodynamic equilibrium, and evidence is found that the bound is satisfied if and only if a certain local thermodynamic property holds, which emerges when the attempt is made to apply the bound to thin layers of matter. This property consists of the existence of an ultimate lower limit l* to the thickness of the slices for which a statistical-mechanical description is viable, with l* depending on the thermodynamic variables which define the state of the system locally. This limiting scale, found to be in general much larger than the Planck scale (so that no Planck-scale physics need be invoked to justify it), appears unrelated to gravity, and this suggests that the generalized entropy bound is likely to be rooted in conventional flat-spacetime statistical mechanics, although the maximum admitted entropy is in fact also determined by gravity. Some examples of ideal fluids are considered in order to identify the mechanisms which can set a lower limit to the statistical-mechanical description, and these systems are found to respect the lower limiting scale l*. The photon gas, in particular, appears to saturate this limiting scale, and the consequence is drawn that for systems consisting of a single slice of a photon gas with thickness l*, the generalized Bousso bound is saturated. It is argued that this seems to open the way to a peculiar understanding of black hole entropy: if an entropy can meaningfully (i.e. with a second law) be assigned to a black hole, the value A/4 for it (where A is the area of the black hole) is required simply by (conventional) statistical mechanics coupled to general relativity
A scan statistic to extract causal gene clusters from case-control genome-wide rare CNV data
Directory of Open Access Journals (Sweden)
Scherer Stephen W
2011-05-01
Background: Several statistical tests have been developed for analyzing genome-wide association data by incorporating gene pathway information in terms of gene sets. Using these methods, hundreds of gene sets are typically tested, and the tested gene sets often overlap. This overlap greatly increases the probability of generating false positives, and the results obtained are difficult to interpret, particularly when many gene sets show statistical significance. Results: We propose a flexible statistical framework to circumvent these problems. Inspired by spatial scan statistics for detecting clustering of disease occurrence in the field of epidemiology, we developed a scan statistic to extract disease-associated gene clusters from a whole gene pathway. Extracting one or a few significant gene clusters from a global pathway limits the overall false-positive probability, which results in increased statistical power and facilitates the interpretation of test results. In the present study, we applied our method to genome-wide association data for rare copy-number variations, which have been strongly implicated in common diseases. Application of our method to a simulated dataset demonstrated the high accuracy of this method in detecting disease-associated gene clusters in a whole gene pathway. Conclusions: The scan statistic approach proposed here shows a high level of accuracy in detecting gene clusters in a whole gene pathway. This study has provided a sound statistical framework for analyzing genome-wide rare CNV data by incorporating topological information on the gene pathway.
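The core idea of scanning a whole pathway for one maximal cluster, rather than testing many overlapping gene sets, can be sketched as a maximum-sum window scan over ordered per-gene association scores. This is a simplification of the authors' statistic (scores and gene ordering are hypothetical; in practice significance would come from permuting case-control labels):

```python
def scan_max_window(scores, min_len=1, max_len=None):
    """Scan all contiguous windows of ordered per-gene association scores and
    return (best_sum, start, end) for the window with the largest total score,
    i.e. the candidate disease-associated gene cluster. Uses prefix sums so
    each window total is an O(1) lookup."""
    n = len(scores)
    max_len = max_len or n
    prefix = [0.0]
    for s in scores:
        prefix.append(prefix[-1] + s)
    best = (float("-inf"), 0, 0)
    for i in range(n):
        for j in range(i + min_len, min(n, i + max_len) + 1):
            total = prefix[j] - prefix[i]
            if total > best[0]:
                best = (total, i, j)
    return best

# Hypothetical per-gene scores (positive = enriched in cases):
scores = [-0.2, 0.1, 1.5, 2.0, 1.2, -0.5, -1.0, 0.3]
best_sum, start, end = scan_max_window(scores)   # cluster is genes start..end-1
```

Reporting only this single maximal cluster is what limits the overall false-positive probability relative to testing hundreds of overlapping gene sets.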
A κ-generalized statistical mechanics approach to income analysis
International Nuclear Information System (INIS)
Clementi, F; Gallegati, M; Kaniadakis, G
2009-01-01
This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters proves very powerful.
Directory of Open Access Journals (Sweden)
Grauls D.
2006-12-01
Abnormal fluid pressure regimes are commonly encountered at depth in most sedimentary basins. Relationships between effective vertical stress and porosity have been applied since 1970 to the Gulf Coast area to assess the magnitude of overpressures. Positive results have been obtained from seismic and basin-modeling techniques in sand-shale, vertical-stress-dominated Tertiary basins, whenever compaction-disequilibrium conditions apply. However, overpressures resulting from other and/or additional causes (tectonic stress, hydrocarbon generation, thermal stress, fault-related transfer, hydrofracturing...) cannot be quantitatively assessed using this approach. A hydromechanical approach is therefore proposed in addition to conventional methods. At any depth, the upper-bound fluid pressure is controlled by in situ conditions related to hydrofracturing or fault reactivation. Fluid-driven fracturing implies an episodically open system, under a close-to-zero minimum effective stress regime. Sound knowledge of present-day tectonic stress regimes allows a direct estimation of minimum stress evolution. A quantitative fluid pressure assessment at depth is therefore possible, as in undrained and/or compartmented geological systems pressure regimes, whatever their origin, tend to rapidly reach a value close to the minimum principal stress. Overpressure assessment will therefore be improved, as this methodology can be applied to various geological settings and situations where present-day overpressures originated from other causal mechanisms, very often combined. However, pressure trends in transition zones are more difficult to assess correctly. Additional research on cap rocks and fault seals is therefore required to improve their predictability. In addition to overpressure assessment, the minimum principal stress concept allows a better understanding of petroleum systems, as fault-related hydrocarbon dynamic transfers, hydrofractured domains and cap
Statistical Mechanics Analysis of ATP Binding to a Multisubunit Enzyme
International Nuclear Information System (INIS)
Zhang Yun-Xin
2014-01-01
Due to inter-subunit communication, multisubunit enzymes usually hydrolyze ATP in a concerted fashion. However, the principle of this process has so far remained poorly understood. In this study, a simple model is presented from the viewpoint of statistical mechanics. In this model, we assume that the binding of ATP changes the potential of the corresponding enzyme subunit, and that the degree of this change depends on the state of its adjacent subunits. The probability of the enzyme being in a given state satisfies the Boltzmann distribution. Although it looks quite simple, this model fits the recent experimental data for the chaperonin TRiC/CCT well. From this model, the dominant state of TRiC/CCT can be obtained. This study provides a new way to understand biophysical processes by statistical mechanics analysis. (interdisciplinary physics and related areas of science and technology)
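The concerted-binding picture above can be sketched as a toy Boltzmann-weighted occupancy model: a ring of subunits whose binding energy is shifted by the state of its neighbors. The ring size, binding energy and coupling constant below are illustrative assumptions, not the fitted TRiC/CCT parameters.

```python
from itertools import product
from math import exp

def state_probabilities(n=4, eps=-1.0, J=0.5, kT=1.0):
    """Boltzmann probabilities for a ring of n subunits, each empty (0)
    or ATP-bound (1).  eps is the binding energy per occupied subunit;
    J couples adjacent subunits (a crude stand-in for inter-subunit
    communication).  All parameter values are illustrative."""
    states = list(product((0, 1), repeat=n))

    def energy(s):
        bind = eps * sum(s)
        pair = J * sum(s[i] * s[(i + 1) % n] for i in range(n))
        return bind + pair

    weights = [exp(-energy(s) / kT) for s in states]
    Z = sum(weights)                      # partition function
    return {s: w / Z for s, w in zip(states, weights)}

probs = state_probabilities()
```

The dominant state is then simply the dictionary entry with the largest probability, mirroring how the abstract extracts the dominant state of the enzyme.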
Algebraic methods in statistical mechanics and quantum field theory
Emch, Dr Gérard G
2009-01-01
This systematic algebraic approach concerns problems involving a large number of degrees of freedom. It extends the traditional formalism of quantum mechanics, and it eliminates conceptual and mathematical difficulties common to the development of statistical mechanics and quantum field theory. Further, the approach is linked to research in applied and pure mathematics, offering a reflection of the interplay between formulation of physical motivations and self-contained descriptions of the mathematical methods.The four-part treatment begins with a survey of algebraic approaches to certain phys
Generalized Statistical Mechanics at the Onset of Chaos
Directory of Open Access Journals (Sweden)
Alberto Robledo
2013-11-01
Full Text Available Transitions to chaos in archetypal low-dimensional nonlinear maps offer real and precise model systems in which to assess proposed generalizations of statistical mechanics. The known association of chaotic dynamics with the structure of Boltzmann–Gibbs (BG) statistical mechanics has suggested the potential verification of these generalizations at the onset of chaos, when the only Lyapunov exponent vanishes and ergodic and mixing properties cease to hold. There are three well-known routes to chaos in these deterministic dissipative systems, period-doubling, quasi-periodicity and intermittency, which provide the setting in which to explore the limit of validity of the standard BG structure. It has been shown that there is a rich and intricate behavior for both the dynamics within and towards the attractors at the onset of chaos and that these two kinds of properties are linked via generalized statistical-mechanical expressions. Amongst the topics presented are: (i) permanently growing sensitivity fluctuations and their infinite family of generalized Pesin identities; (ii) the emergence of statistical-mechanical structures in the dynamics along the routes to chaos; (iii) dynamical hierarchies with modular organization; and (iv) limit distributions of sums of deterministic variables. The occurrence of generalized entropy properties in condensed-matter physical systems is illustrated by considering critical fluctuations, localization transition and glass formation. We complete our presentation with the description of the manifestations of the dynamics at the transitions to chaos in various kinds of complex systems, such as frequency and size rank distributions and complex network images of time series. We discuss the results.
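The period-doubling route mentioned above can be probed numerically by iterating the logistic map at (an approximation of) the accumulation point of period doublings, where the only Lyapunov exponent vanishes. The constant below is a truncated approximation, used purely for illustration.

```python
def logistic_trajectory(x0, r, n):
    """Iterate x -> r*x*(1-x), returning the full orbit of length n+1."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Truncated approximation to the period-doubling accumulation point
# (onset of chaos), where orbits are neither periodic nor fully chaotic:
R_INF = 3.5699456718
orbit = logistic_trajectory(0.5, R_INF, 2000)
```

At this parameter value nearby orbits separate subexponentially (in power-law fashion with log-periodic fluctuations) rather than exponentially, which is exactly the regime where the generalized, non-BG statistical-mechanical expressions discussed in the abstract are claimed to apply.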
Realistic thermodynamic and statistical-mechanical measures for neural synchronization.
Kim, Sang-Yoon; Lim, Woochang
2014-04-15
Synchronized brain rhythms, associated with diverse cognitive functions, have been observed in electrical recordings of brain activity. Neural synchronization may be well described by using the population-averaged global potential VG in computational neuroscience. The time-averaged fluctuation of VG plays the role of a "thermodynamic" order parameter O used for describing the synchrony-asynchrony transition in neural systems. Population spike synchronization may be well visualized in the raster plot of neural spikes. The degree of neural synchronization seen in the raster plot is well measured in terms of a "statistical-mechanical" spike-based measure Ms introduced by considering the occupation and the pacing patterns of spikes. The global potential VG is also used to give a reference global cycle for the calculation of Ms. Hence, VG becomes an important collective quantity because it is associated with calculation of both O and Ms. However, it is practically difficult to directly get VG in real experiments. To overcome this difficulty, instead of VG, we employ the instantaneous population spike rate (IPSR) which can be obtained in experiments, and develop realistic thermodynamic and statistical-mechanical measures, based on IPSR, to make practical characterization of the neural synchronization in both computational and experimental neuroscience. Particularly, more accurate characterization of weak sparse spike synchronization can be achieved in terms of realistic statistical-mechanical IPSR-based measure, in comparison with the conventional measure based on VG. Copyright © 2014. Published by Elsevier B.V.
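An IPSR of the kind described above can be built by smoothing the population raster with a kernel; the Gaussian kernel and its width below are common choices but are assumptions here, not necessarily the authors' exact estimator.

```python
import numpy as np

def ipsr(spike_trains, t_grid, sigma=1.0):
    """Instantaneous population spike rate: population average of
    normalized Gaussian kernels (width sigma) centred on every spike.
    spike_trains is a list of per-neuron spike-time lists."""
    rate = np.zeros_like(t_grid, dtype=float)
    for spikes in spike_trains:
        for t_s in spikes:
            rate += np.exp(-0.5 * ((t_grid - t_s) / sigma) ** 2)
    norm = len(spike_trains) * sigma * np.sqrt(2.0 * np.pi)
    return rate / norm

t = np.arange(0.0, 20.0, 0.01)
trains = [[10.0], [10.1], [9.9]]   # three tightly synchronized spikes
rate = ipsr(trains, t)
```

A sharp peak in the IPSR marks a population-wide spiking event; such peaks supply the reference global cycles against which occupation and pacing of spikes can be scored, replacing the experimentally inaccessible global potential VG.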
Ferraro, Paul J.; Hanauer, Merlin M.
2014-01-01
To develop effective environmental policies, we must understand the mechanisms through which the policies affect social and environmental outcomes. Unfortunately, empirical evidence about these mechanisms is limited, and little guidance for quantifying them exists. We develop an approach to quantifying the mechanisms through which protected areas affect poverty. We focus on three mechanisms: changes in tourism and recreational services; changes in infrastructure in the form of road networks, health clinics, and schools; and changes in regulating and provisioning ecosystem services and foregone production activities that arise from land-use restrictions. The contributions of ecotourism and other ecosystem services to poverty alleviation in the context of a real environmental program have not yet been empirically estimated. Nearly two-thirds of the poverty reduction associated with the establishment of Costa Rican protected areas is causally attributable to opportunities afforded by tourism. Although protected areas reduced deforestation and increased regrowth, these land cover changes neither reduced nor exacerbated poverty, on average. Protected areas did not, on average, affect our measures of infrastructure and thus did not contribute to poverty reduction through this mechanism. We attribute the remaining poverty reduction to unobserved dimensions of our mechanisms or to other mechanisms. Our study empirically estimates previously unidentified contributions of ecotourism and other ecosystem services to poverty alleviation in the context of a real environmental program. We demonstrate that, with existing data and appropriate empirical methods, conservation scientists and policymakers can begin to elucidate the mechanisms through which ecosystem conservation programs affect human welfare. PMID:24567397
Bovier, Anton
2006-06-01
Our mathematical understanding of the statistical mechanics of disordered systems is going through a period of stunning progress. This self-contained book is a graduate-level introduction for mathematicians and for physicists interested in the mathematical foundations of the field, and can be used as a textbook for a two-semester course on mathematical statistical mechanics. It assumes only basic knowledge of classical physics and, on the mathematics side, a good working knowledge of graduate-level probability theory. The book starts with a concise introduction to statistical mechanics, proceeds to disordered lattice spin systems, and concludes with a presentation of the latest developments in the mathematical understanding of mean-field spin glass models. In particular, recent progress towards a rigorous understanding of the replica symmetry-breaking solutions of the Sherrington-Kirkpatrick spin glass models, due to Guerra, Aizenman-Sims-Starr and Talagrand, is reviewed in some detail. Comprehensive introduction to an active and fascinating area of research Clear exposition that builds to the state of the art in the mathematics of spin glasses Written by a well-known and active researcher in the field
A quantum causal discovery algorithm
Giarmatzi, Christina; Costa, Fabio
2018-03-01
Finding a causal model for a set of classical variables is now a well-established task—but what about the quantum equivalent? Even the notion of a quantum causal model is controversial. Here, we present a causal discovery algorithm for quantum systems. The input to the algorithm is a process matrix describing correlations between quantum events. Its output consists of different levels of information about the underlying causal model. Our algorithm determines whether the process is causally ordered by grouping the events into causally ordered non-signaling sets. It detects if all relevant common causes are included in the process, which we label Markovian, or alternatively if some causal relations are mediated through some external memory. For a Markovian process, it outputs a causal model, namely the causal relations and the corresponding mechanisms, represented as quantum states and channels. Our algorithm opens the route to more general quantum causal discovery methods.
Introduction to nonequilibrium statistical mechanics with quantum field theory
International Nuclear Information System (INIS)
Kita, Takafumi
2010-01-01
In this article, we present a concise and self-contained introduction to nonequilibrium statistical mechanics with quantum field theory by considering an ensemble of interacting identical bosons or fermions as an example. Readers are assumed to be familiar with the Matsubara formalism of equilibrium statistical mechanics such as Feynman diagrams, the proper self-energy, and Dyson's equation. The aims are threefold: (1) to explain the fundamentals of nonequilibrium quantum field theory as simply as possible on the basis of the knowledge of the equilibrium counterpart; (2) to elucidate the hierarchy in describing nonequilibrium systems from Dyson's equation on the Keldysh contour to the Navier-Stokes equation in fluid mechanics via quantum transport equations and the Boltzmann equation; (3) to derive an expression of nonequilibrium entropy that evolves with time. In stage (1), we introduce the nonequilibrium Green's function and the self-energy uniquely on the round-trip Keldysh contour, thereby avoiding possible confusions that may arise from defining multiple Green's functions at the very beginning. We try to present the Feynman rules for the perturbation expansion as simply as possible. In particular, we focus on the self-consistent perturbation expansion with the Luttinger-Ward thermodynamic functional, i.e., Baym's Φ-derivable approximation, which has a crucial property for nonequilibrium systems of obeying various conservation laws automatically. We also show how the two-particle correlations can be calculated within the Φ-derivable approximation, i.e., an issue of how to handle the 'Bogoliubov-Born-Green-Kirkwood-Yvon (BBGKY) hierarchy'. Aim (2) is performed through successive reductions of relevant variables with the Wigner transformation, the gradient expansion based on the Groenewold-Moyal product, and Enskog's expansion from local equilibrium. This part may be helpful for convincing readers that nonequilibrium systems can be handled microscopically with
Alternative derivations of the statistical mechanical distribution laws.
Wall, F T
1971-08-01
A new approach is presented for the derivation of statistical mechanical distribution laws. The derivations are accomplished by minimizing the Helmholtz free energy under constant temperature and volume, instead of maximizing the entropy under constant energy and volume. An alternative method involves stipulating equality of chemical potential, or equality of activity, for particles in different energy levels. This approach leads to a general statement of distribution laws applicable to all systems for which thermodynamic probabilities can be written. The methods also avoid use of the calculus of variations, Lagrangian multipliers, and Stirling's approximation for the factorial. The results are applied specifically to Boltzmann, Fermi-Dirac, and Bose-Einstein statistics. The special significance of chemical potential and activity is discussed for microscopic systems.
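The chemical-potential route mentioned in the abstract can be sketched in a few lines; the ideal-solution form of the level chemical potential below is an illustrative assumption consistent with the abstract's description, not a reproduction of Wall's exact derivation.

```latex
% Treat each energy level i as a "species" with occupation n_i and
% degeneracy g_i, and assign it an ideal-solution chemical potential:
\mu_i = \epsilon_i + k_B T \ln\frac{n_i}{g_i}.
% Equilibrium at constant T and V (minimum Helmholtz free energy)
% requires equal chemical potential across levels, \mu_i = \mu_j:
\epsilon_i + k_B T \ln\frac{n_i}{g_i}
  = \epsilon_j + k_B T \ln\frac{n_j}{g_j}
\;\Longrightarrow\;
\frac{n_i}{g_i} = \frac{n_j}{g_j}\, e^{-(\epsilon_i-\epsilon_j)/k_B T},
% i.e. the Boltzmann distribution n_i \propto g_i\, e^{-\epsilon_i/k_B T},
% obtained without Lagrange multipliers, the calculus of variations,
% or Stirling's approximation.
```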
Thermodynamics and statistical mechanics. [thermodynamic properties of gases
1976-01-01
The basic thermodynamic properties of gases are reviewed and the relations between them are derived from the first and second laws. The elements of statistical mechanics are then formulated and the partition function is derived. The classical form of the partition function is used to obtain the Maxwell-Boltzmann distribution of kinetic energies in the gas phase and the equipartition of energy theorem is given in its most general form. The thermodynamic properties are all derived as functions of the partition function. Quantum statistics are reviewed briefly and the differences between the Boltzmann distribution function for classical particles and the Fermi-Dirac and Bose-Einstein distributions for quantum particles are discussed.
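The step "thermodynamic properties derived as functions of the partition function" can be checked numerically for a simple level scheme; the harmonic-oscillator ladder below (spacing set to 1) is an illustrative choice, not taken from the reviewed text.

```python
import math

def boltzmann_average_energy(levels, kT):
    """Mean energy U = sum_i E_i exp(-E_i/kT) / Z from the partition
    function Z = sum_i exp(-E_i/kT) (nondegenerate levels)."""
    weights = [math.exp(-E / kT) for E in levels]
    Z = sum(weights)
    return sum(E * w for E, w in zip(levels, weights)) / Z

# Harmonic-oscillator ladder E_n = n (units of the level spacing),
# truncated high enough that the neglected tail is negligible:
levels = list(range(200))
kT = 10.0
U = boltzmann_average_energy(levels, kT)
# For kT much larger than the spacing, U approaches the classical
# equipartition value kT for one oscillator mode.
```

The numerical average reproduces the closed-form result 1/(e^(1/kT) − 1) for this ladder, and at high temperature it approaches kT, illustrating the classical equipartition limit mentioned in the abstract.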
International Nuclear Information System (INIS)
Novello, M.; Salim, J.M.; Torres, J.; Oliveira, H.P. de
1989-01-01
A set of spatially homogeneous and isotropic cosmological geometries generated by a class of non-perfect fluids is investigated. The irreversibility of this system is studied in the context of causal thermodynamics, which provides a useful mechanism for ensuring that the causal principle is not violated. (author) [pt
Is quantum theory a form of statistical mechanics?
Adler, S. L.
2007-05-01
We give a review of the basic themes of my recent book: Adler S L 2004 Quantum Theory as an Emergent Phenomenon (Cambridge: Cambridge University Press). We first give motivations for considering the possibility that quantum mechanics is not exact, but is instead an accurate asymptotic approximation to a deeper level theory. For this deeper level, we propose a non-commutative generalization of classical mechanics, that we call "trace dynamics", and we give a brief survey of how it works, considering for simplicity only the bosonic case. We then discuss the statistical mechanics of trace dynamics and give our argument that with suitable approximations, the Ward identities for trace dynamics imply that ensemble averages in the canonical ensemble correspond to Wightman functions in quantum field theory. Thus, quantum theory emerges as the statistical thermodynamics of trace dynamics. Finally, we argue that Brownian motion corrections to this thermodynamics lead to stochastic corrections to the Schrödinger equation, of the type that have been much studied in the "continuous spontaneous localization" model of objective state vector reduction. In appendices to the talk, we give details of the existence of a conserved operator in trace dynamics that encodes the structure of the canonical algebra, of the derivation of the Ward identities, and of the proof that the stochastically-modified Schrödinger equation leads to state vector reduction with Born rule probabilities.
Statistical mechanical analysis of LMFBR fuel cladding tubes
International Nuclear Information System (INIS)
Poncelet, J.-P.; Pay, A.
1977-01-01
The most important design requirement on fuel pin cladding for LMFBR's is its mechanical integrity. Disruptive factors include internal pressure from mixed oxide fuel fission gas release, thermal stresses and high temperature creep, neutron-induced differential void-swelling as a source of stress in the cladding and irradiation creep of stainless steel material, corrosion by fission products. Under irradiation these load-restraining mechanisms are accentuated by stainless steel embrittlement and strength alterations. To account for the numerous uncertainties involved in the analysis by theoretical models and computer codes, statistical tools, i.e. Monte Carlo simulation methods, are unavoidably required. Thanks to these techniques, uncertainties in nominal characteristics, material properties and environmental conditions can be linked up in a correct way and used for a more accurate conceptual design. First, a thermal creep damage index is set up through a sufficiently sophisticated clad physical analysis including arbitrary time dependence of power and neutron flux as well as effects of sodium temperature, burnup and steel mechanical behavior. Although this strain limit approach implies a more general but time-consuming model, in return the net output is improved and e.g. clad temperature, stress and strain maxima may be easily assessed. A full spectrum of variables is statistically treated to account for their probability distributions. Creep damage probability may be obtained and can contribute to a quantitative fuel probability estimation
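The Monte Carlo idea invoked above can be sketched with a deliberately simplified example: propagate scatter in a few inputs through a thin-shell hoop-stress formula and count limit exceedances. The distributions, dimensions, and the stress limit below are all illustrative placeholders, not actual LMFBR design values.

```python
import random

def limit_exceedance_fraction(n_trials=100_000, seed=1):
    """Toy Monte Carlo uncertainty propagation: sample uncertain
    inputs, evaluate a simple hoop-stress model sigma = p*r/t, and
    estimate the probability of exceeding an illustrative limit.
    All numbers are made up for the sketch."""
    rng = random.Random(seed)
    exceedances = 0
    for _ in range(n_trials):
        p = rng.gauss(10.0, 1.0)    # internal pressure, MPa (illustrative)
        r = rng.gauss(3.0, 0.05)    # clad mid-wall radius, mm (illustrative)
        t = rng.gauss(0.45, 0.02)   # wall thickness, mm (illustrative)
        sigma = p * r / t           # hoop stress, MPa
        if sigma > 80.0:            # illustrative stress limit
            exceedances += 1
    return exceedances / n_trials

frac = limit_exceedance_fraction()
```

A realistic analysis would replace the one-line stress model with the full creep-damage index described in the abstract, but the statistical machinery — sampling input distributions and tallying an exceedance probability — is the same.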
Representative volume size: A comparison of statistical continuum mechanics and statistical physics
Energy Technology Data Exchange (ETDEWEB)
AIDUN,JOHN B.; TRUCANO,TIMOTHY G.; LO,CHI S.; FYE,RICHARD M.
1999-05-01
In this combination background and position paper, the authors argue that careful work is needed to develop accurate methods for relating the results of fine-scale numerical simulations of material processes to meaningful values of macroscopic properties for use in constitutive models suitable for finite element solid mechanics simulations. To provide a definite context for this discussion, the problem is couched in terms of the lack of general objective criteria for identifying the size of the representative volume (RV) of a material. The objective of this report is to lay out at least the beginnings of an approach for applying results and methods from statistical physics to develop concepts and tools necessary for determining the RV size, as well as alternatives to RV volume-averaging for situations in which the RV is unmanageably large. The background necessary to understand the pertinent issues and statistical physics concepts is presented.
Nonequilibrium statistical mechanics and stochastic thermodynamics of small systems
International Nuclear Information System (INIS)
Tu Zhanchun
2014-01-01
Thermodynamics is an old subject. The research objects in conventional thermodynamics are macroscopic systems with a huge number of particles. In the past 30 years, the thermodynamics of small systems has become a frontier topic in physics. Here we introduce nonequilibrium statistical mechanics and stochastic thermodynamics of small systems. As a case study, we construct a Carnot-like cycle of a stochastic heat engine with a single particle controlled by a time-dependent harmonic potential. We find that the efficiency at maximum power is 1 − √(Tc/Th), where Tc and Th are the temperatures of the cold bath and hot bath, respectively. (author)
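The quoted efficiency at maximum power is straightforward to evaluate and compare with the Carnot bound:

```python
from math import sqrt

def efficiency_at_max_power(t_cold, t_hot):
    """Curzon-Ahlborn-type efficiency 1 - sqrt(Tc/Th) quoted above."""
    return 1.0 - sqrt(t_cold / t_hot)

eta = efficiency_at_max_power(300.0, 500.0)   # example temperatures, K
carnot = 1.0 - 300.0 / 500.0                  # Carnot efficiency bound
```

For any Tc < Th the maximum-power efficiency lies strictly below the Carnot efficiency, since √(Tc/Th) > Tc/Th on that range.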
Statistical mechanics of attractor neural network models with synaptic depression
International Nuclear Information System (INIS)
Igarashi, Yasuhiko; Oizumi, Masafumi; Otsubo, Yosuke; Nagata, Kenji; Okada, Masato
2009-01-01
Synaptic depression is known to control gain for presynaptic inputs. Since cortical neurons receive thousands of presynaptic inputs, and their outputs are fed into thousands of other neurons, synaptic depression should influence macroscopic properties of neural networks. We employ simple neural network models to explore the macroscopic effects of synaptic depression. Systems with synaptic depression cannot be analyzed with the conventional equilibrium statistical-mechanical approach, due to the asymmetry of connections. Thus, we first propose a microscopic dynamical mean field theory. Next, we derive macroscopic steady state equations and discuss the stabilities of steady states for various types of neural network models.
From inverse problems to learning: a Statistical Mechanics approach
Baldassi, Carlo; Gerace, Federica; Saglietti, Luca; Zecchina, Riccardo
2018-01-01
We present a brief introduction to the statistical mechanics approaches for the study of inverse problems in data science. We then provide concrete new results on inferring couplings from sampled configurations in systems characterized by an extensive number of stable attractors in the low temperature regime. We also show how these results are connected to the problem of learning with realistic weak signals in computational neuroscience. Our techniques and algorithms rely on advanced mean-field methods developed in the context of disordered systems.
Statistical mechanics of socio-economic systems with heterogeneous agents
International Nuclear Information System (INIS)
De Martino, Andrea; Marsili, Matteo
2006-01-01
We review the statistical mechanics approach to the study of the emerging collective behaviour of systems of heterogeneous interacting agents. The general framework is presented through examples in such contexts as ecosystem dynamics and traffic modelling. We then focus on the analysis of the optimal properties of large random resource-allocation problems and on Minority Games and related models of speculative trading in financial markets, discussing a number of extensions including multi-asset models, majority games and models with asymmetric information. Finally, we summarize the main conclusions and outline the major open problems and limitations of the approach. (topical review)
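The Minority Game mentioned above admits a very compact simulation: an odd number of agents repeatedly pick one of two sides, the minority side wins, and each agent plays the best-scoring of a few fixed random lookup-table strategies keyed on the recent winning history. The parameter values below are conventional illustrative choices, not specific to the review.

```python
import random

def minority_game(n_agents=101, n_rounds=500, memory=3,
                  n_strategies=2, seed=0):
    """Minimal Minority Game; returns per-round attendance (number of
    agents choosing side 1)."""
    rng = random.Random(seed)
    n_hist = 2 ** memory
    # A strategy is a lookup table: history (as an int) -> action {0,1}.
    agents = [[tuple(rng.randrange(2) for _ in range(n_hist))
               for _ in range(n_strategies)] for _ in range(n_agents)]
    scores = [[0] * n_strategies for _ in range(n_agents)]
    history = 0          # last `memory` winning sides packed into an int
    attendance = []
    for _ in range(n_rounds):
        choices = []
        for a in range(n_agents):
            best = max(range(n_strategies), key=lambda s: scores[a][s])
            choices.append(agents[a][best][history])
        n_ones = sum(choices)
        minority = 1 if n_ones < n_agents - n_ones else 0
        attendance.append(n_ones)
        # Reward every strategy that would have picked the minority side.
        for a in range(n_agents):
            for s in range(n_strategies):
                if agents[a][s][history] == minority:
                    scores[a][s] += 1
        history = ((history << 1) | minority) % n_hist
    return attendance

attendance = minority_game()
```

The statistical-mechanics analyses reviewed in the abstract characterize exactly such attendance time series, e.g. how its variance scales with the ratio of history-space size to the number of agents.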
Multiple-shock initiation via statistical crack mechanics
Energy Technology Data Exchange (ETDEWEB)
Dienes, J.K.; Kershner, J.D.
1998-12-31
Statistical Crack Mechanics (SCRAM) is a theoretical approach to the behavior of brittle materials that accounts for the behavior of an ensemble of microcracks, including their opening, shear, growth, and coalescence. Mechanical parameters are based on measured strain-softening behavior. In applications to explosive and propellant sensitivity it is assumed that closed cracks act as hot spots, and that the heating due to interfacial friction initiates reactions which are modeled as one-dimensional heat flow with an Arrhenius source term, and computed in a subscale grid. Post-ignition behavior of hot spots is treated with the burn model of Ward, Son and Brewster. Numerical calculations using SCRAM-HYDROX are compared with the multiple-shock experiments of Mulford et al. in which the particle velocity in PBX 9501 is measured with embedded wires, and reactions are initiated and quenched.
International Nuclear Information System (INIS)
Remler, E.A.
1977-01-01
A gauge-invariant version of the Wigner representation is used to relate relativistic mechanics, statistical mechanics, and quantum field theory in the context of the electrodynamics of scalar particles. A unified formulation of quantum field theory and statistical mechanics is developed which clarifies the physical interpretation of the single-particle Wigner function. A covariant form of Ehrenfest's theorem is derived. Classical electrodynamics is derived from quantum field theory after making a random-phase approximation. The validity of this approximation is discussed.
Budiyono, Agung; Rohrlich, Daniel
2017-11-03
Where does quantum mechanics part ways with classical mechanics? How does quantum randomness differ fundamentally from classical randomness? We cannot fully explain how the theories differ until we can derive them within a single axiomatic framework, allowing an unambiguous account of how one theory is the limit of the other. Here we derive non-relativistic quantum mechanics and classical statistical mechanics within a common framework. The common axioms include conservation of average energy and conservation of probability current. But two axioms distinguish quantum mechanics from classical statistical mechanics: an "ontic extension" defines a nonseparable (global) random variable that generates physical correlations, and an "epistemic restriction" constrains allowed phase space distributions. The ontic extension and epistemic restriction, with strength on the order of Planck's constant, imply quantum entanglement and uncertainty relations. This framework suggests that the wave function is epistemic, yet it does not provide an ontic dynamics for individual systems.
Statistical mechanics in the context of special relativity. II.
Kaniadakis, G
2005-09-01
The special relativity laws emerge as one-parameter (light speed) generalizations of the corresponding laws of classical physics. These generalizations, imposed by the Lorentz transformations, affect both the definition of the various physical observables (e.g., momentum, energy, etc.), as well as the mathematical apparatus of the theory. Here, following the general lines of [Phys. Rev. E 66, 056125 (2002)], we show that the Lorentz transformations impose also a proper one-parameter generalization of the classical Boltzmann-Gibbs-Shannon entropy. The obtained relativistic entropy permits us to construct a coherent and self-consistent relativistic statistical theory, preserving the main features of the ordinary statistical theory, which is recovered in the classical limit. The predicted distribution function is a one-parameter continuous deformation of the classical Maxwell-Boltzmann distribution and has a simple analytic form, showing power law tails in accordance with the experimental evidence. Furthermore, this statistical mechanics can be obtained as the stationary case of a generalized kinetic theory governed by an evolution equation obeying the H theorem and reproducing the Boltzmann equation of the ordinary kinetics in the classical limit.
Statistical mechanics of the vertex-cover problem
Hartmann, Alexander K.; Weigt, Martin
2003-10-01
We review recent progress in the study of the vertex-cover problem (VC). The VC belongs to the class of NP-complete graph theoretical problems, which plays a central role in theoretical computer science. On ensembles of random graphs, VC exhibits a coverable-uncoverable phase transition. Very close to this transition, depending on the solution algorithm, easy-hard transitions in the typical running time of the algorithms occur. We explain a statistical mechanics approach, which works by mapping the VC to a hard-core lattice gas, and then applying techniques such as the replica trick or the cavity approach. Using these methods, the phase diagram of the VC could be obtained exactly for connectivities c < e; for larger connectivities, the solution of the VC exhibits full replica symmetry breaking. The statistical mechanics approach can also be used to study analytically the typical running time of simple complete and incomplete algorithms for the VC. Finally, we describe recent results for the VC when studied on other ensembles of finite- and infinite-dimensional graphs.
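For concreteness, here is a classic baseline algorithm for the problem the review analyzes — the maximal-matching 2-approximation for vertex cover. This is not the replica or cavity machinery of the review; it is the kind of simple incomplete algorithm whose typical running time and solution quality such analyses characterize.

```python
def vertex_cover_2approx(edges):
    """Greedy maximal-matching 2-approximation: repeatedly pick an
    uncovered edge and add both of its endpoints to the cover.  The
    returned cover is at most twice the minimum size."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.add(u)
            cover.add(v)
    return cover

# Small example graph: a 4-cycle plus one chord.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (1, 3)]
cover = vertex_cover_2approx(edges)
```

Every edge ends up with at least one endpoint in the cover, and since endpoints are added in pairs taken from a matching, no cover can use fewer than half as many vertices.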
Statistical mechanics of learning orthogonal signals for general covariance models
International Nuclear Information System (INIS)
Hoyle, David C
2010-01-01
Statistical mechanics techniques have proved to be useful tools in quantifying the accuracy with which signal vectors are extracted from experimental data. However, analysis has previously been limited to specific model forms for the population covariance C, which may be inappropriate for real world data sets. In this paper we obtain new statistical mechanical results for a general population covariance matrix C. For data sets consisting of p sample points in R^N we use the replica method to study the accuracy of orthogonal signal vectors estimated from the sample data. In the asymptotic limit of N,p→∞ at fixed α = p/N, we derive analytical results for the signal direction learning curves. In the asymptotic limit the learning curves follow a single universal form, each displaying a retarded learning transition. An explicit formula for the location of the retarded learning transition is obtained and we find marked variation in the location of the retarded learning transition dependent on the distribution of population covariance eigenvalues. The results of the replica analysis are confirmed against simulations.
Statistical learning: a powerful mechanism that operates by mere exposure.
Aslin, Richard N
2017-01-01
How do infants learn so rapidly and with little apparent effort? In 1996, Saffran, Aslin, and Newport reported that 8-month-old human infants could learn the underlying temporal structure of a stream of speech syllables after only 2 min of passive listening. This demonstration of what was called statistical learning, involving no instruction, reinforcement, or feedback, led to dozens of confirmations of this powerful mechanism of implicit learning in a variety of modalities, domains, and species. These findings reveal that infants are not nearly as dependent on explicit forms of instruction as we might have assumed from studies of learning in which children or adults are taught facts such as math or problem solving skills. Instead, at least in some domains, infants soak up the information around them by mere exposure. Learning and development in these domains thus appear to occur automatically and with little active involvement by an instructor (parent or teacher). The details of this statistical learning mechanism are discussed, including how exposure to specific types of information can, under some circumstances, generalize to never-before-observed information, thereby enabling transfer of learning. WIREs Cogn Sci 2017, 8:e1373. doi: 10.1002/wcs.1373 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.
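The temporal structure that infants appear to track in such streams is commonly summarized by transitional probabilities, P(B|A) = count(AB)/count(A). The toy syllable stream below is illustrative, loosely patterned on the Saffran et al. stimuli rather than copied from them.

```python
from collections import Counter

def transitional_probabilities(syllables):
    """P(next | current) estimated from adjacent-pair counts in a
    syllable stream: count(AB) / count(A as a non-final token)."""
    pairs = Counter(zip(syllables, syllables[1:]))
    firsts = Counter(syllables[:-1])
    return {(a, b): c / firsts[a] for (a, b), c in pairs.items()}

# Illustrative stream: "words" tu-pi-ro and go-la-bu recur intact,
# so within-word transitions are more predictable than word boundaries.
stream = "tu pi ro go la bu tu pi ro bi da ku go la bu tu pi ro".split()
tp = transitional_probabilities(stream)
```

In this stream the within-word transition tu→pi always occurs (probability 1.0), while the across-boundary transition ro→go occurs only half the time — the dip in transitional probability that statistical learning accounts treat as the cue for segmenting words out of continuous speech.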
Zenil, Hector; Kiani, Narsis A.; Tegner, Jesper
2018-01-01
Extracting and learning representations that lead to generative mechanisms from data, especially without making arbitrary decisions and biased assumptions, is a central challenge in most areas of scientific research, particularly in connection to current
Statistical methods in the mechanical design of fuel assemblies
Energy Technology Data Exchange (ETDEWEB)
Radsak, C.; Streit, D.; Muench, C.J. [AREVA NP GmbH, Erlangen (Germany)]
2013-07-01
The mechanical design of a fuel assembly is still mainly performed in a deterministic way. This conservative approach is, however, not suitable for providing a realistic quantification of the design margins with respect to licensing criteria for increasingly demanding operating conditions (power upgrades, burnup increase, ...). This quantification can be provided by statistical methods utilizing all available information (e.g. from manufacturing, experience feedback etc.) on the topic under consideration. During optimization, e.g. of the holddown system, certain objectives in the mechanical design of a fuel assembly (FA) can contradict each other, such as holddown forces sufficient to prevent fuel assembly lift-off versus reduced holddown forces that minimize axial loads on the fuel assembly structure and thus avoid any negative effect on control rod movement. By using a statistical method, the fuel assembly design can be optimized much better with respect to these objectives than would be possible with a deterministic approach. This leads to a more realistic assessment and a safer way of operating fuel assemblies. Statistical models are defined on the one hand by the quantile that has to be maintained concerning the design limit requirements (e.g. a one-FA quantile) and on the other hand by the confidence level which has to be met. Using the above example of the holddown force, a feasible quantile can be defined based on the requirement that less than one fuel assembly (quantile > 192/193 = 99.5 %) in the core violates the holddown force limit with a confidence of 95 %. (orig.)
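As a rough illustration of the quantile-plus-confidence framing in this abstract, the sketch below computes the 192/193 ≈ 99.5 % quantile requirement and, via a standard nonparametric order-statistic argument, finds how large an i.i.d. sample of holddown-force measurements must be before some order statistic bounds that quantile with 95 % confidence. The function names and the i.i.d. sampling setup are illustrative assumptions, not the vendor's actual methodology.

```python
from math import comb

# Requirement from the abstract: fewer than one FA out of 193 may violate
# the holddown-force limit => the (192/193)-quantile must stay below it.
REQUIRED_QUANTILE = 192 / 193          # ~0.9948

def tolerance_order_statistic(n, q, conf):
    """Smallest k such that the k-th order statistic of n i.i.d. continuous
    observations upper-bounds the q-quantile with confidence conf:
    P(X_(k) >= x_q) = sum_{i<k} C(n, i) q**i (1-q)**(n-i) >= conf.
    Returns None if even the sample maximum (k = n) is not enough."""
    cdf = 0.0
    for i in range(n):                  # i = number of observations below x_q
        cdf += comb(n, i) * q**i * (1 - q)**(n - i)
        if cdf >= conf:
            return i + 1
    return None
```

With q = 192/193 at 95 % confidence, several hundred measurements are needed before even the sample maximum qualifies as a bound (n = 600 suffices, n = 100 does not), which illustrates why a statistical model, rather than a single deterministic margin, is used to demonstrate compliance.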
Comparison of phase space dynamics of Copenhagen and causal interpretations of quantum mechanics
Energy Technology Data Exchange (ETDEWEB)
Tempel, Christoph; Schleich, Wolfgang P. [Institut fuer Quantenphysik, Universitaet Ulm, D-89069 Ulm (Germany)
2013-07-01
Recent publications pursue the attempt to reconstruct Bohm trajectories experimentally utilizing the technique of weak measurements. We study the phase space dynamics of a specific double-slit setup in terms of the de Broglie-Bohm formulation of quantum mechanics. We compare the results of these Bohmian phase space dynamics to the usual quantum mechanical phase space formulation with the Wigner function as a quasi-probability density.
On the necessity and sufficiency of local commutativity for causality in quantum mechanics
International Nuclear Information System (INIS)
Muynck, W.M. de.
1984-01-01
This thesis contributes to the resolution of the question whether quantum mechanics admits an objectivistic interpretation if the description is restricted to the phenomenalistic domain of the quantum mechanical observables. Without touching the realism-phenomenalism dichotomy, this thesis investigates the possibility of disregarding the influence of the measurement interaction on the quantum mechanical measuring results. The first part studies the measuring process and its influence on the objectivity of measuring results. The measurement of a local observable is interpreted as a local operation, whose local commutativity is a necessary condition for macrocausality. The second part studies the converse question, viz. whether local commutativity is sufficient for macrocausality. (Auth.)
The association between autism and errors in early embryogenesis: what is the causal mechanism?
Ploeger, A.; Raijmakers, M.E.J.; van der Maas, H.L.J.; Galis, F.
2010-01-01
The association between embryonic errors and the development of autism has been recognized in the literature, but the mechanism underlying this association remains unknown. We propose that pleiotropic effects during a very early and specific stage of embryonic development—early organogenesis—can
Causal mechanisms of seismo-EM phenomena during the 1965-1967 Matsushiro earthquake swarm
Enomoto, Yuji; Yamabe, Tsuneaki; Okumura, Nobuo
2017-03-01
The 1965-1967 Matsushiro earthquake swarm in central Japan exhibited two unique characteristics. The first was a hydro-mechanical crust rupture resulting from degassing, volume expansion of CO2/water, and a crack opening within the critically stressed crust under a strike-slip stress. The other was, despite the lower total seismic energy, the occurrence of complex seismo-electromagnetic (seismo-EM) phenomena: an increase in geomagnetic intensity, unusual earthquake lights (EQLs) and atmospheric electric field (AEF) variations. Although the basic rupture process of this swarm of earthquakes is reasonably understood in terms of hydro-mechanical crust rupture, the associated seismo-EM processes remain largely unexplained. Here, we describe a series of seismo-EM mechanisms involved in the hydro-mechanical rupture process, observed in laboratory experiments that couple the electric interaction of rock rupture with CO2 gas to dielectric-barrier discharge in modelled fields. We found that CO2 gases passing through the newly created fracture surface of the rock were electrified, generating pressure-impressed currents/electric dipoles that could induce a magnetic field following the Biot-Savart law, decrease the atmospheric electric field and generate dielectric-barrier discharge lightning affected by the coupling between seismic and meteorological activities.
Ellis, George FR; Pabjan, Tadeusz
2013-01-01
Written by philosophers, cosmologists, and physicists, this collection of essays deals with causality, which is a core issue for both science and philosophy. Readers will learn about different types of causality in complex systems and about new perspectives on this issue based on physical and cosmological considerations. In addition, the book includes essays pertaining to the problem of causality in ancient Greek philosophy, and to the problem of God's relation to the causal structures of nature viewed in the light of contemporary physics and cosmology.
From non-equilibrium statistical mechanics to transport equations
International Nuclear Information System (INIS)
Balian, R.
1995-01-01
These lecture notes give a synthetic view of the foundations of non-equilibrium statistical mechanics. The purpose is to establish the transport equations satisfied by the relevant variables, starting from the microscopic dynamics. The Liouville representation is introduced, and a projection associates with any density operator, for a given choice of relevant observables, a reduced density operator. An exact integro-differential equation for the relevant variables is thereby derived. A short-memory approximation then yields the transport equations. A relevant entropy which characterizes the coarseness of the description is associated with each level of description. As an illustration, the classical gas, with its three levels of description and with the Chapman-Enskog method, is discussed. (author). 3 figs., 5 refs
Statistical mechanics of lattice systems a concrete mathematical introduction
Friedli, Sacha
2017-01-01
This motivating textbook gives a friendly, rigorous introduction to fundamental concepts in equilibrium statistical mechanics, covering a selection of specific models, including the Curie–Weiss and Ising models, the Gaussian free field, O(n) models, and models with Kać interactions. Using classical concepts such as Gibbs measures, pressure, free energy, and entropy, the book exposes the main features of the classical description of large systems in equilibrium, in particular the central problem of phase transitions. It treats such important topics as the Peierls argument, the Dobrushin uniqueness, Mermin–Wagner and Lee–Yang theorems, and develops from scratch such workhorses as correlation inequalities, the cluster expansion, Pirogov–Sinai Theory, and reflection positivity. Written as a self-contained course for advanced undergraduate or beginning graduate students, the detailed explanations, large collection of exercises (with solutions), and appendix of mathematical results and concepts also make i...
Statistical mechanics analysis of LDPC coding in MIMO Gaussian channels
Energy Technology Data Exchange (ETDEWEB)
Alamino, Roberto C; Saad, David [Neural Computing Research Group, Aston University, Birmingham B4 7ET (United Kingdom)]
2007-10-12
Using analytical methods of statistical mechanics, we analyse the typical behaviour of a multiple-input multiple-output (MIMO) Gaussian channel with binary inputs under low-density parity-check (LDPC) network coding and joint decoding. The saddle point equations for the replica symmetric solution are found in particular realizations of this channel, including a small and large number of transmitters and receivers. In particular, we examine the cases of a single transmitter, a single receiver and symmetric and asymmetric interference. Both dynamical and thermodynamical transitions from the ferromagnetic solution of perfect decoding to a non-ferromagnetic solution are identified for the cases considered, marking the practical and theoretical limits of the system under the current coding scheme. Numerical results are provided, showing the typical level of improvement/deterioration achieved with respect to the single transmitter/receiver result, for the various cases.
Statistical mechanics of the fashion game on random networks
International Nuclear Information System (INIS)
Sun, YiFan
2016-01-01
A model of fashion on networks is studied. This model consists of two groups of agents that are located on a network and have opposite viewpoints towards being fashionable: behaving consistently with either the majority or the minority of adjacent agents. Checking whether the fashion game has a pure Nash equilibrium (pure NE) is a non-deterministic polynomial complete problem. Using replica-symmetric mean field theory, the largest proportion of satisfied agents and the region where at least one pure NE should exist are determined for several types of random networks. Furthermore, a quantitative analysis of the asynchronous best response dynamics yields the phase diagram of existence and detectability of pure NE in the fashion game on some random networks. (paper: classical statistical mechanics, equilibrium and non-equilibrium).
Statistical mechanics of sparse generalization and graphical model selection
International Nuclear Information System (INIS)
Lage-Castellanos, Alejandro; Pagnani, Andrea; Weigt, Martin
2009-01-01
One of the crucial tasks in many inference problems is the extraction of an underlying sparse graphical model from a given number of high-dimensional measurements. In machine learning, this is frequently achieved using, as a penalty term, the L_p norm of the model parameters, with p ≤ 1 for efficient dilution. Here we propose a statistical mechanics analysis of the problem in the setting of perceptron memorization and generalization. Using a replica approach, we are able to evaluate the relative performance of naive dilution (obtained by learning without dilution, followed by applying a threshold to the model parameters), L_1 dilution (which is frequently used in convex optimization) and L_0 dilution (which is optimal but computationally hard to implement). Whereas both L_p diluted approaches clearly outperform the naive approach, we find a small region where L_0 works almost perfectly and strongly outperforms the simpler-to-implement L_1 dilution.
Anomalous behavior of q-averages in nonextensive statistical mechanics
International Nuclear Information System (INIS)
Abe, Sumiyoshi
2009-01-01
A generalized definition of average, termed the q-average, is widely employed in the field of nonextensive statistical mechanics. Recently, it has however been pointed out that such an average value may behave unphysically under specific deformations of probability distributions. Here, the following three issues are discussed and clarified. Firstly, the deformations considered are physical and may be realized experimentally. Secondly, in view of the thermostatistics, the q-average is unstable in both finite and infinite discrete systems. Thirdly, a naive generalization of the discussion to continuous systems misses a point, and a norm better than the L_1-norm should be employed for measuring the distance between two probability distributions. Consequently, stability of the q-average is shown not to be established in all of the cases
Statistical mechanics of the interacting Yang-Mills instanton gas
International Nuclear Information System (INIS)
Ilgenfritz, E.-M.; Mueller-Preussker, M.
1980-01-01
Within the framework of the dilute gas approximation, the instanton gas with dipole-like interaction is studied, including the hard-core repulsion necessarily implied by the consistency of this approximation. A new, self-consistent scheme for instanton calculations is obtained, provided by a cooperative suppression of large instantons instead of the usual ad hoc infrared cut-off. Diluteness is better under control through a single parameter independent of the regularization prescription. Functional methods known from statistical mechanics are used to treat the hard-core and dipole interactions simultaneously. The permeability of the instanton gas is calculated and used to discuss the Gell-Mann-Low β-function in the intermediate coupling range. The results are confronted with recent lattice calculations
Brøns, Charlotte; Grunnet, Louise Groth
2017-02-01
Dysfunctional adipose tissue is associated with an increased risk of developing type 2 diabetes (T2D). One characteristic of dysfunctional adipose tissue is the reduced expandability of the subcutaneous adipose tissue, leading to ectopic storage of fat in organs and/or tissues involved in the pathogenesis of T2D, which can cause lipotoxicity. Accumulation of lipids in the skeletal muscle is associated with insulin resistance, but the majority of previous studies do not establish causality. Most studies agree that it is not the intramuscular lipids per se that cause insulin resistance, but rather lipid intermediates such as diacylglycerols, fatty acyl-CoAs and ceramides, and that it is the localization, composition and turnover of these intermediates that play an important role in the development of insulin resistance and T2D. Adipose tissue is a more active tissue than previously thought, and future research should thus aim at examining the exact role of lipid composition, cellular localization and the dynamics of lipid turnover in the development of insulin resistance. In addition, ectopic storage of fat has a differential impact on various organs in different phenotypes at risk of developing T2D; thus, it is necessary to understand how adipogenesis is regulated, how it interacts with metabolic outcomes, and what determines the capacity for adipose tissue expandability in distinct population groups. This study reviews the current literature on the adipose tissue expandability hypothesis and how ectopic lipid accumulation, as a consequence of limited adipose tissue expandability, may be associated with insulin resistance in muscle and liver. © 2017 European Society of Endocrinology.
Neural Correlates of Causal Power Judgments
Directory of Open Access Journals (Sweden)
Denise Dellarosa Cummins
2014-12-01
Causal inference is a fundamental component of cognition and perception. Probabilistic theories of causal judgment (most notably causal Bayes networks) derive causal judgments using metrics that integrate contingency information. But human estimates typically diverge from these normative predictions. This is because human causal power judgments are typically strongly influenced by beliefs concerning underlying causal mechanisms, and because of the way knowledge is retrieved from human memory during the judgment process. Neuroimaging studies indicate that the brain distinguishes causal events from mere covariation, and between perceived and inferred causality. Areas involved in error prediction are also activated, implying automatic activation of possible exception cases during causal decision-making.
On the statistical mechanics of species abundance distributions.
Bowler, Michael G; Kelly, Colleen K
2012-09-01
A central issue in ecology is that of the factors determining the relative abundance of species within a natural community. The proper application of the principles of statistical physics to species abundance distributions (SADs) shows that simple ecological properties could account for the near universal features observed. These properties are (i) a limit on the number of individuals in an ecological guild and (ii) per capita birth and death rates. They underpin the neutral theory of Hubbell (2001), the master equation approach of Volkov et al. (2003, 2005) and the idiosyncratic (extreme niche) theory of Pueyo et al. (2007); they result in an underlying log series SAD, regardless of neutral or niche dynamics. The success of statistical mechanics in this application implies that communities are in dynamic equilibrium and hence that niches must be flexible and that temporal fluctuations on all sorts of scales are likely to be important in community structure. Copyright © 2012 Elsevier Inc. All rights reserved.
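The log-series SAD that the authors identify as the common underlying form is easy to write down explicitly. Under Fisher's log series, the expected number of species with abundance n is S(n) = α x^n / n, giving −α ln(1 − x) species and α x/(1 − x) individuals in total; the parameter values in the sketch below are purely illustrative.

```python
import math

def log_series_sad(alpha, x, n_max):
    """Expected number of species with n individuals, n = 1..n_max,
    under the Fisher log-series: S(n) = alpha * x**n / n, with 0 < x < 1."""
    return [alpha * x**n / n for n in range(1, n_max + 1)]

# Illustrative community: alpha = 5, x = 0.9 (n_max large enough that the
# truncated tail of the series is negligible).
sad = log_series_sad(5.0, 0.9, 2000)
total_species = sum(sad)                                            # ~ -alpha * ln(1 - x)
total_individuals = sum(n * s for n, s in enumerate(sad, start=1))  # ~ alpha * x / (1 - x)
```

The two closed-form totals provide a quick consistency check on the tabulated distribution, and fitting α and x to observed abundance counts is the usual route to comparing a community against the log-series prediction.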
Directory of Open Access Journals (Sweden)
Hilary Zelko
BACKGROUND: Although scientific innovation has been a long-standing topic of interest for historians, philosophers and cognitive scientists, few studies in biomedical research have examined, from researchers' perspectives, how high impact publications are developed and why they are consistently produced by a small group of researchers. Our objective was therefore to interview a group of researchers with a track record of high impact publications to explore what mechanisms they believe contribute to the generation of high impact publications. METHODOLOGY/PRINCIPAL FINDINGS: Researchers were located in universities all over the globe, and interviews were conducted by phone. All interviews were transcribed using standard qualitative methods. A Grounded Theory approach was used to code each transcript, later aggregating concepts and categories into an overarching explanatory model. The model was then translated into a System Dynamics mathematical model to represent its structure and behavior. Five themes emerged in our study. First, researchers used heuristics or rules of thumb that came naturally to them. Second, these heuristics were reinforced by positive feedback from their peers and mentors. Third, good communication skills allowed researchers to provide feedback to their peers, thus closing a positive feedback loop. Fourth, researchers exhibited a number of psychological attributes, such as curiosity or open-mindedness, that constantly motivated them, even when faced with discouraging situations. Fifth, the system is dominated by randomness and serendipity and is far from a linear and predictable environment. Some researchers, however, took advantage of this randomness by incorporating mechanisms that would allow them to benefit from random findings. The aggregation of these themes into a policy model represented the overall expected behavior of publications and their impact achieved by high impact researchers. CONCLUSIONS: The proposed
Process, System, Causality, and Quantum Mechanics: A Psychoanalysis of Animal Faith
Etter, Tom; Noyes, H. Pierre
We shall argue in this paper that a central piece of modern physics does not really belong to physics at all but to elementary probability theory. Given a joint probability distribution J on a set of random variables containing x and y, define a link between x and y to be the condition x=y on J. Define the state D of a link x=y as the joint probability distribution matrix on x and y without the link. The two core laws of quantum mechanics are the Born probability rule and the unitary dynamical law, whose best known form is the Schrödinger equation. Von Neumann formulated these two laws in the language of Hilbert space as prob(P) = trace(PD) and D'T = TD respectively, where P is a projection, D and D' are (von Neumann) density matrices, and T is a unitary transformation. We'll see that if we regard link states as density matrices, the algebraic forms of these two core laws occur as completely general theorems about links. When we extend probability theory by allowing cases to count negatively, we find that the Hilbert space framework of quantum mechanics proper emerges from the assumption that all D's are symmetrical in rows and columns. On the other hand, Markovian systems emerge when we assume that one of every linked variable pair has a uniform probability distribution. By representing quantum and Markovian structure in this way, we see clearly both how they differ, and also how they can coexist in natural harmony with each other, as they must in quantum measurement, which we'll examine in some detail. Looking beyond quantum mechanics, we see how both structures have their special places in a much larger continuum of formal systems that we have yet to look for in nature.
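The trace form of the Born rule quoted in this abstract, prob(P) = trace(PD), can be checked numerically on the smallest possible example. The qubit state and projector below are illustrative choices for that check, not taken from the paper.

```python
def mat_mul(A, B):
    """Multiply two small matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def trace(A):
    return sum(A[i][i] for i in range(len(A)))

# Density matrix of the pure qubit state |+> = (|0> + |1>)/sqrt(2) ...
D = [[0.5, 0.5],
     [0.5, 0.5]]
# ... and the projector P = |0><0| onto the measurement outcome "0".
P = [[1.0, 0.0],
     [0.0, 0.0]]

prob = trace(mat_mul(P, D))   # Born rule: prob(P) = trace(PD)
```

For this state the rule gives probability 1/2 for outcome "0", as expected for an equal superposition; trace(D) = 1 confirms D is normalized.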
Statistical mechanics view of quantum chromodynamics: Lattice gauge theory
International Nuclear Information System (INIS)
Kogut, J.B.
1984-01-01
Recent developments in lattice gauge theory are discussed from a statistical mechanics viewpoint. The basic physics problems of quantum chromodynamics (QCD) are reviewed for an audience of critical phenomena theorists. The idea of local gauge symmetry and color, the connection between statistical mechanics and field theory, asymptotic freedom and the continuum limit of lattice gauge theories, and the order parameters (confinement and chiral symmetry) of QCD are reviewed. Then recent developments in the field are discussed. These include the proof of confinement in the lattice theory, numerical evidence for confinement in the continuum limit of lattice gauge theory, and perturbative improvement programs for lattice actions. Next, we turn to the new challenges facing the subject. These include the need for a better understanding of the lattice Dirac equation and recent progress in the development of numerical methods for fermions (the pseudofermion stochastic algorithm and the microcanonical, molecular dynamics equation of motion approach). Finally, some of the applications of lattice gauge theory to QCD spectrum calculations and the thermodynamics of QCD will be discussed and a few remarks concerning future directions of the field will be made
Rajkumar, Ravi Philip
2014-01-01
Gender identity disorder (GID), recently renamed gender dysphoria (GD), is a rare condition characterized by an incongruity between gender identity and biological sex. Clinical evidence suggests that schizophrenia occurs in patients with GID at rates higher than in the general population and that patients with GID may have schizophrenia-like personality traits. Conversely, patients with schizophrenia may experience alterations in gender identity and gender role perception. Neurobiological research, including brain imaging and studies of finger length ratio and handedness, suggests that both these disorders are associated with altered cerebral sexual dimorphism and changes in cerebral lateralization. Various mechanisms, such as Toxoplasma infection, reduced levels of brain-derived neurotrophic factor (BDNF), early childhood adversity, and links with autism spectrum disorders, may account for some of this overlap. The implications of this association for further research are discussed.
Mahajan, Anubha; Locke, Adam; Rayner, N William; Robertson, Neil; Scott, Robert A; Prokopenko, Inga; Scott, Laura J; Green, Todd; Sparso, Thomas; Thuillier, Dorothee; Yengo, Loic; Grallert, Harald; Wahl, Simone; Frånberg, Mattias; Strawbridge, Rona J; Kestler, Hans; Chheda, Himanshu; Eisele, Lewin; Gustafsson, Stefan; Steinthorsdottir, Valgerdur; Thorleifsson, Gudmar; Qi, Lu; Karssen, Lennart C; van Leeuwen, Elisabeth M; Willems, Sara M; Li, Man; Chen, Han; Fuchsberger, Christian; Kwan, Phoenix; Ma, Clement; Linderman, Michael; Lu, Yingchang; Thomsen, Soren K; Rundle, Jana K; Beer, Nicola L; van de Bunt, Martijn; Chalisey, Anil; Kang, Hyun Min; Voight, Benjamin F; Abecasis, Goncalo R; Almgren, Peter; Baldassarre, Damiano; Balkau, Beverley; Benediktsson, Rafn; Blüher, Matthias; Boeing, Heiner; Bonnycastle, Lori L; Borringer, Erwin P; Burtt, Noël P; Carey, Jason; Charpentier, Guillaume; Chines, Peter S; Cornelis, Marilyn C; Couper, David J; Crenshaw, Andrew T; van Dam, Rob M; Doney, Alex SF; Dorkhan, Mozhgan; Edkins, Sarah; Eriksson, Johan G; Esko, Tonu; Eury, Elodie; Fadista, João; Flannick, Jason; Fontanillas, Pierre; Fox, Caroline; Franks, Paul W; Gertow, Karl; Gieger, Christian; Gigante, Bruna; Gottesman, Omri; Grant, George B; Grarup, Niels; Groves, Christopher J; Hassinen, Maija; Have, Christian T; Herder, Christian; Holmen, Oddgeir L; Hreidarsson, Astradur B; Humphries, Steve E; Hunter, David J; Jackson, Anne U; Jonsson, Anna; Jørgensen, Marit E; Jørgensen, Torben; Kerrison, Nicola D; Kinnunen, Leena; Klopp, Norman; Kong, Augustine; Kovacs, Peter; Kraft, Peter; Kravic, Jasmina; Langford, Cordelia; Leander, Karin; Liang, Liming; Lichtner, Peter; Lindgren, Cecilia M; Lindholm, Eero; Linneberg, Allan; Liu, Ching-Ti; Lobbens, Stéphane; Luan, Jian’an; Lyssenko, Valeriya; Männistö, Satu; McLeod, Olga; Meyer, Julia; Mihailov, Evelin; Mirza, Ghazala; Mühleisen, Thomas W; Müller-Nurasyid, Martina; Navarro, Carmen; Nöthen, Markus M; Oskolkov, Nikolay N; Owen, Katharine 
R; Palli, Domenico; Pechlivanis, Sonali; Perry, John RB; Platou, Carl GP; Roden, Michael; Ruderfer, Douglas; Rybin, Denis; van der Schouw, Yvonne T; Sennblad, Bengt; Sigurðsson, Gunnar; Stančáková, Alena; Steinbach, Gerald; Storm, Petter; Strauch, Konstantin; Stringham, Heather M; Sun, Qi; Thorand, Barbara; Tikkanen, Emmi; Tonjes, Anke; Trakalo, Joseph; Tremoli, Elena; Tuomi, Tiinamaija; Wennauer, Roman; Wood, Andrew R; Zeggini, Eleftheria; Dunham, Ian; Birney, Ewan; Pasquali, Lorenzo; Ferrer, Jorge; Loos, Ruth JF; Dupuis, Josée; Florez, Jose C; Boerwinkle, Eric; Pankow, James S; van Duijn, Cornelia; Sijbrands, Eric; Meigs, James B; Hu, Frank B; Thorsteinsdottir, Unnur; Stefansson, Kari; Lakka, Timo A; Rauramaa, Rainer; Stumvoll, Michael; Pedersen, Nancy L; Lind, Lars; Keinanen-Kiukaanniemi, Sirkka M; Korpi-Hyövälti, Eeva; Saaristo, Timo E; Saltevo, Juha; Kuusisto, Johanna; Laakso, Markku; Metspalu, Andres; Erbel, Raimund; Jöckel, Karl-Heinz; Moebus, Susanne; Ripatti, Samuli; Salomaa, Veikko; Ingelsson, Erik; Boehm, Bernhard O; Bergman, Richard N; Collins, Francis S; Mohlke, Karen L; Koistinen, Heikki; Tuomilehto, Jaakko; Hveem, Kristian; Njølstad, Inger; Deloukas, Panagiotis; Donnelly, Peter J; Frayling, Timothy M; Hattersley, Andrew T; de Faire, Ulf; Hamsten, Anders; Illig, Thomas; Peters, Annette; Cauchi, Stephane; Sladek, Rob; Froguel, Philippe; Hansen, Torben; Pedersen, Oluf; Morris, Andrew D; Palmer, Collin NA; Kathiresan, Sekar; Melander, Olle; Nilsson, Peter M; Groop, Leif C; Barroso, Inês; Langenberg, Claudia; Wareham, Nicholas J; O’Callaghan, Christopher A; Gloyn, Anna L; Altshuler, David; Boehnke, Michael; Teslovich, Tanya M; McCarthy, Mark I; Morris, Andrew P
2015-01-01
We performed fine-mapping of 39 established type 2 diabetes (T2D) loci in 27,206 cases and 57,574 controls of European ancestry. We identified 49 distinct association signals at these loci, including five mapping in/near KCNQ1. “Credible sets” of variants most likely to drive each distinct signal mapped predominantly to non-coding sequence, implying that T2D association is mediated through gene regulation. Credible set variants were enriched for overlap with FOXA2 chromatin immunoprecipitation binding sites in human islet and liver cells, including at MTNR1B, where fine-mapping implicated rs10830963 as driving T2D association. We confirmed that this T2D-risk allele increases FOXA2-bound enhancer activity in islet- and liver-derived cells. We observed allele-specific differences in NEUROD1 binding in islet-derived cells, consistent with evidence that the T2D-risk allele increases islet MTNR1B expression. Our study demonstrates how integration of genetic and genomic information can define molecular mechanisms through which variants underlying association signals exert their effects on disease. PMID:26551672
Statistical characteristics of mechanical heart valve cavitation in accelerated testing.
Wu, Changfu; Hwang, Ned H C; Lin, Yu-Kweng M
2004-07-01
Cavitation damage has been observed on mechanical heart valves (MHVs) undergoing accelerated testing. Cavitation itself can be modeled as a stochastic process, as it varies from beat to beat of the testing machine. This in vitro study was undertaken to investigate the statistical characteristics of MHV cavitation. A 25-mm St. Jude Medical bileaflet MHV (SJM 25) was tested in an accelerated tester at various pulse rates, ranging from 300 to 1,000 bpm, with stepwise increments of 100 bpm. A miniature pressure transducer was placed near a leaflet tip on the inflow side of the valve, to monitor regional transient pressure fluctuations at instants of valve closure. The pressure trace associated with each beat was passed through a 70 kHz high-pass digital filter to extract the high-frequency oscillation (HFO) components resulting from the collapse of cavitation bubbles. Three intensity-related measures were calculated for each HFO burst: its time span; its local root-mean-square (LRMS) value; and the area enveloped by the absolute value of the HFO pressure trace and the time axis, referred to as cavitation impulse. These were treated as stochastic processes, of which the first-order probability density functions (PDFs) were estimated for each test rate. Both the LRMS value and cavitation impulse were log-normally distributed, and the time span was normally distributed. These distribution laws were consistent at different test rates. The present investigation was directed at understanding MHV cavitation as a stochastic process. The results provide a basis for establishing further the statistical relationship between cavitation intensity and time-evolving cavitation damage on MHV surfaces. These data are required to assess and compare the performance of MHVs of different designs.
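The three intensity measures described for each HFO burst are straightforward to compute from a sampled, high-pass-filtered pressure trace. This is a minimal sketch assuming a uniformly sampled trace and rectangle-rule integration, not the authors' actual processing code.

```python
import math

def hfo_burst_metrics(pressure, dt):
    """Intensity measures for one high-frequency-oscillation (HFO) burst:
    time span of the burst, local root-mean-square (LRMS) pressure, and
    cavitation impulse (rectangle-rule integral of |p(t)| over the burst).

    pressure: list of filtered pressure samples for the burst
    dt:       sampling interval in seconds
    """
    span = dt * (len(pressure) - 1)
    lrms = math.sqrt(sum(p * p for p in pressure) / len(pressure))
    impulse = dt * sum(abs(p) for p in pressure)
    return span, lrms, impulse
```

Collected over many closure events, samples of each metric can then be tested against the distribution laws reported in the abstract: log-normal for LRMS and impulse, normal for the time span.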
The role of angular momentum conservation law in statistical mechanics
Directory of Open Access Journals (Sweden)
I.M. Dubrovskii
2008-12-01
Full Text Available Within the limits of Khinchin's ideas [A.Y. Khinchin, Mathematical Foundations of Statistical Mechanics. NY, Ed. Dover, 1949] the importance of the momentum and angular momentum conservation laws was analyzed for two cases: for a uniform magnetic field and when the magnetic field is absent. The law of momentum conservation does not change the density of the probability distribution in either case, just as is assumed in the conventional theory. It is shown that in systems where the kinetic energy depends only on the particle momenta canonically conjugated with Cartesian coordinates, being their diagonal quadratic form, the angular momentum conservation law changes the density of distribution of the system only when the full angular momentum of the system is not equal to zero. In a gas of charged particles in a uniform magnetic field the density of distribution also varies if the angular momentum is zero [see Dubrovskii I.M., Condensed Matter Physics, 2006, 9, 23]. A two-dimensional gas of charged particles located within a section of an endless strip filled with gas in a magnetic field is considered. Under such conditions the angular momentum is not conserved. Directional particle flows take place close to the strip boundaries, and, as a consequence, the phase trajectory of the considered set of particles does not remain within a limited volume of the phase space. In order to apply the methods of statistical thermodynamics, it is suggested to consider the near-boundary trajectories relative to a reference system that moves uniformly. It is shown that if the diameter of an orbit having the average thermal energy is much smaller than the strip width, the magnetic-field-dependent corrections to the thermodynamic functions are small. Only the average velocity of the near-boundary particles, which forms the near-boundary electric currents that create the paramagnetic moment, turns out to be essential.
Directory of Open Access Journals (Sweden)
Joanna Petrasek MacDonald
2013-12-01
Full Text Available Objectives. To review the protective factors and causal mechanisms which promote and enhance Indigenous youth mental health in the Circumpolar North. Study design. A systematic literature review of peer-reviewed English-language research was conducted to systematically examine the protective factors and causal mechanisms which promote and enhance Indigenous youth mental health in the Circumpolar North. Methods. This review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, with elements of a realist review. From 160 records identified in the initial search of 3 databases, 15 met the inclusion criteria and were retained for full review. Data were extracted using a codebook to organize and synthesize relevant information from the articles. Results. More than 40 protective factors at the individual, family, and community levels were identified as enhancing Indigenous youth mental health. These included practicing and holding traditional knowledge and skills, the desire to be useful and to contribute meaningfully to one's community, having positive role models, and believing in one's self. Broadly, protective factors at the family and community levels were identified as positively creating and impacting one's social environment, which interacts with factors at the individual level to enhance resilience. An emphasis on the roles of cultural and land-based activities, history, and language, as well as on the importance of social and family supports, also emerged throughout the literature. Conclusions. Healthy communities and families foster and support youth who are resilient to mental health challenges and able to adapt and cope with multiple stressors, be they social, economic, or environmental. Creating opportunities and environments where youth can successfully navigate challenges and enhance their resilience can in turn contribute to fostering healthy Circumpolar communities. Looking at the
MacDonald, Joanna Petrasek; Ford, James D; Willox, Ashlee Cunsolo; Ross, Nancy A
2013-12-09
To review the protective factors and causal mechanisms which promote and enhance Indigenous youth mental health in the Circumpolar North. A systematic literature review of peer-reviewed English-language research was conducted to systematically examine the protective factors and causal mechanisms which promote and enhance Indigenous youth mental health in the Circumpolar North. This review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, with elements of a realist review. From 160 records identified in the initial search of 3 databases, 15 met the inclusion criteria and were retained for full review. Data were extracted using a codebook to organize and synthesize relevant information from the articles. More than 40 protective factors at the individual, family, and community levels were identified as enhancing Indigenous youth mental health. These included practicing and holding traditional knowledge and skills, the desire to be useful and to contribute meaningfully to one's community, having positive role models, and believing in one's self. Broadly, protective factors at the family and community levels were identified as positively creating and impacting one's social environment, which interacts with factors at the individual level to enhance resilience. An emphasis on the roles of cultural and land-based activities, history, and language, as well as on the importance of social and family supports, also emerged throughout the literature.
Optimal causal inference: estimating stored information and approximating causal architecture.
Still, Susanne; Crutchfield, James P; Ellison, Christopher J
2010-09-01
We introduce an approach to inferring the causal architecture of stochastic dynamical systems that extends rate-distortion theory to use causal shielding--a natural principle of learning. We study two distinct cases of causal inference: optimal causal filtering and optimal causal estimation. Filtering corresponds to the ideal case in which the probability distribution of measurement sequences is known, giving a principled method to approximate a system's causal structure at a desired level of representation. We show that in the limit in which a model-complexity constraint is relaxed, filtering finds the exact causal architecture of a stochastic dynamical system, known as the causal-state partition. From this, one can estimate the amount of historical information the process stores. More generally, causal filtering finds a graded model-complexity hierarchy of approximations to the causal architecture. Abrupt changes in the hierarchy, as a function of approximation, capture distinct scales of structural organization. For nonideal cases with finite data, we show how the correct number of the underlying causal states can be found by optimal causal estimation. A previously derived model-complexity control term allows us to correct for the effect of statistical fluctuations in probability estimates and thereby avoid overfitting.
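The causal-state partition named above groups histories that predict the same future. A crude finite-data illustration (not the paper's rate-distortion algorithm, which controls model complexity explicitly) is to cluster fixed-length histories by their empirical next-symbol distribution; on the golden-mean process (no two consecutive 1s) the known answer is two causal states:

```python
import random
from collections import defaultdict

def causal_state_partition(seq, hist_len=2):
    """Group length-hist_len histories of a binary sequence by their
    rounded empirical next-symbol distribution: histories with the same
    predictive distribution belong to the same causal state. A crude
    finite-data sketch of the causal-state partition, not the
    optimal-causal-inference procedure of the paper."""
    counts = defaultdict(lambda: [0, 0])
    for i in range(hist_len, len(seq)):
        h = tuple(seq[i - hist_len:i])
        counts[h][seq[i]] += 1
    states = defaultdict(list)
    for h, (n0, n1) in counts.items():
        p1 = round(n1 / (n0 + n1), 1)      # crude equivalence via rounding
        states[p1].append(h)
    return dict(states)

# Golden-mean process: a 1 is never followed by another 1.
random.seed(3)
seq = [0]
for _ in range(100000):
    seq.append(0 if seq[-1] == 1 else random.randrange(2))

partition = causal_state_partition(seq)
```

Histories ending in 1 (next symbol surely 0) and histories ending in 0 (next symbol fair) fall into two groups, recovering the two causal states of this process.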
Causality and headache triggers
Turner, Dana P.; Smitherman, Todd A.; Martin, Vincent T.; Penzien, Donald B.; Houle, Timothy T.
2013-01-01
Objective The objective of this study was to explore the conditions necessary to assign causal status to headache triggers. Background The term “headache trigger” is commonly used to label any stimulus that is assumed to cause headaches. However, the assumptions required for determining if a given stimulus in fact has a causal-type relationship in eliciting headaches have not been explicated. Methods Rubin’s Causal Model is synthesized and applied to the context of headache causes. From this application the conditions necessary to infer that one event (trigger) causes another (headache) are outlined using basic assumptions and examples from relevant literature. Results Although many conditions must be satisfied for a causal attribution, three basic assumptions are identified for determining causality in headache triggers: 1) constancy of the sufferer; 2) constancy of the trigger effect; and 3) constancy of the trigger presentation. A valid evaluation of a potential trigger’s effect can only be undertaken once these three basic assumptions are satisfied during formal or informal studies of headache triggers. Conclusions Evaluating these assumptions is extremely difficult or infeasible in clinical practice, and satisfying them during natural experimentation is unlikely. Researchers, practitioners, and headache sufferers are encouraged to avoid natural experimentation to determine the causal effects of headache triggers. Instead, formal experimental designs or retrospective diary studies using advanced statistical modeling techniques provide the best approaches to satisfy the required assumptions and inform causal statements about headache triggers. PMID:23534872
Griffiths, John D.
2015-12-01
The modern understanding of the brain as a large, complex network of interacting elements is a natural consequence of the Neuron Doctrine [1,2] that has been bolstered in recent years by the tools and concepts of connectomics. In this abstracted, network-centric view, the essence of neural and cognitive function derives from the flows between network elements of activity and information - or, more generally, causal influence. The appropriate characterization of causality in neural systems, therefore, is a question at the very heart of systems neuroscience.
Statistical mechanics of high-density bond percolation
Timonin, P. N.
2018-05-01
High-density (HD) percolation describes the percolation of specific κ-clusters, which are the compact sets of sites each connected to at least κ nearest filled sites. It takes place in the classical patterns of independently distributed sites or bonds in which the ordinary percolation transition also exists. Hence, the study of the series of κ-type HD percolations amounts to the description of the structure of classical clusters, for which κ-clusters constitute κ-cores nested one into another. Such data are needed for the description of a number of physical, biological, and information properties of complex systems on random lattices, graphs, and networks. They range from magnetic properties of semiconductor alloys to anomalies in supercooled water and clustering in biological and social networks. Here we present a statistical mechanics approach to study HD bond percolation on an arbitrary graph. It is shown that the generating function for the κ-clusters' size distribution can be obtained from the partition function of a specific q-state Potts-Ising model in the q → 1 limit. Using this approach we find exact κ-clusters' size distributions for the Bethe lattice and the Erdős-Rényi graph. The application of the method to Euclidean lattices is also discussed.
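The κ-clusters of HD percolation are the κ-cores of the classical clusters, and a κ-core can be computed by the standard pruning algorithm sketched below. This illustrates the object the paper studies, not its Potts-model generating-function method:

```python
def kappa_core(nodes, edges, kappa):
    """Return the kappa-core of a graph: iteratively delete vertices with
    fewer than kappa neighbours until none remain. The surviving vertices
    form the union of the kappa-clusters of HD percolation. Standard
    pruning, not the partition-function technique of the paper."""
    adj = {v: set() for v in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    pruned = True
    while pruned:
        pruned = False
        for v in list(adj):
            if len(adj[v]) < kappa:        # degree too small: prune v
                for u in adj[v]:
                    adj[u].discard(v)
                del adj[v]
                pruned = True
    return set(adj)
```

On a 4-clique with one pendant vertex, the 3-core is the clique and the 4-core is empty, showing the nesting of κ-cores as κ grows.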
Nonextensive statistical mechanics: a brief review of its present status
Directory of Open Access Journals (Sweden)
CONSTANTINO TSALLIS
2002-09-01
Full Text Available We briefly review the present status of nonextensive statistical mechanics. We focus on (i) the central equations of the formalism, (ii) the most recent applications in physics and other sciences, and (iii) the a priori determination (from microscopic dynamics) of the entropic index q for two important classes of physical systems, namely low-dimensional maps (both dissipative and conservative) and long-range interacting many-body Hamiltonian classical systems.
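The central equation of the formalism is the nonextensive entropy S_q = (1 − Σᵢ pᵢ^q)/(q − 1), which recovers the Boltzmann-Gibbs-Shannon entropy as q → 1. A minimal sketch (with k_B = 1):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Nonextensive entropy S_q = (1 - sum_i p_i^q) / (q - 1), k_B = 1.
    As q -> 1 it reduces to the Boltzmann-Gibbs-Shannon entropy
    -sum_i p_i ln p_i, the extensive limit of the formalism."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                           # convention: 0 * ln 0 = 0
    if abs(q - 1.0) < 1e-12:
        return float(-np.sum(p * np.log(p)))
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))
```

For the uniform distribution over W states this gives S_q = (1 − W^(1−q))/(q − 1); with W = 4 and q = 2 that is 0.75, while q → 1 approaches ln 4.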
Universal biology and the statistical mechanics of early life
Goldenfeld, Nigel; Biancalani, Tommaso; Jafarpour, Farshid
2017-11-01
All known life on the Earth exhibits at least two non-trivial common features: the canonical genetic code and biological homochirality, both of which emerged prior to the Last Universal Common Ancestor state. This article describes recent efforts to provide a narrative of this epoch using tools from statistical mechanics. During the emergence of self-replicating life far from equilibrium in a period of chemical evolution, minimal models of autocatalysis show that homochirality would have necessarily co-evolved along with the efficiency of early-life self-replicators. Dynamical system models of the evolution of the genetic code must explain its universality and its highly refined error-minimization properties. These have both been accounted for in a scenario where life arose from a collective, networked phase where there was no notion of species and perhaps even individuality itself. We show how this phase ultimately terminated during an event sometimes known as the Darwinian transition, leading to the present epoch of tree-like vertical descent of organismal lineages. These are concrete examples of universal biology: the quest for a fundamental understanding of the basic properties of living systems, independent of precise instantiation in chemistry or other media. This article is part of the themed issue 'Reconceptualizing the origins of life'.
The statistical mechanics of vortex-acoustic ion wave turbulence
International Nuclear Information System (INIS)
Giles, M.J.
1980-01-01
The equilibrium statistical mechanics of electrostatic ion wave turbulence is studied within the framework of a continuum ion flow with adiabatic electrons. The wave field consists in general of two components, namely ion-acoustic and ion vortex modes. It is shown that the latter can significantly affect the equilibria on account of their ability both to emit and to scatter ion sound. Exact equilibria for the vortex-acoustic wave field are given in terms of a canonical distribution and the correlation functions are expressed in terms of a generating functional. Detailed calculations are carried out for the case in which the dominant coupling is an indirect interaction of the vortex modes mediated by the sound field. An equation for the spectrum of the vortex modes is obtained for this case, which is shown to possess a simple exact solution. This solution shows that the spectrum of fluctuations changes considerably as the total energy increases. Condensed vortex states could occur in the plasma sheet of the earth's magnetosphere and it is shown that the predicted ratio of the mean ion energy to the mean electron energy is consistent with the trend of observed values. (author)
Statistical mechanics and stability of random lattice field theory
International Nuclear Information System (INIS)
Baskaran, G.
1984-01-01
The averaging procedure in the random lattice field theory is studied by viewing it as a statistical mechanics of a system of classical particles. The corresponding thermodynamic phase is shown to determine the random lattice configuration which contributes dominantly to the generating function. The non-abelian gauge theory in four (space plus time) dimensions in the annealed and quenched averaging versions is shown to exist as an ideal classical gas, implying that macroscopically homogeneous configurations dominate the configurational averaging. For the free massless scalar field theory with O(n) global symmetry, in the annealed average, the pressure becomes negative for dimensions greater than two when n exceeds a critical number. This implies that macroscopically inhomogeneous collapsed configurations contribute dominantly. In the quenched averaging, the collapse of the massless scalar field theory is prevented and the system becomes an ideal gas which is at infinite temperature. Our results are obtained using exact scaling analysis. We also show approximately that SU(N) gauge theory collapses for dimensions greater than four in the annealed average. Within the same approximation, the collapse is prevented in the quenched average. We also obtain exact scaling differential equations satisfied by the generating function and physical quantities. (orig.)
Statistical mechanics approach to 1-bit compressed sensing
International Nuclear Information System (INIS)
Xu, Yingying; Kabashima, Yoshiyuki
2013-01-01
Compressed sensing is a framework that makes it possible to recover an N-dimensional sparse vector x ∈ R^N from its linear transformation y ∈ R^M of lower dimensionality M < N. Here we analyze an l_1-norm-based signal recovery scheme for 1-bit compressed sensing using statistical mechanics methods. We show that the signal recovery performance predicted by the replica method under the replica symmetric ansatz, which turns out to be locally unstable for modes breaking the replica symmetry, is in good consistency with experimental results of an approximate recovery algorithm developed earlier. This suggests that the l_1-based recovery problem typically has many local optima of a similar recovery accuracy, which can be achieved by the approximate algorithm. We also develop another approximate recovery algorithm inspired by the cavity method. Numerical experiments show that when the density of nonzero entries in the original signal is relatively large, the new algorithm offers better performance than the abovementioned scheme and does so with a lower computational cost. (paper)
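In 1-bit compressed sensing only the signs of the measurements are retained, so the signal's amplitude is unrecoverable and only its direction can be estimated. As an illustration of the setting, the sketch below uses binary iterative hard thresholding (BIHT), a standard baseline algorithm; it is not the l_1 scheme analyzed in the paper nor its cavity-inspired algorithm:

```python
import numpy as np

def biht(A, y, k, iters=200, tau=None):
    """Binary iterative hard thresholding for 1-bit compressed sensing:
    estimate the direction of a k-sparse x from sign measurements
    y = sign(A x). Standard baseline for illustration only, not the
    recovery scheme analyzed in the paper."""
    m, n = A.shape
    tau = 1.0 / m if tau is None else tau
    x = np.zeros(n)
    for _ in range(iters):
        x = x + tau * (A.T @ (y - np.sign(A @ x)))  # push toward sign consistency
        small = np.argsort(np.abs(x))[:n - k]       # hard-threshold to k entries
        x[small] = 0.0
        norm = np.linalg.norm(x)
        if norm > 0:
            x /= norm                               # amplitude is lost in 1-bit CS
    return x
```

With a Gaussian measurement matrix and enough sign measurements, the estimate aligns closely with the true sparse direction.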
A Simulational approach to teaching statistical mechanics and kinetic theory
International Nuclear Information System (INIS)
Karabulut, H.
2005-01-01
A computer simulation demonstrating how the Maxwell-Boltzmann distribution is reached in gases from a nonequilibrium distribution is presented. The algorithm can be generalized to the case of gas particles (atoms or molecules) with internal degrees of freedom, such as electronic excitations and vibrational-rotational energy levels. Another generalization of the algorithm is the case of a mixture of two different gases. By choosing the collision cross sections properly one can create quasi-equilibrium distributions. For example, by choosing the same-atom cross sections large and the different-atom cross sections very small, one can create a mixture of two gases with different temperatures, where the two gases interact slowly and come to equilibrium over a long time. Similarly, for the case of one kind of atom with internal degrees of freedom, one can create situations where the internal degrees of freedom come to equilibrium much later than the translational degrees of freedom. In all these cases the equilibrium distribution that the algorithm gives is the same as expected from statistical mechanics. The algorithm can also be extended to cover the case of chemical equilibrium, where species A and B react to form AB molecules. The laws of chemical equilibrium can be observed from this simulation. The chemical equilibrium simulation can also help to teach the elusive concept of chemical potential.
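A minimal version of this kind of simulation: start every particle at the same energy (a nonequilibrium state) and let randomly chosen pairs "collide", repartitioning their combined energy uniformly at random. The stationary energy distribution of this exchange process is the Boltzmann exponential exp(−E/kT) with kT = ⟨E⟩. This is a sketch of the pedagogical idea, not the author's actual program:

```python
import numpy as np

def relax(n=10000, collisions=200000, seed=1):
    """Relax a gas of n particles from a delta-function energy
    distribution to the Boltzmann exponential via random pairwise
    energy-conserving 'collisions'. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    E = np.ones(n)                          # every particle starts at E = 1
    for _ in range(collisions):
        i, j = rng.integers(0, n, size=2)   # pick a random pair
        if i == j:
            continue
        total = E[i] + E[j]                 # collisions conserve energy
        r = rng.random()
        E[i], E[j] = r * total, (1.0 - r) * total
    return E
```

The mean energy stays exactly 1 (so kT = 1), and at equilibrium the fraction of particles with E < kT approaches 1 − e⁻¹ ≈ 0.63, as the exponential distribution predicts.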
An Introduction to Thermodynamics and Statistical Mechanics - 2nd Edition
Stowe, Keith
2003-03-01
This introductory textbook for standard undergraduate courses in thermodynamics has been completely rewritten. Starting with an overview of important quantum behaviours, the book teaches students how to calculate probabilities in order to provide a firm foundation for later chapters. It introduces the ideas of classical thermodynamics and explores them both in general and as they are applied to specific processes and interactions. The remainder of the book deals with statistical mechanics - the study of small systems interacting with huge reservoirs. The changes to this second edition have been made after more than 10 years of classroom testing and student feedback. Each topic ends with a boxed summary of ideas and results, and every chapter contains numerous homework problems covering a broad range of difficulties. Answers are given to odd-numbered problems, and solutions to even-numbered problems are available to instructors at www.cambridge.org/9780521865579. The entire book has been rewritten and now covers more topics. It has a greater number of homework problems, which range in difficulty from warm-ups to challenges. It is concise and has an easy reading style.
Park, Soojin; Steiner, Peter M; Kaplan, David
2018-06-01
Considering that causal mechanisms unfold over time, it is important to investigate the mechanisms over time, taking into account the time-varying features of treatments and mediators. However, identification of the average causal mediation effect in the presence of time-varying treatments and mediators is often complicated by time-varying confounding. This article aims to provide a novel approach to uncovering causal mechanisms in time-varying treatments and mediators in the presence of time-varying confounding. We provide different strategies for identification and sensitivity analysis under homogeneous and heterogeneous effects. Homogeneous effects are those in which each individual experiences the same effect, and heterogeneous effects are those in which the effects vary over individuals. Most importantly, we provide an alternative definition of average causal mediation effects that evaluates a partial mediation effect; the effect that is mediated by paths other than through an intermediate confounding variable. We argue that this alternative definition allows us to better assess at least a part of the mediated effect and provides meaningful and unique interpretations. A case study using ECLS-K data that evaluates kindergarten retention policy is offered to illustrate our proposed approach.
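For orientation, the average causal mediation effect (ACME) in the simplest single-time-point linear model can be computed directly from simulated counterfactuals; the contrast Y(1, M(1)) − Y(1, M(0)) isolates the effect transmitted through the mediator and equals the product of the path coefficients. This toy omits everything that makes the article's problem hard (time-varying treatments, mediators, and confounders):

```python
import numpy as np

def acme_linear(n=100000, a=0.5, b=0.7, c=0.3, seed=2):
    """Toy linear mediation model T -> M -> Y with M = a*T + e_M and
    Y = c*T + b*M + e_Y. The ACME contrasts the counterfactuals
    Y(1, M(1)) and Y(1, M(0)) and equals a*b here. Hypothetical
    single-time-point illustration, far simpler than the article's
    time-varying setting."""
    rng = np.random.default_rng(seed)
    e_m = rng.normal(size=n)
    e_y = rng.normal(size=n)
    m0 = a * 0 + e_m                       # mediator had T been 0
    m1 = a * 1 + e_m                       # mediator had T been 1
    y_1_m0 = c * 1 + b * m0 + e_y          # Y(1, M(0))
    y_1_m1 = c * 1 + b * m1 + e_y          # Y(1, M(1))
    return float(np.mean(y_1_m1 - y_1_m0))
```

With a = 0.5 and b = 0.7 the ACME is exactly 0.35, since the noise terms cancel in the counterfactual contrast.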
Maximum entropy principle and hydrodynamic models in statistical mechanics
International Nuclear Information System (INIS)
Trovato, M.; Reggiani, L.
2012-01-01
This review presents the state of the art of the maximum entropy principle (MEP) in its classical and quantum (QMEP) formulation. Within the classical MEP we overview a general theory able to provide, in a dynamical context, the macroscopic relevant variables for carrier transport in the presence of electric fields of arbitrary strength. For the macroscopic variables the linearized maximum entropy approach is developed including full-band effects within a total energy scheme. Under spatially homogeneous conditions, we construct a closed set of hydrodynamic equations for the small-signal (dynamic) response of the macroscopic variables. The coupling between the driving field and the energy dissipation is analyzed quantitatively by using an arbitrary number of moments of the distribution function. Analogously, the theoretical approach is applied to many n⁺nn⁺ submicron Si structures by using different band structure models, different doping profiles, different applied biases and is validated by comparing numerical calculations with ensemble Monte Carlo simulations and with available experimental data. Within the quantum MEP we introduce a quantum entropy functional of the reduced density matrix, the principle of quantum maximum entropy is then asserted as fundamental principle of quantum statistical mechanics. Accordingly, we have developed a comprehensive theoretical formalism to construct rigorously a closed quantum hydrodynamic transport within a Wigner function approach. The theory is formulated both in thermodynamic equilibrium and nonequilibrium conditions, and the quantum contributions are obtained by only assuming that the Lagrange multipliers can be expanded in powers of ħ², where ħ is the reduced Planck constant. In particular, by using an arbitrary number of moments, we prove that: (i) on a macroscopic scale all nonlocal effects, compatible with the uncertainty principle, are imputable to high-order spatial derivatives both of the
Statistical mechanics of polymer networks of any topology
International Nuclear Information System (INIS)
Duplantier, B.
1989-01-01
The statistical mechanics of any polymer network with a prescribed topology, in dimension d, which was introduced previously, is considered. The basic direct renormalization theory of the associated continuum model is established. It has a very simple multiplicative structure in terms of the partition functions of the star polymers constituting the vertices of the network. A calculation is made to O(ε²), where d = 4 − ε, of the basic critical dimensions σ_L associated with any L-leg vertex (L ≥ 1). From this infinite series of critical exponents, any topology-dependent critical exponent can be derived. This is applied to the configuration exponent γ_G of any network G to O(ε²), including L-leg star polymers. The infinite sets of contact critical exponents θ between multiple points of polymers or between the cores of several star polymers are also deduced. As a particular case, the three exponents θ₀, θ₁, θ₂ calculated by des Cloizeaux by field-theoretic methods are recovered. The limiting exact logarithmic laws are derived at the upper critical dimension d = 4. The results are generalized to the series of topological exponents of polymer networks near a surface and of tricritical polymers at the Θ-point. Intersection properties of networks of random walks can be studied similarly. The above factorization theory of the partition function of any polymer network over its constituting L-vertices also applies to two dimensions, where it can be related to conformal invariance. The basic critical exponents σ_L, and thus any topological polymer exponents, are then exactly known. Principal results published elsewhere are recalled
On some boundary value problems in quantum statistical mechanics
International Nuclear Information System (INIS)
Angelescu, N.
1978-01-01
The following two topics of equilibrium quantum statistical mechanics are discussed in this thesis: (i) the independence of the thermodynamic limit of grand-canonical pressure on the boundary conditions; (ii) the magnetic properties of free quantum gases. Problem (i) is handled with a functional integration technique. Wiener-type conditional measures are constructed for a given domain and a general class of mixed conditions on its boundary, these measures are used to write down Feynman-Kac formulae for the kernels of exp(-βH), where H is the Hamiltonian of N interacting particles in the given domain. These measures share the property that they assign the same mass as the usual Wiener measure to any set of trajectories not intersecting the boundary. Local estimates on the kernels of exp(-βH) are derived, which imply independence of the pressure on the boundary conditions in the thermodynamic limit. Problem (ii) has a historical development: since Landau's work (1930), much discussion has been devoted to the influence of the finite size on the susceptibility. In finite volume, Dirichlet boundary conditions are imposed, on the ground that they ensure gauge invariance. The thermodynamic limit of the pressure is proved, using again functional integration. The functional measure is now complex but absolutely continuous with respect to Wiener measure, so the usual local estimates hold true. The controversy in the literature was concentrated on the commutativity of the operations of H-derivation and thermodynamic limit, so the existence of this limit for the zero-field susceptibility and its surface term are proved separately, demonstrating this commutativity. The proof relies on the following result of independent interest: the perturbation theory of self-adjoint trace-class semigroups is trace-class convergent and analytic. (author)
Causal Diagrams for Empirical Research
Pearl, Judea
1994-01-01
The primary aim of this paper is to show how graphical models can be used as a mathematical language for integrating statistical and subject-matter information. In particular, the paper develops a principled, nonparametric framework for causal inference, in which diagrams are queried to determine if the assumptions available are sufficient for identifying causal effects from non-experimental data. If so, the diagrams can be queried to produce mathematical expressions for causal effects in ter...
D'Ariano, Giacomo Mauro
2018-07-13
Causality has never gained the status of a 'law' or 'principle' in physics. Some recent literature has even popularized the false idea that causality is a notion that should be banned from theory. Such a misconception relies on an alleged universality of the reversibility of the laws of physics, based either on the determinism of classical theory, or on the multiverse interpretation of quantum theory, in both cases motivated by mere interpretational requirements for realism of the theory. Here, I will show that a properly defined unambiguous notion of causality is a theorem of quantum theory, which is also a falsifiable proposition of the theory. Such a notion of causality appeared in the literature within the framework of operational probabilistic theories. It is a genuinely theoretical notion, corresponding to establishing a definite partial order among events, in the same way as we do by using the future causal cone on Minkowski space. The notion of causality is logically completely independent of the misidentified concept of 'determinism', and, being a consequence of quantum theory, is ubiquitous in physics. In addition, as classical theory can be regarded as a restriction of quantum theory, causality holds also in the classical case, although the determinism of the theory trivializes it. I then conclude by arguing that causality naturally establishes an arrow of time. This implies that the scenario of the 'block Universe' and the connected 'past hypothesis' are incompatible with causality, and thus with quantum theory: they are both doomed to remain mere interpretations and, as such, are not falsifiable, similar to the hypothesis of 'super-determinism'. This article is part of a discussion meeting issue 'Foundations of quantum mechanics and their impact on contemporary society'. © 2018 The Author(s).
Morabia, Alfredo
2005-01-01
Epidemiological methods, which combine population thinking and group comparisons, can primarily identify causes of disease in populations. There is therefore a tension between our intuitive notion of a cause, which we want to be deterministic and invariant at the individual level, and the epidemiological notion of causes, which are invariant only at the population level. Epidemiologists have heretofore given a pragmatic solution to this tension. Causal inference in epidemiology consists in checking the logical coherence of a causality statement and determining whether what has been found grossly contradicts what we think we already know: how strong is the association? Is there a dose-response relationship? Does the cause precede the effect? Is the effect biologically plausible? Etc. This approach to causal inference can be traced back to the English philosophers David Hume and John Stuart Mill. On the other hand, the mode of establishing causality, devised by Jakob Henle and Robert Koch, which has been fruitful in bacteriology, requires that in every instance the effect invariably follows the cause (e.g., inoculation of Koch bacillus and tuberculosis). This is incompatible with epidemiological causality, which has to deal with probabilistic effects (e.g., smoking and lung cancer), and is therefore invariant only for the population.
Counting in Lattices: Combinatorial Problems from Statistical Mechanics.
Randall, Dana Jill
In this thesis we consider two classical combinatorial problems arising in statistical mechanics: counting matchings and self-avoiding walks in lattice graphs. The first problem arises in the study of the thermodynamical properties of monomers and dimers (diatomic molecules) in crystals. Fisher, Kasteleyn and Temperley discovered an elegant technique to exactly count the number of perfect matchings in two-dimensional lattices, but it is not applicable to matchings of arbitrary size, or in higher-dimensional lattices. We present the first efficient approximation algorithm for computing the number of matchings of any size in any periodic lattice in arbitrary dimension. The algorithm is based on Monte Carlo simulation of a suitable Markov chain and has rigorously derived performance guarantees that do not rely on any assumptions. In addition, we show that these results generalize to counting matchings in any graph which is the Cayley graph of a finite group. The second problem is counting self-avoiding walks in lattices. This problem arises in the study of the thermodynamics of long polymer chains in dilute solution. While there are a number of Monte Carlo algorithms used to count self-avoiding walks in practice, these are heuristic and their correctness relies on unproven conjectures. In contrast, we present an efficient algorithm which relies on a single, widely believed conjecture that is simpler than preceding assumptions and, more importantly, is one which the algorithm itself can test. Thus our algorithm is reliable, in the sense that it either outputs answers that are guaranteed, with high probability, to be correct, or finds a counterexample to the conjecture. In either case we know we can trust our results, and the algorithm is guaranteed to run in polynomial time. This is the first algorithm for counting self-avoiding walks in which the error bounds are rigorously controlled. This work was supported in part by an AT&T graduate fellowship, a University of
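The Markov chain underlying such approximation algorithms can be illustrated on a toy instance. The sketch below is a simplification under assumed moves (the thesis's actual chain and its performance analysis are far more elaborate): it runs the standard add/delete-edge chain on the matchings of a 4-cycle, whose stationary distribution is uniform over its 7 matchings.

```python
import random

# Graph: the 4-cycle with edges 0-1, 1-2, 2-3, 3-0. Its matchings are the
# empty matching, the 4 single edges, and the 2 perfect matchings: 7 in all.
EDGES = [(0, 1), (1, 2), (2, 3), (3, 0)]

def step(matching, rng):
    """One move: pick a uniform random edge; delete it if present,
    add it if both endpoints are free, otherwise do nothing."""
    e = rng.choice(EDGES)
    if e in matching:
        return matching - {e}
    covered = {v for (u, w) in matching for v in (u, w)}
    if e[0] not in covered and e[1] not in covered:
        return matching | {e}
    return matching  # self-loop keeps the chain aperiodic

def sample_counts(steps=140_000, burn=5_000, seed=1):
    """Run the chain and tally how often each matching is visited."""
    rng = random.Random(seed)
    m = frozenset()
    counts = {}
    for t in range(steps):
        m = frozenset(step(set(m), rng))
        if t >= burn:
            counts[m] = counts.get(m, 0) + 1
    return counts

counts = sample_counts()
total = sum(counts.values())
freqs = [c / total for c in counts.values()]  # each should be near 1/7
```

Because the moves are symmetric, the stationary distribution is uniform; approximate counting then follows from such sampling via standard reductions (e.g. telescoping products of edge-inclusion probabilities).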
He, Ping
2012-01-01
The long-standing puzzle surrounding the statistical mechanics of self-gravitating systems has not yet been solved successfully. We formulate a systematic theoretical framework of entropy-based statistical mechanics for spherically symmetric collisionless self-gravitating systems. We use an approach that is very different from that of the conventional statistical mechanics of short-range interaction systems. We demonstrate that the equilibrium states of self-gravitating systems consist of both mechanical and statistical equilibria, with the former characterized by a series of velocity-moment equations and the latter by statistical equilibrium equations, which should be derived from the entropy principle. The velocity-moment equations of all orders are derived from the steady-state collisionless Boltzmann equation. We point out that ergodicity is invalid for the self-gravitating system as a whole, but that it can be re-established locally. Based on the local ergodicity, using Fermi-Dirac-like statistics, with the non-degenerate condition and the spatial independence of the local microstates, we rederive the Boltzmann-Gibbs entropy. This is consistent with the validity of the collisionless Boltzmann equation, and should be the correct entropy form for collisionless self-gravitating systems. Apart from the usual constraints of mass and energy conservation, we demonstrate that the series of moment or virialization equations must be included as additional constraints on the entropy functional when performing the variational calculus; this is an extension of the original prescription by White & Narayan. Any possible velocity distribution can be produced by the statistical-mechanical approach that we have developed with the extended Boltzmann-Gibbs/White-Narayan statistics. Finally, we discuss the questions of negative specific heat and ensemble inequivalence for self-gravitating systems.
Student Understanding of Taylor Series Expansions in Statistical Mechanics
Smith, Trevor I.; Thompson, John R.; Mountcastle, Donald B.
2013-01-01
One goal of physics instruction is to have students learn to make physical meaning of specific mathematical expressions, concepts, and procedures in different physical settings. As part of research investigating student learning in statistical physics, we are developing curriculum materials that guide students through a derivation of the Boltzmann…
Statistical mechanics of learning: A variational approach for real data
International Nuclear Information System (INIS)
Malzahn, Doerthe; Opper, Manfred
2002-01-01
Using a variational technique, we generalize the statistical physics approach of learning from random examples to make it applicable to real data. We demonstrate the validity and relevance of our method by computing approximate estimators for generalization errors that are based on training data alone
A Statistical Mechanics Approach to Approximate Analytical Bootstrap Averages
DEFF Research Database (Denmark)
Malzahn, Dorthe; Opper, Manfred
2003-01-01
We apply the replica method of Statistical Physics combined with a variational method to the approximate analytical computation of bootstrap averages for estimating the generalization error. We demonstrate our approach on regression with Gaussian processes and compare our results with averages...
Causal inference based on counterfactuals
Directory of Open Access Journals (Sweden)
Höfler M
2005-09-01
Background: The counterfactual or potential outcome model has become increasingly standard for causal inference in epidemiological and medical studies. Discussion: This paper provides an overview of the counterfactual and related approaches. A variety of conceptual as well as practical issues when estimating causal effects are reviewed. These include causal interactions, imperfect experiments, adjustment for confounding, time-varying exposures, competing risks and the probability of causation. It is argued that the counterfactual model of causal effects captures the main aspects of causality in health sciences and relates to many statistical procedures. Summary: Counterfactuals are the basis of causal inference in medicine and epidemiology. Nevertheless, the estimation of counterfactual differences poses several difficulties, primarily in observational studies. These problems, however, reflect fundamental barriers only when learning from observations, and this does not invalidate the counterfactual concept.
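The adjustment-for-confounding issue discussed in the abstract can be made concrete with a minimal simulation (a sketch with an assumed linear data-generating process, not an example from the paper): a naive exposed-versus-unexposed contrast overstates a true causal effect of 1.0, while regression adjustment for the confounder recovers it.

```python
import numpy as np

# Potential-outcomes toy example (all variable names and the linear model
# are illustrative assumptions). A confounder Z raises both the chance of
# treatment T and the outcome Y, biasing the naive contrast upward.
rng = np.random.default_rng(0)
n = 200_000
z = rng.normal(size=n)                            # confounder
t = (rng.normal(size=n) + z > 0).astype(float)    # treatment depends on Z
y = 1.0 * t + 2.0 * z + rng.normal(size=n)        # true effect of T is 1.0

# Naive contrast: compares different populations (high-Z vs low-Z), so it
# mixes the treatment effect with the confounder's effect.
naive = y[t == 1].mean() - y[t == 0].mean()

# Adjusted estimate: OLS of Y on T and Z; under this linear model the
# coefficient on T identifies the average treatment effect.
X = np.column_stack([np.ones(n), t, z])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
ate = beta[1]
```

Here `naive` lands well above 1.0 while `ate` is close to it; in observational studies the hard part, as the abstract notes, is knowing that the adjustment set suffices.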
Causal inference in public health.
Glass, Thomas A; Goodman, Steven N; Hernán, Miguel A; Samet, Jonathan M
2013-01-01
Causal inference has a central role in public health; the determination that an association is causal indicates the possibility for intervention. We review and comment on the long-used guidelines for interpreting evidence as supporting a causal association and contrast them with the potential outcomes framework that encourages thinking in terms of causes that are interventions. We argue that in public health this framework is more suitable, providing an estimate of an action's consequences rather than the less precise notion of a risk factor's causal effect. A variety of modern statistical methods adopt this approach. When an intervention cannot be specified, causal relations can still exist, but how to intervene to change the outcome will be unclear. In application, the often-complex structure of causal processes needs to be acknowledged and appropriate data collected to study them. These newer approaches need to be brought to bear on the increasingly complex public health challenges of our globalized world.
Statistical mechanics and the description of the early universe I
DEFF Research Database (Denmark)
Pessah, Martin Elias; F. Torres, Diego; Vucetich, H.
2001-01-01
We analyze how the thermal history of the universe is influenced by the statistical description, assuming a deviation from the usual Bose-Einstein, Fermi-Dirac and Boltzmann-Gibbs distribution functions. These deviations represent the possible appearance of non-extensive effects related with the ... and to place limits to the range of its validity. The corrections obtained will change with temperature, and consequently, the bounds on the possible amount of non-extensivity will also change with time. We generalize results which can be used in other contexts as well, such as the Boltzmann equation and the Saha...
Erwin Schroedinger: Collected papers V. 1. Contributions to statistical mechanics
International Nuclear Information System (INIS)
Schroedinger, E.
1984-01-01
The 38 publications reprinted in this volume show that an interest in statistical problems accompanied Schroedinger throughout his scientific career. Already in his second paper he worked on the magnetism of solids. His classical considerations come close to the heart of diamagnetism and also to the origin of paramagnetism. In classical investigations of the specific heat Schroedinger contributed through abstract theory but also by analysing a gigantic amount of experimental material. In 1926 he and F. Kohlrausch actually played the 'urn game' of Ehrenfest as a model of the H-curve and published the results. His inclination towards experimenting, sequences of measurements and statistical evaluation of experimental data led to papers on the foundations of the theory of probability, where he tries to put the subjective probability concept into a systematic framework. Two earlier papers on the dynamics of the elastic chain remain particularly valuable. By solving the initial value problem with Bessel functions, this many-body problem is reduced to an explicit discussion. These studies are likely to be the roots of another keynote in Schroedinger's thinking, namely, irreversibility. In 1945 a statistical theory of chain reactions was published under the inconspicuous title of 'Probability Problems in Nuclear Chemistry'. In his last work Schroedinger took a wrong turn: he argued that energy should be only a statistical concept and should not be conserved in elementary processes, but only in the mean. These short remarks merely illuminate the diversity of the material in this volume, but testify to Schroedinger's deep understanding of this field. (W.K.)
Statistical mechanics of complex neural systems and high dimensional data
International Nuclear Information System (INIS)
Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya
2013-01-01
Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks. (paper)
Student understanding of Taylor series expansions in statistical mechanics
Directory of Open Access Journals (Sweden)
Trevor I. Smith
2013-08-01
One goal of physics instruction is to have students learn to make physical meaning of specific mathematical expressions, concepts, and procedures in different physical settings. As part of research investigating student learning in statistical physics, we are developing curriculum materials that guide students through a derivation of the Boltzmann factor using a Taylor series expansion of entropy. Using results from written surveys, classroom observations, and both individual think-aloud and teaching interviews, we present evidence that many students can recognize and interpret series expansions, but they often lack fluency in creating and using a Taylor series appropriately, despite previous exposures in both calculus and physics courses.
Student understanding of Taylor series expansions in statistical mechanics
Smith, Trevor I.; Thompson, John R.; Mountcastle, Donald B.
2013-12-01
One goal of physics instruction is to have students learn to make physical meaning of specific mathematical expressions, concepts, and procedures in different physical settings. As part of research investigating student learning in statistical physics, we are developing curriculum materials that guide students through a derivation of the Boltzmann factor using a Taylor series expansion of entropy. Using results from written surveys, classroom observations, and both individual think-aloud and teaching interviews, we present evidence that many students can recognize and interpret series expansions, but they often lack fluency in creating and using a Taylor series appropriately, despite previous exposures in both calculus and physics courses.
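The derivation these curriculum materials target can be summarized in standard textbook form (the notation here is generic, not the authors' own): the probability of a system microstate with energy $\epsilon$ is proportional to the number of accessible reservoir microstates, and a first-order Taylor expansion of the reservoir entropy produces the Boltzmann factor.

```latex
P(\epsilon) \;\propto\; \Omega_{\mathrm{res}}(E_{\mathrm{tot}}-\epsilon)
            \;=\; e^{S_{\mathrm{res}}(E_{\mathrm{tot}}-\epsilon)/k_B},
\qquad
S_{\mathrm{res}}(E_{\mathrm{tot}}-\epsilon)
  \;\approx\; S_{\mathrm{res}}(E_{\mathrm{tot}})
  - \epsilon\,\frac{\partial S_{\mathrm{res}}}{\partial E}
  \;=\; S_{\mathrm{res}}(E_{\mathrm{tot}}) - \frac{\epsilon}{T},
\quad\Longrightarrow\quad
P(\epsilon) \;\propto\; e^{-\epsilon/k_B T}.
```

The step students must supply is recognizing that truncating the expansion after the linear term is justified because the system's energy is tiny compared with the reservoir's.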
Quantum statistical mechanics selected works of N N Bogolubov
Bogolyubov, N N
2015-01-01
In this book we have solved the complicated problem of constructing upper bounds for many-time averages for the case of a fairly broad class of model systems with four-fermion interaction. The methods proposed in this book for solving this problem will undoubtedly find application not only for the model systems associated with the theory of superconductivity considered here. The theoretical methods developed in Chapters 1 and 2 are already applicable to a much broader class of model systems from statistical physics and the theory of elementary particles. Contents: On the Theory of Superfluidity
On second quantization methods applied to classical statistical mechanics
International Nuclear Information System (INIS)
Matos Neto, A.; Vianna, J.D.M.
1984-01-01
A method of expressing classical statistical results in terms of mathematical entities usually associated with the quantum field theoretical treatment of many-particle systems (Fock space, commutators, field operators, state vectors) is discussed. A linear response theory is developed using the 'second quantized' Liouville equation introduced by Schonberg. The relationship of this method to that of Prigogine et al. is briefly analyzed. The chain of equations and the spectral representations for the new classical Green's functions are presented. Generalized operators defined on Fock space are discussed. It is shown that the correlation functions can be obtained from Green's functions defined with generalized operators. (Author) [pt
Application of few-body methods to statistical mechanics
International Nuclear Information System (INIS)
Bolle, D.
1981-01-01
This paper reviews some of the methods to study the thermodynamic properties of a macroscopic system in terms of the scattering processes between the constituent particles in the system. In particular, we discuss the time delay approach to the virial expansion and the use of the arrangement channel quantum mechanics formulation in kinetic theory. (orig.)
Hayslett, H T
1991-01-01
Statistics covers the basic principles of statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. The testing of statistical hypotheses, and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population, are explained. The text the
Nonequilibrium statistical mechanics of systems with long-range interactions
Energy Technology Data Exchange (ETDEWEB)
Levin, Yan, E-mail: levin@if.ufrgs.br; Pakter, Renato, E-mail: pakter@if.ufrgs.br; Rizzato, Felipe B., E-mail: rizzato@if.ufrgs.br; Teles, Tarcísio N., E-mail: tarcisio.teles@fi.infn.it; Benetti, Fernanda P.C., E-mail: fbenetti@if.ufrgs.br
2014-02-01
Systems with long-range (LR) forces, for which the interaction potential decays with the interparticle distance with an exponent smaller than the dimensionality of the embedding space, remain an outstanding challenge to statistical physics. The internal energy of such systems lacks extensivity and additivity. Although the extensivity can be restored by scaling the interaction potential with the number of particles, the non-additivity still remains. Lack of additivity leads to inequivalence of statistical ensembles. Before relaxing to thermodynamic equilibrium, isolated systems with LR forces become trapped in out-of-equilibrium quasi-stationary states (qSSs), the lifetime of which diverges with the number of particles. Therefore, in the thermodynamic limit LR systems will not relax to equilibrium. The qSSs are attained through the process of collisionless relaxation. Density oscillations lead to particle–wave interactions and excitation of parametric resonances. The resonant particles escape from the main cluster to form a tenuous halo. Simultaneously, this cools down the core of the distribution and dampens out the oscillations. When all the oscillations die out the ergodicity is broken and a qSS is born. In this report, we will review a theory which allows us to quantitatively predict the particle distribution in the qSS. The theory is applied to various LR interacting systems, ranging from plasmas to self-gravitating clusters and kinetic spin models.
Statistical mechanics of flux lines in high-temperature superconductors
International Nuclear Information System (INIS)
Dasgupta, C.
1992-01-01
The shortness of the low-temperature coherence lengths of high-Tc materials leads to new mechanisms of pinning of flux lines. Lattice-periodic modulation of the order parameter itself acts to pin vortex lines in regions of the unit cell where the order parameter is small. Flux creep and flux noise at low temperatures and magnetic fields are presented in terms of the motion of simple metastable defects on flux lines, together with a calculation of flux-lattice melting. 12 refs
Nonextensive statistical mechanics approach to electron trapping in degenerate plasmas
Mebrouk, Khireddine; Gougam, Leila Ait; Tribeche, Mouloud
2016-06-01
The electron trapping in a weakly nondegenerate plasma is reformulated and re-examined by incorporating the nonextensive entropy prescription. Using the q-deformed Fermi-Dirac distribution function including the quantum as well as the nonextensive statistical effects, we derive a new generalized electron density with a new contribution proportional to the electron temperature T, which may dominate the usual thermal correction (∼T²) at very low temperatures. To make the physics behind the effect of this new contribution more transparent, we analyze the modifications arising in the propagation of ion-acoustic solitary waves. Interestingly, we find that due to the nonextensive correction, our plasma model allows the possibility of existence of quantum ion-acoustic solitons with velocity higher than the Fermi ion-sound velocity. Moreover, as the nonextensive parameter q increases, the critical temperature Tc beyond which coexistence of compressive and rarefactive solitons sets in, is shifted towards higher values.
Einstein's Approach to Statistical Mechanics: The 1902-04 Papers
Peliti, Luca; Rechtman, Raúl
2017-05-01
We summarize the papers published by Einstein in the Annalen der Physik in the years 1902-1904 on the derivation of the properties of thermal equilibrium on the basis of the mechanical equations of motion and of the calculus of probabilities. We point out the line of thought that led Einstein to an especially economical foundation of the discipline, and to focus on fluctuations of the energy as a possible tool for establishing the validity of this foundation. We also sketch a comparison of Einstein's approach with that of Gibbs, suggesting that although they obtained similar results, they had different motivations and interpreted them in very different ways.
Statistical mechanics of sensing and communications: Insights and techniques
International Nuclear Information System (INIS)
Murayama, T; Davis, P
2008-01-01
In this article we review a basic model for analysis of large sensor networks from the point of view of collective estimation under bandwidth constraints. We compare different sensing aggregation levels as alternative 'strategies' for collective estimation: moderate aggregation from a moderate number of sensors for which communication bandwidth is enough that data encoding can be reversible, and large scale aggregation from very many sensors - in which case communication bandwidth constraints require the use of nonreversible encoding. We show the non-trivial trade-off between sensing quality, which can be increased by increasing the number of sensors, and communication quality under bandwidth constraints, which decreases if the number of sensors is too large. From a practical standpoint, we verify that such a trade-off exists in constructively defined communications schemes. We introduce a probabilistic encoding scheme and define rate distortion models that are suitable for analysis of the large network limit. Our description shows that the methods and ideas from statistical physics can play an important role in formulating effective models for such schemes
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
PGT: A Statistical Approach to Prediction and Mechanism Design
Wolpert, David H.; Bono, James W.
One of the biggest challenges facing behavioral economics is the lack of a single theoretical framework that is capable of directly utilizing all types of behavioral data. One of the biggest challenges of game theory is the lack of a framework for making predictions and designing markets in a manner that is consistent with the axioms of decision theory. An approach in which solution concepts are distribution-valued rather than set-valued (i.e. equilibrium theory) has both capabilities. We call this approach Predictive Game Theory (or PGT). This paper outlines a general Bayesian approach to PGT. It also presents one simple example to illustrate the way in which this approach differs from equilibrium approaches in both prediction and mechanism design settings.
David Bohm : causality and chance, letters to three women
2017-01-01
The letters transcribed in this book were written by physicist David Bohm to three close female acquaintances in the period 1950 to 1956. They provide a background to his causal interpretation of quantum mechanics and the Marxist philosophy that inspired his scientific work in quantum theory, probability and statistical mechanics. In his letters, Bohm reveals the ideas that led to his ground breaking book Causality and Chance in Modern Physics. The political arguments as well as the acute personal problems contained in these letters help to give a rounded, human picture of this leading scientist and twentieth century thinker.
Impact initiation of explosives and propellants via statistical crack mechanics
Dienes, J. K.; Zuo, Q. H.; Kershner, J. D.
2006-06-01
A statistical approach has been developed for modeling the dynamic response of brittle materials by superimposing the effects of a myriad of microcracks, including opening, shear, growth and coalescence, taking as a starting point the well-established theory of penny-shaped cracks. This paper discusses the general approach, but in particular an application to the sensitivity of explosives and propellants, which often contain brittle constituents. We examine the hypothesis that the intense heating by frictional sliding between the faces of a closed crack during unstable growth can form a hot spot, causing localized melting, ignition, and fast burn of the reactive material adjacent to the crack. Opening and growth of a closed crack due to the pressure of burned gases inside the crack and interactions of adjacent cracks can lead to violent reaction, with detonation as a possible consequence. This approach was used to model a multiple-shock experiment by Mulford et al. [1993. Initiation of preshocked high explosives PBX-9404, PBX-9502, PBX-9501, monitored with in-material magnetic gauging. In: Proceedings of the 10th International Detonation Symposium, pp. 459-467] involving initiation and subsequent quenching of chemical reactions in a slab of PBX 9501 impacted by a two-material flyer plate. We examine the effects of crack orientation and temperature dependence of viscosity of the melt on the response. Numerical results confirm our theoretical finding [Zuo, Q.H., Dienes, J.K., 2005. On the stability of penny-shaped cracks with friction: the five types of brittle behavior. Int. J. Solids Struct. 42, 1309-1326] that crack orientation has a significant effect on brittle behavior, especially under compressive loading where interfacial friction plays an important role. With a reasonable choice of crack orientation and a temperature-dependent viscosity obtained from molecular dynamics calculations, the calculated particle velocities compare well with those measured using
The problem of phase transitions in statistical mechanics
International Nuclear Information System (INIS)
Martynov, Georgii A
1999-01-01
The first part of this review deals with the single-phase approach to the statistical theory of phase transitions. This approach is based on the assumption that a first-order phase transition is due to the loss of stability of the parent phase. We demonstrate that it is practically impossible to find the coordinates of the transition points using this criterion in the framework of the global Gibbs theory which describes the state of the entire macroscopic system. On the basis of the Ornstein-Zernike equation we formulate a local approach that analyzes the state of matter inside the correlation sphere of radius R_c ∼ 10 Å. This approach is proved to be as rigorous as the Gibbs theory. In the context of the local approach we formulate a criterion that allows finding the transition points without calculating the chemical potential and the pressure of the second conjugate phase. In the second part of the review we consider second-order phase transitions (critical phenomena). The Kadanoff-Wilson theory of critical phenomena is analyzed, based on the global Gibbs approach. Again we use the Ornstein-Zernike equation to formulate a local theory of critical phenomena. First, with regard to experimentally established quantities this theory yields precisely the same results as the Kadanoff-Wilson theory; second, the local approach allows the prediction of many previously unknown details of critical phenomena; and third, the local approach paves the way for constructing a unified theory of liquids that will describe the behavior of matter not only in the regular domain of the phase diagram, but also at the critical point and in its vicinity. (reviews of topical problems)
Application of nonequilibrium quantum statistical mechanics to homogeneous nucleation
International Nuclear Information System (INIS)
Larson, A.R.; Cantrell, C.D.
1978-01-01
The master equation for cluster growth and evaporation is derived from many-body quantum mechanics and from a modified version of quantum damping theory used in laser physics. For application to nucleation theory, the quantum damping theory has been generalized to include system and reservoir states that are not separate entities. Formulae for rate constants are obtained. Solutions of the master equation yield equations of state and system-averaged quantities recognized as thermodynamic variables. Formulae for Helmholtz free energies of clusters in a Debye approximation are derived. Coexistence-line equations for pressure, volume, and number of clusters are obtained from equations-of-state analysis. Coexistence-line and surface-tension data are used to obtain values of parameters for the Debye approximation. These data are employed in calculating both the nucleation current in diffusion cloud chamber experiments and the onset of condensation in expansion nozzle experiments. Theoretical and experimental results are similar for both cloud-chamber and nozzle experiments, which measure water
Parallelism in computations in quantum and statistical mechanics
International Nuclear Information System (INIS)
Clementi, E.; Corongiu, G.; Detrich, J.H.
1985-01-01
Often very fundamental biochemical and biophysical problems defy simulation because of limitations in today's computers. We present and discuss a distributed system composed of two IBM 4341s and/or an IBM 4381 as front-end processors and ten FPS-164 attached array processors. This parallel system - called LCAP - presently has a peak performance of about 110 Mflops; extensions to higher performance are discussed. Presently, the system applications use a modified version of VM/SP as the operating system; a description of the modifications is given. Three application programs have been migrated from sequential to parallel: a molecular quantum mechanical program, a Metropolis Monte Carlo program and a molecular dynamics program. The parallel codes are briefly outlined. Use of these parallel codes has already opened up new capabilities for our research. The very positive performance comparisons with today's supercomputers allow us to conclude that parallel computers and programming, of the type we have considered, represent a pragmatic answer to many computationally intensive problems. (orig.)
Homogeneous nucleation: a problem in nonequilibrium quantum statistical mechanics
International Nuclear Information System (INIS)
1978-08-01
The master equation for cluster growth and evaporation is derived from many-body quantum mechanics and from a modified version of quantum damping theory used in laser physics. For application to nucleation theory, the quantum damping theory is generalized to include system and reservoir states that are not separate entities. Formulas for rate constants are obtained. Solutions of the master equation yield equations of state and system-averaged quantities recognized as thermodynamic variables. Formulas for Helmholtz free energies of clusters in a Debye approximation are derived. Coexistence-line equations for pressure, volume, and number of clusters are obtained from equations-of-state analysis. Coexistence-line and surface-tension data are used to obtain values of parameters for the Debye approximation. These data are employed in calculating both the nucleation current in diffusion cloud chamber experiments and the onset of condensation in expansion nozzle experiments. Theoretical and experimental results are similar for both cloud chamber and nozzle experiments, which measure water. Comparison with other theories reveals that classical theory only accidentally agrees with experiment and that the Helmholtz free-energy formula used in the Lothe--Pound theory is incomplete. 27 figures, 3 tables, 149 references
Cortical hierarchies perform Bayesian causal inference in multisensory perception.
Directory of Open Access Journals (Sweden)
Tim Rohe
2015-02-01
Full Text Available To form a veridical percept of the environment, the brain needs to integrate sensory signals from a common source but segregate those from independent sources. Thus, perception inherently relies on solving the "causal inference problem." Behaviorally, humans solve this problem optimally as predicted by Bayesian Causal Inference; yet, the underlying neural mechanisms are unexplored. Combining psychophysics, Bayesian modeling, functional magnetic resonance imaging (fMRI), and multivariate decoding in an audiovisual spatial localization task, we demonstrate that Bayesian Causal Inference is performed by a hierarchy of multisensory processes in the human brain. At the bottom of the hierarchy, in auditory and visual areas, location is represented on the basis that the two signals are generated by independent sources (= segregation). At the next stage, in posterior intraparietal sulcus, location is estimated under the assumption that the two signals are from a common source (= forced fusion). Only at the top of the hierarchy, in anterior intraparietal sulcus, the uncertainty about the causal structure of the world is taken into account and sensory signals are combined as predicted by Bayesian Causal Inference. Characterizing the computational operations of signal interactions reveals the hierarchical nature of multisensory perception in human neocortex. It unravels how the brain accomplishes Bayesian Causal Inference, a statistical computation fundamental for perception and cognition. Our results demonstrate how the brain combines information in the face of uncertainty about the underlying causal structure of the world.
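The Bayesian Causal Inference computation referred to above can be sketched for a Gaussian audiovisual toy model (in the spirit of the standard common-cause/independent-cause model; the noise and prior parameters below are illustrative assumptions, not values from the study):

```python
import numpy as np

def bci_estimate(xa, xv, sa=4.0, sv=1.0, sp=10.0, p_common=0.5):
    """Model-averaged auditory location estimate under Bayesian Causal
    Inference. xa, xv: noisy auditory and visual samples; sa, sv: sensory
    noise standard deviations; sp: prior N(0, sp**2); p_common: prior
    probability of a common cause. All parameter values are hypothetical."""
    va, vv, vp = sa**2, sv**2, sp**2
    D = va*vv + va*vp + vv*vp
    # marginal likelihood of (xa, xv) under a common cause (C = 1)...
    like1 = np.exp(-0.5 * ((xa - xv)**2 * vp + xa**2 * vv + xv**2 * va) / D) \
            / (2 * np.pi * np.sqrt(D))
    # ...and under two independent causes (C = 2)
    like2 = (np.exp(-0.5 * xa**2 / (va + vp)) / np.sqrt(2*np.pi*(va + vp))
             * np.exp(-0.5 * xv**2 / (vv + vp)) / np.sqrt(2*np.pi*(vv + vp)))
    post1 = like1 * p_common / (like1 * p_common + like2 * (1 - p_common))
    # forced-fusion estimate (C = 1) vs. segregated auditory estimate (C = 2)
    s_fused = (xa/va + xv/vv) / (1/va + 1/vv + 1/vp)
    s_segr = (xa/va) / (1/va + 1/vp)
    return post1, post1 * s_fused + (1 - post1) * s_segr

# Coincident cues support a common cause; widely discrepant cues do not.
p_close, _ = bci_estimate(0.0, 0.0)
p_far, _ = bci_estimate(8.0, -8.0)
```

The model-averaged estimate interpolates between fusion and segregation, which is exactly the hierarchy of computations the study localizes along the intraparietal sulcus.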
Non-Gaussian Methods for Causal Structure Learning.
Shimizu, Shohei
2018-05-22
Causal structure learning is one of the most exciting new topics in the fields of machine learning and statistics. In many empirical sciences, including prevention science, the causal mechanisms underlying various phenomena need to be studied. Nevertheless, in many cases, classical methods for causal structure learning are not capable of estimating the causal structure of variables. This is because they explicitly or implicitly assume Gaussianity of the data and typically utilize only the covariance structure. In many applications, however, non-Gaussian data are often obtained, which means that the data distribution may contain more information than the covariance matrix can capture. Thus, many new methods have recently been proposed for using the non-Gaussian structure of data to infer the causal structure of variables. This paper introduces prevention scientists to such causal structure learning methods, particularly those based on the linear, non-Gaussian, acyclic model known as LiNGAM. These non-Gaussian data analysis tools can fully estimate the underlying causal structures of variables, under certain assumptions, even in the presence of unobserved common causes. This feature is in contrast to other approaches. A simulated example is also provided.
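The key idea exploited by LiNGAM-type methods can be sketched for a single pair of variables (a minimal residual-independence sketch in the flavour of DirectLiNGAM, not the full algorithm): in the true causal direction the regressor is independent of the regression residual, while in the reverse direction it is not, and non-Gaussianity makes the difference detectable.

```python
import numpy as np

def lingam_direction(x, y):
    """Pairwise causal direction for a linear, acyclic, non-Gaussian pair.
    In the true direction cause and residual are independent, so a simple
    higher-order dependence score |corr(cause**2, residual**2)| is smaller
    (it vanishes for independent variables but not in the reverse fit)."""
    def score(c, e):
        c = (c - c.mean()) / c.std()
        e = (e - e.mean()) / e.std()
        r = e - np.corrcoef(c, e)[0, 1] * c          # regression residual
        return abs(np.corrcoef(c**2, r**2)[0, 1])
    return "x->y" if score(x, y) < score(y, x) else "y->x"

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 20_000)                   # non-Gaussian cause
y = 0.8 * x + rng.uniform(-1.0, 1.0, 20_000)         # linear effect
direction = lingam_direction(x, y)
```

With Gaussian noise both directions would score (near) zero and the direction would be unidentifiable, which is precisely the limitation of covariance-only methods noted in the abstract.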
International Nuclear Information System (INIS)
2005-01-01
For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown on the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees
International Nuclear Information System (INIS)
2001-01-01
For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
1999-01-01
For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
A statistical mechanical approach for the computation of the climatic response to general forcings
Directory of Open Access Journals (Sweden)
V. Lucarini
2011-01-01
Full Text Available The climate belongs to the class of non-equilibrium forced and dissipative systems, for which most results of quasi-equilibrium statistical mechanics, including the fluctuation-dissipation theorem, do not apply. In this paper we show for the first time how the Ruelle linear response theory, developed for studying rigorously the impact of perturbations on general observables of non-equilibrium statistical mechanical systems, can be applied with great success to analyze the climatic response to general forcings. The crucial value of the Ruelle theory lies in the fact that it allows one to compute the response of the system in terms of expectation values of explicit and computable functions of the phase space averaged over the invariant measure of the unperturbed state. We choose as test bed a classical version of the Lorenz 96 model, which, in spite of its simplicity, has a well-recognized prototypical value as it is a spatially extended one-dimensional model and presents the basic ingredients, such as dissipation, advection and the presence of an external forcing, of the actual atmosphere. We recapitulate the main aspects of the general response theory and propose some new general results. We then analyze the frequency dependence of the response of both local and global observables to perturbations having localized as well as global spatial patterns. We derive analytically several properties of the corresponding susceptibilities, such as asymptotic behavior, validity of Kramers-Kronig relations, and sum rules, whose main ingredient is the causality principle. We show that all the coefficients of the leading asymptotic expansions as well as the integral constraints can be written as linear functions of parameters that describe the unperturbed properties of the system, such as its average energy. Some newly obtained empirical closure equations for such parameters allow one to define such properties as an explicit function of the unperturbed forcing.
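The Kramers-Kronig relations invoked above, whose only ingredient is causality, can be checked numerically on a minimal causal susceptibility chi(omega) = 1/(1 - i*omega*tau) (a generic sketch, unrelated to the paper's Lorenz 96 computations):

```python
import numpy as np

def kk_real_from_imag(im_chi, w, w_eval):
    """Reconstruct Re(chi) at w_eval from Im(chi) via the Kramers-Kronig
    relation  Re chi(w) = (2/pi) P int_0^inf w' Im chi(w') / (w'^2 - w^2) dw'.
    The midpoint grid w straddles w_eval symmetrically, so the principal
    value is handled by plain summation (singular contributions cancel)."""
    dw = w[1] - w[0]
    return (2.0 / np.pi) * np.sum(w * im_chi / (w**2 - w_eval**2)) * dw

tau = 1.0
w = (np.arange(200_000) + 0.5) * 1e-3            # midpoint grid up to omega = 200
im_chi = w * tau / (1.0 + (w * tau)**2)          # Im of chi = 1/(1 - i*w*tau)
re_kk = kk_real_from_imag(im_chi, w, w_eval=1.0) # exact Re chi(1) is 0.5
```

The reconstructed real part agrees with the analytic value up to truncation and discretization errors, illustrating how causality alone fixes the dispersive part of a susceptibility.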
International Nuclear Information System (INIS)
2003-01-01
For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products
International Nuclear Information System (INIS)
2004-01-01
For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
Lucas, J.R.
1984-01-01
Originating from lectures given to first year undergraduates reading physics and philosophy or mathematics and philosophy, formal logic is applied to issues and the elucidation of problems in space, time and causality. No special knowledge of relativity theory or quantum mechanics is needed. The text is interspersed with exercises and each chapter is preceded by a suggested 'preliminary reading' and followed by 'further reading' references. (U.K.)
STATISTICAL DISTRIBUTION PATTERNS IN MECHANICAL AND FATIGUE PROPERTIES OF METALLIC MATERIALS
Tatsuo, SAKAI; Masaki, NAKAJIMA; Keiro, TOKAJI; Norihiko, HASEGAWA; Department of Mechanical Engineering, Ritsumeikan University; Department of Mechanical Engineering, Toyota College of Technology; Department of Mechanical Engineering, Gifu University; Department of Mechanical Engineering, Gifu University
1997-01-01
Many papers on the statistical aspect of materials strength have been collected and reviewed by The Research Group for Statistical Aspects of Materials Strength. A book, "Statistical Aspects of Materials Strength", was written by this group and published in 1992. Based on the experimental data compiled in this book, distribution patterns of mechanical properties are systematically surveyed, paying attention to metallic materials. Thus one can obtain the fundamental knowledge for a reliabilit...
Principles of classical statistical mechanics: A perspective from the notion of complementarity
International Nuclear Information System (INIS)
Velazquez Abad, Luisberis
2012-01-01
Quantum mechanics and classical statistical mechanics are two physical theories that share several analogies in their mathematical apparatus and physical foundations. In particular, classical statistical mechanics is hallmarked by the complementarity between two descriptions that are unified in thermodynamics: (i) the parametrization of the system macrostate in terms of mechanical macroscopic observables I = (I_i), and (ii) the dynamical description that explains the evolution of a system towards thermodynamic equilibrium. As expected, such a complementarity is related to the uncertainty relations of classical statistical mechanics, ΔI_i Δη_i ≥ k. Here, k is the Boltzmann constant and η_i = ∂S(I|θ)/∂I_i are the restituting generalized forces derived from the entropy S(I|θ) of a closed system, which is found in an equilibrium situation driven by certain control parameters θ = (θ_α). These arguments constitute the central ingredients of a reformulation of classical statistical mechanics from the notion of complementarity. In this new framework, the Einstein postulate of classical fluctuation theory, dp(I|θ) ∼ exp[S(I|θ)/k]dI, appears as the correspondence principle between classical statistical mechanics and thermodynamics in the limit k → 0, while the existence of uncertainty relations can be associated with the non-commuting character of certain operators. - Highlights: ► There exists a direct analogy between quantum and classical statistical mechanics. ► The statistical form of the Le Chatelier principle leads to the uncertainty principle. ► The Einstein postulate is simply the correspondence principle. ► Complementary quantities are associated with non-commuting operators.
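The uncertainty relation ΔI Δη ≥ k can be checked by sampling a toy Gaussian entropy (an illustrative choice of S(I), not from the paper), for which the bound is exactly saturated:

```python
import numpy as np

# Toy entropy S(I) = -k * I**2 / (2 * sigma**2). Einstein's postulate
# dp ~ exp(S/k) dI then makes I normally distributed with standard
# deviation sigma; the restituting force is eta = dS/dI = -k * I / sigma**2,
# so the product of the two spreads equals k exactly (the minimum-uncertainty
# case of the classical relation dI * d_eta >= k).
k, sigma = 1.38e-23, 2.0
rng = np.random.default_rng(1)
I = rng.normal(0.0, sigma, 100_000)     # fluctuations drawn from exp(S/k)
eta = -k * I / sigma**2                 # conjugate restituting force
product = I.std() * eta.std()           # sampling estimate of dI * d_eta
```

Because eta is here a linear function of I, the product equals k up to sampling error; non-Gaussian entropies would give a strictly larger product, in line with the inequality.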
International Nuclear Information System (INIS)
Oksengendler, B. L.; Turaeva, N. N.; Uralov, I.; Marasulov, M. B.
2012-01-01
The effect of multiple exciton generation is analyzed based on statistical physics, quantum mechanics, and synergetics. Statistical problems of the effect of multiple exciton generation (MEG) are broadened to take into account not only exciton generation but also background excitation. The study of the role of surface states of quantum dots is based on the synergy of self-catalyzed electronic reactions. An analysis of the MEG mechanism is based on the idea of electronic shaking, using the sudden perturbation method in quantum mechanics. All of the above-mentioned results are applied to the problem of calculating the limiting efficiency of transforming solar energy into electric energy. (authors)
Regression to Causality : Regression-style presentation influences causal attribution
DEFF Research Database (Denmark)
Bordacconi, Mats Joe; Larsen, Martin Vinæs
2014-01-01
Our experiment implies that scholars using regression models – one of the primary vehicles for analyzing statistical results in political science – encourage causal interpretation. Specifically, we demonstrate that presenting observational results in a regression model, rather than as a simple comparison of means, makes causal interpretation of the results more likely. Our experiment drew on a sample of 235 university students from three different social science degree programs (political science, sociology and economics), all of whom had received substantial training in statistics. The subjects were asked to compare and evaluate the validity of equivalent results presented as either regression models or as a test of two sample means. Our experiment shows that the subjects who were presented with results as estimates from a regression model were more inclined to interpret these results causally.
Causally nonseparable processes admitting a causal model
International Nuclear Information System (INIS)
Feix, Adrien; Araújo, Mateus; Brukner, Caslav
2016-01-01
A recent framework of quantum theory with no global causal order predicts the existence of ‘causally nonseparable’ processes. Some of these processes produce correlations incompatible with any causal order (they violate so-called ‘causal inequalities’, analogous to Bell inequalities) while others do not (they admit a ‘causal model’, analogous to a local model). Here we show for the first time that bipartite causally nonseparable processes with a causal model exist, and give evidence that they have no clear physical interpretation. We also provide an algorithm to generate processes of this kind and show that they have nonzero measure in the set of all processes. We demonstrate the existence of processes which stop violating causal inequalities but are still causally nonseparable when mixed with a certain amount of ‘white noise’. This is reminiscent of the behavior of Werner states in the context of entanglement and nonlocality. Finally, we provide numerical evidence for the existence of causally nonseparable processes which have a causal model even when extended with an entangled state shared among the parties. (paper)
Dynamics of Quantum Causal Structures
Castro-Ruiz, Esteban; Giacomini, Flaminia; Brukner, Časlav
2018-01-01
It was recently suggested that causal structures are both dynamical, because of general relativity, and indefinite, because of quantum theory. The process matrix formalism furnishes a framework for quantum mechanics on indefinite causal structures, where the order between operations of local laboratories is not definite (e.g., one cannot say whether the operation in laboratory A occurs before or after the operation in laboratory B). Here, we develop a framework for "dynamics of causal structures," i.e., for transformations of process matrices into process matrices. We show that, under continuous and reversible transformations, the causal order between operations is always preserved. However, the causal order between a subset of operations can be changed under continuous yet nonreversible transformations. An explicit example is that of the quantum switch, where a party in the past affects the causal order of operations of future parties, leading to a transition from a channel from A to B, via superposition of causal orders, to a channel from B to A. We generalize our framework to construct a hierarchy of quantum maps based on transformations of process matrices and transformations thereof.
Dynamics of Quantum Causal Structures
Directory of Open Access Journals (Sweden)
Esteban Castro-Ruiz
2018-03-01
Full Text Available It was recently suggested that causal structures are both dynamical, because of general relativity, and indefinite, because of quantum theory. The process matrix formalism furnishes a framework for quantum mechanics on indefinite causal structures, where the order between operations of local laboratories is not definite (e.g., one cannot say whether the operation in laboratory A occurs before or after the operation in laboratory B). Here, we develop a framework for “dynamics of causal structures,” i.e., for transformations of process matrices into process matrices. We show that, under continuous and reversible transformations, the causal order between operations is always preserved. However, the causal order between a subset of operations can be changed under continuous yet nonreversible transformations. An explicit example is that of the quantum switch, where a party in the past affects the causal order of operations of future parties, leading to a transition from a channel from A to B, via superposition of causal orders, to a channel from B to A. We generalize our framework to construct a hierarchy of quantum maps based on transformations of process matrices and transformations thereof.
To the problem of the statistical basis of evaluation of the mechanical safety factor
International Nuclear Information System (INIS)
Tsyganov, S.V.
2009-01-01
The methodology applied for the safety factor assessment of the WWER fuel cycles uses the methods and terms of statistics. The value of the factor is calculated on the basis of an estimate of the probability of meeting predefined limits. Such an approach demands special attention to the statistical properties of the parameters of interest. Considering the mechanical constituents of the engineering factor, the uncertainty factors of the safety parameters are assumed to be stochastic values, characterized by probabilistic distributions that may be unknown. Traditionally, in the safety factor assessment process the unknown parameters are estimated conservatively. This paper analyses how important the refinement of the distribution parameters of the factors is for the assessment of the mechanical safety factor. For the analysis, a statistical approach is applied to model different types of probabilistic distributions of the factors. A significant influence of the shape and parameters of the distributions of some factors on the value of the mechanical safety factor is shown. (Authors)
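The sensitivity claimed above can be illustrated with a Monte Carlo toy model (all numbers hypothetical, not WWER data): two assumed distributions for an uncertainty factor with identical mean and variance but different shapes yield different quantiles, and hence different safety factors.

```python
import numpy as np

def safety_factor(samples, limit=1.0, q=0.95):
    """Toy definition for illustration: the safety factor is taken as the
    ratio of the design limit to the q-quantile of the sampled uncertainty
    factor. The limit, quantile level and distributions are assumptions."""
    return limit / np.quantile(samples, q)

rng = np.random.default_rng(2)
n = 200_000
# Same mean (0.8) and standard deviation (0.05), different shapes.
normal_f = rng.normal(0.8, 0.05, n)
half_width = 0.05 * np.sqrt(3.0)                       # matches the std
uniform_f = rng.uniform(0.8 - half_width, 0.8 + half_width, n)
sf_normal, sf_uniform = safety_factor(normal_f), safety_factor(uniform_f)
```

The bounded uniform distribution has a lower 95% quantile than the normal one despite identical first two moments, so the resulting factor differs on distribution shape alone, which is the paper's point.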
Statistical mechanics of directed models of polymers in the square lattice
Rensburg, J V
2003-01-01
Directed square lattice models of polymers and vesicles have received considerable attention in the recent mathematical and physical sciences literature. These are idealized geometric directed lattice models introduced to study phase behaviour in polymers, and include Dyck paths, partially directed paths, directed trees and directed vesicle models. Directed models are closely related to models studied in the combinatorics literature (and are often exactly solvable). They are also simplified versions of a number of statistical mechanics models, including the self-avoiding walk, lattice animals and lattice vesicles. The exchange of approaches and ideas between statistical mechanics and combinatorics has considerably advanced the description and understanding of directed lattice models, and this will be explored in this review. The combinatorial nature of directed lattice path models makes a study using generating function approaches most natural. In contrast, the statistical mechanics approach would introduce...
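The Dyck paths mentioned above illustrate the combinatorial side: their counts are the Catalan numbers, the coefficients of the corresponding generating function, and can be reproduced by a short dynamic program over path heights.

```python
from math import comb

def dyck_paths(n):
    """Count Dyck paths with 2n steps (up/down steps, never dipping below
    the axis, returning to height 0) by dynamic programming over height."""
    heights = {0: 1}                       # height -> number of paths so far
    for _ in range(2 * n):
        nxt = {}
        for h, c in heights.items():
            for h2 in (h + 1, h - 1):      # take one up-step or one down-step
                if h2 >= 0:                # forbid going below the axis
                    nxt[h2] = nxt.get(h2, 0) + c
        heights = nxt
    return heights.get(0, 0)

# The counts reproduce the Catalan numbers C_n = comb(2n, n) / (n + 1),
# i.e. the coefficients of the Dyck-path generating function.
assert all(dyck_paths(n) == comb(2 * n, n) // (n + 1) for n in range(10))
```

The same transfer-matrix-style recursion, weighted by step fugacities, is how the statistical mechanics approach would attach Boltzmann weights to these paths.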
To the problem of the statistical basis of evaluation of the mechanical safety factor
International Nuclear Information System (INIS)
Tsyganov, S.
2009-01-01
The methodology applied for the safety factor assessment of the VVER fuel cycles uses the methods and terms of statistics. The value of the factor is calculated on the basis of an estimate of the probability of meeting predefined limits. Such an approach demands special attention to the statistical properties of the parameters of interest. Considering the mechanical constituents of the engineering factor, the uncertainty factors of the safety parameters are assumed to be stochastic values, characterized by probabilistic distributions that may be unknown. Traditionally, in the safety factor assessment process the unknown parameters are estimated conservatively. This paper analyses how important the refinement of the distribution parameters of the factors is for the assessment of the mechanical safety factor. For the analysis, a statistical approach is applied to model different types of probabilistic distributions of the factors. A significant influence of the shape and parameters of the distributions of some factors on the value of the mechanical safety factor is shown. (author)
Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization
Eroglu, Sertac
2014-10-01
The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, was termed the statistical mechanical Menzerath-Altmann model. The derived model allows interpreting the model parameters in terms of physical concepts. We also propose that many organizations presenting the Menzerath-Altmann law behavior, whether linguistic or not, can be methodically examined by the transformed distribution model through the properly defined structure-dependent parameter and the energy-associated states.
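The classical Menzerath-Altmann law has the form y = a * x**b * exp(-c*x), which is linear in its parameters after taking logarithms; a fit therefore reduces to ordinary least squares (this sketch fits only the classical law on synthetic data, not the paper's generalized transformed model):

```python
import numpy as np

def fit_menzerath_altmann(x, y):
    """Fit the classical Menzerath-Altmann law y = a * x**b * exp(-c*x)
    by linear least squares in log space:
        ln y = ln a + b * ln x - c * x."""
    A = np.column_stack([np.ones_like(x), np.log(x), -x])
    (ln_a, b, c), *_ = np.linalg.lstsq(A, np.log(y), rcond=None)
    return np.exp(ln_a), b, c

x = np.arange(1.0, 11.0)                   # construct size (e.g. word length)
y = 2.0 * x**-0.3 * np.exp(-0.05 * x)      # synthetic mean constituent sizes
a, b, c = fit_menzerath_altmann(x, y)      # recovers (2.0, -0.3, 0.05)
```

On noisy corpus data the same regression gives the usual empirical parameter estimates, which the statistical mechanical model then reinterprets physically.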
Reward-Guided Learning with and without Causal Attribution
Jocham, Gerhard; Brodersen, Kay H.; Constantinescu, Alexandra O.; Kahn, Martin C.; Ianni, Angela M.; Walton, Mark E.; Rushworth, Matthew F.S.; Behrens, Timothy E.J.
2016-01-01
When an organism receives a reward, it is crucial to know which of many candidate actions caused this reward. However, recent work suggests that learning is possible even when this most fundamental assumption is not met. We used novel reward-guided learning paradigms in two fMRI studies to show that humans deploy separable learning mechanisms that operate in parallel. While behavior was dominated by precise contingent learning, it also revealed hallmarks of noncontingent learning strategies. These learning mechanisms were separable behaviorally and neurally. Lateral orbitofrontal cortex supported contingent learning and reflected contingencies between outcomes and their causal choices. Amygdala responses around reward times related to statistical patterns of learning. Time-based heuristic mechanisms were related to activity in sensorimotor corticostriatal circuitry. Our data point to the existence of several learning mechanisms in the human brain, of which only one relies on applying known rules about the causal structure of the task. PMID:26971947
Campbell's and Rubin's Perspectives on Causal Inference
West, Stephen G.; Thoemmes, Felix
2010-01-01
Donald Campbell's approach to causal inference (D. T. Campbell, 1957; W. R. Shadish, T. D. Cook, & D. T. Campbell, 2002) is widely used in psychology and education, whereas Donald Rubin's causal model (P. W. Holland, 1986; D. B. Rubin, 1974, 2005) is widely used in economics, statistics, medicine, and public health. Campbell's approach focuses on…
A General Approach to Causal Mediation Analysis
Imai, Kosuke; Keele, Luke; Tingley, Dustin
2010-01-01
Traditionally in the social sciences, causal mediation analysis has been formulated, understood, and implemented within the framework of linear structural equation models. We argue and demonstrate that this is problematic for 3 reasons: the lack of a general definition of causal mediation effects independent of a particular statistical model, the…
Ergodic theory, interpretations of probability and the foundations of statistical mechanics
van Lith, J.H.
2001-01-01
The traditional use of ergodic theory in the foundations of equilibrium statistical mechanics is that it provides a link between thermodynamic observables and microcanonical probabilities. First of all, the ergodic theorem demonstrates the equality of microcanonical phase averages and infinite time
An introduction to conformal invariance in quantum field theory and statistical mechanics
International Nuclear Information System (INIS)
Boyanovsky, D.; Naon, C.M.
1990-01-01
The subject of conformal invariance provides an extraordinarily successful and productive symbiosis between statistical mechanics and quantum field theory. The main goal of this paper, which is tailored to a wide audience, is to give an introduction to this vast subject. (C.P.)
The steady state of heterogeneous catalysis, studied by first-principles statistical mechanics
Reuter, K.; Frenkel, D.; Scheffler, M.
2004-01-01
The turnover frequency of the catalytic oxidation of CO at RuO2(110) was calculated as a function of temperature and partial pressures using ab initio statistical mechanics. The underlying energetics of the gas-phase molecules, dissociation, adsorption, surface diffusion, surface chemical reactions,
International Nuclear Information System (INIS)
Chudnovsky, D.V.; Chudnovsky, G.V.
1981-01-01
We consider general expressions of factorized S-matrices with Abelian symmetry expressed in terms of theta-functions. These expressions arise from representations of the Heisenberg group. New examples of factorized S-matrices lead to a large class of completely integrable models of statistical mechanics which generalize the XYZ-model of the eight-vertex model. (orig.)
Metamodelling Messages Conveyed in Five Statistical Mechanical Textbooks from 1936 to 2001
Niss, Martin
2009-01-01
Modelling is a significant aspect of doing physics and it is important how this activity is taught. This paper focuses on the explicit or implicit messages about modelling conveyed to the student in the treatments of phase transitions in statistical mechanics textbooks at beginning graduate level. Five textbooks from the 1930s to the present are…
Statistical mechanical perturbation theory of solid-vapor interfacial free energy
Kalikmanov, Vitalij Iosifovitsj; Hagmeijer, Rob; Venner, Cornelis H.
2017-01-01
The solid–vapor interfacial free energy γsv plays an important role in a number of physical phenomena, such as adsorption, wetting, and adhesion. We propose a closed form expression for the orientation averaged value of this quantity using a statistical mechanical perturbation approach developed in
Quantum Statistical Mechanics, L-Series and Anabelian Geometry I: Partition Functions
Marcolli, Matilde; Cornelissen, Gunther
2014-01-01
The zeta function of a number field can be interpreted as the partition function of an associated quantum statistical mechanical (QSM) system, built from abelian class field theory. We introduce a general notion of isomorphism of QSM-systems and prove that it preserves (extremal) KMS equilibrium
Energy Technology Data Exchange (ETDEWEB)
Cohen, E G.D.
1985-01-01
The following topics were dealt with: walks, walls and ordering in low dimensions; renormalisation of fluids; wetting transition; phases and phase transitions; liquid-vapour interface; statistical mechanics in lattice gauge theory; hydrodynamic instabilities; complex dynamics and chaos; dynamical transitions; phase separation and pattern formation; kinetic theory of clustering; localisation.
International Nuclear Information System (INIS)
Tsallis, Constantino; Tirnakli, Ugur
2010-01-01
We briefly review central concepts concerning nonextensive statistical mechanics, based on the nonadditive entropy S_q. Among others, we focus on possible realizations of the q-generalized Central Limit Theorem, including at the edge of chaos of the logistic map, and for quasi-stationary states of many-body long-range-interacting Hamiltonian systems.
Statistical evaluation of the mechanical properties of high-volume class F fly ash concretes
Yoon, Seyoon; Monteiro, Paulo J.M.; Macphee, Donald E.; Glasser, Fredrik P.; Imbabi, Mohammed Salah-Eldin
2014-01-01
The authors experimentally and statistically investigated the effects of mix-design factors on the mechanical properties of high-volume class F fly ash concretes. A total of 240 and 32 samples were produced and tested in the laboratory to measure compressive
International Nuclear Information System (INIS)
Testard, D.; Centre National de la Recherche Scientifique, 13 - Marseille
1977-09-01
For a finite non-zero temperature state in statistical mechanics it is proved that the factor obtained in the corresponding representation of the quasilocal algebra has the property of Araki. The same result also holds for the 'wedge-algebras' of a hermitian scalar Wightman field
Statistical and stochastic aspects of the delocalization problem in quantum mechanics
International Nuclear Information System (INIS)
Claverie, P.; Diner, S.
1976-01-01
The space-time behaviour of electrons in atoms and molecules is reviewed. The wave conception of the electron is criticized and the poverty of the non-reductionist attitude is underlined. Further, the two main interpretations of quantum mechanics are recalled: the Copenhagen and the Statistical Interpretations. The meaning and the successes of the Statistical Interpretation are explained, and it is shown that it does not solve all problems because quantum mechanics is irreducible to a classical statistical theory. The fluctuation of the particle number and its relationship to loge theory, delocalization and correlation is studied. Finally, different stochastic models for microphysics are reviewed. The Markovian Fenyes-Nelson process allows an interpretation of the original heuristic considerations of Schroedinger. Non-Markov processes with Schroedinger time evolution are shown to be equivalent to the base state analysis of Feynman, but they are unsatisfactory from a probabilistic point of view. Stochastic electrodynamics is presented as the most satisfactory conception nowadays.
Derivation of some new distributions in statistical mechanics using maximum entropy approach
Directory of Open Access Journals (Sweden)
Ray Amritansu
2014-01-01
Full Text Available The maximum entropy principle has earlier been used to derive the Bose-Einstein (B.E.), Fermi-Dirac (F.D.) and Intermediate Statistics (I.S.) distributions of statistical mechanics. The central idea of these distributions is to predict the distribution of the microstates, which are the particles of the system, on the basis of the knowledge of some macroscopic data. The latter information is specified in the form of some simple moment constraints. One distribution differs from the other in the way in which the constraints are specified. In the present paper, we have derived some new distributions similar to the B.E. and F.D. distributions of statistical mechanics by using the maximum entropy principle. Some proofs of the B.E. and F.D. distributions are shown, and at the end some new results are discussed.
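As a small numerical companion to the distributions named above (a sketch using the standard textbook occupation formulas, not code from the paper), one can check that the Bose-Einstein and Fermi-Dirac occupations both collapse onto the Maxwell-Boltzmann form in the dilute, high-energy limit exp(β(ε−μ)) ≫ 1:

```python
import numpy as np

def bose_einstein(eps, mu, beta):
    """Mean occupation of a level at energy eps (requires eps > mu)."""
    return 1.0 / (np.exp(beta * (eps - mu)) - 1.0)

def fermi_dirac(eps, mu, beta):
    """Mean occupation of a fermionic level; always between 0 and 1."""
    return 1.0 / (np.exp(beta * (eps - mu)) + 1.0)

def maxwell_boltzmann(eps, mu, beta):
    """Classical limit of both quantum distributions."""
    return np.exp(-beta * (eps - mu))

# at high energy all three occupations coincide
eps, mu, beta = 10.0, 0.0, 1.0
print(bose_einstein(eps, mu, beta), fermi_dirac(eps, mu, beta),
      maxwell_boltzmann(eps, mu, beta))
```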
Statistical mechanics provides novel insights into microtubule stability and mechanism of shrinkage.
Directory of Open Access Journals (Sweden)
Ishutesh Jain
2015-02-01
Microtubules are nano-machines that grow and shrink stochastically, making use of the coupling between the chemical kinetics and the mechanics of their constituent protofilaments (PFs). We investigate the stability and shrinkage of microtubules taking into account inter-protofilament interactions and the bending interactions of intrinsically curved PFs. Computing the free energy as a function of PF tip position, we show that the competition between curvature energy, inter-PF interaction energy and entropy leads to a rich landscape with a series of minima that repeat over a length-scale determined by the intrinsic curvature. Computing Langevin dynamics of the tip through the landscape and accounting for depolymerization, we calculate the average unzippering and shrinkage velocities of GDP protofilaments and compare them with the experimentally known results. Our analysis predicts that the strength of the inter-PF interaction (E_s^m) has to be comparable to the strength of the curvature energy (E_b^m) such that E_s^m − E_b^m ≈ 1 k_BT, and questions the prevalent notion that unzippering results from the domination of bending energy of curved GDP PFs. Our work demonstrates how the shape of the free energy landscape is crucial in explaining the mechanism of MT shrinkage, where the unzippered PFs will fluctuate in a set of partially peeled-off states and subunit dissociation will reduce the length.
Granger Causality and Unit Roots
DEFF Research Database (Denmark)
Rodríguez-Caballero, Carlos Vladimir; Ventosa-Santaulària, Daniel
2014-01-01
The asymptotic behavior of the Granger-causality test under stochastic nonstationarity is studied. Our results confirm that the inference drawn from the test is not reliable when the series are integrated of order one. In the presence of deterministic components, the test statistic diverges, eventually rejecting the null hypothesis, even when the series are independent of each other. Moreover, controlling for these deterministic elements (in the auxiliary regressions of the test) does not preclude the possibility of drawing erroneous inferences. Granger-causality tests should not be used under…
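The pitfall the abstract warns about is easy to reproduce numerically. The sketch below (illustrative, not the authors' code) implements a lag-1 Granger F-test in levels with plain OLS and applies it to two independent random walks with drift; for integrated series this statistic tends to grow with the sample size, inviting spurious rejections:

```python
import numpy as np

def granger_f_lag1(y, x):
    """F statistic for H0: one lag of x adds nothing to predicting y (levels)."""
    Y, ly, lx = y[1:], y[:-1], x[:-1]

    def rss(design):
        beta, *_ = np.linalg.lstsq(design, Y, rcond=None)
        resid = Y - design @ beta
        return resid @ resid

    ones = np.ones_like(Y)
    rss_r = rss(np.column_stack([ones, ly]))        # restricted model
    rss_u = rss(np.column_stack([ones, ly, lx]))    # unrestricted model
    dof = len(Y) - 3
    return (rss_r - rss_u) / (rss_u / dof)

rng = np.random.default_rng(42)
n = 500
x = np.cumsum(0.1 + rng.standard_normal(n))  # independent I(1) series
y = np.cumsum(0.1 + rng.standard_normal(n))  # with deterministic drift

print("F =", granger_f_lag1(y, x))
```

Since the two walks share a deterministic trend, the F statistic here is typically far above the usual critical value even though neither series causes the other, which is exactly the spurious inference the paper analyzes.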
International Nuclear Information System (INIS)
Haken, H.
1980-01-01
In the development of statistical mechanics we can more or less distinguish between two steps. First of all, the main objective of statistical mechanics had been to give thermodynamics a solid theoretical basis starting from the microscopic world. The next step was then performed in parallel with the development of irreversible thermodynamics, dealing with processes close to thermal equilibrium. The problems dealt with here are mainly transport and relaxation processes. Over the past years it has become apparent that there is a third field, namely processes far away from thermal equilibrium. The particular interest in those processes stems from the fact that in such situations order can be generated on a macroscopic scale. The ordered states can be maintained by a flux of energy or matter through these systems. In the realm of synergetics we have studied numerous examples and we now know that the occurrence of many of the ordered structures is governed by the same basic principles. (author)
Statistical mechanics of dense plasmas and implications for the plasma polarization shift
International Nuclear Information System (INIS)
Rogers, F.J.
1984-01-01
A brief description of the statistical mechanics of reacting, dense plasmas is given. The results do not support a Debye-like polarization shift at low density. It is shown that the electronic charge density factors into a strongly quantum mechanical part, which is not much affected by many-body correlations, and a weakly quantum mechanical part, which is considerably affected by many-body correlations. The few-body charge density is obtained from direct solution of the Schroedinger equation and the many-body charge density is obtained from the hypernetted chain equation through the introduction of a pseudopotential.
Causal ubiquity in quantum physics. A superluminal and local-causal physical ontology
International Nuclear Information System (INIS)
Neelamkavil, Raphael
2014-01-01
A fixed highest criterial velocity (of light) in STR (special theory of relativity) is a convention for a layer of physical inquiry. QM (Quantum Mechanics) avoids action-at-a-distance using this concept, but accepts non-causality and action-at-a-distance in EPR (Einstein-Podolsky-Rosen paradox) entanglement experiments. Even in such allegedly non-causal processes, something exists processually in extension-motion, between the causal and the non-causal. If STR theoretically allows real-valued superluminal communication between EPR entangled particles, quantum processes become fully causal. That is, the QM world is sub-luminally, luminally and superluminally local-causal throughout, and the Law of Causality is ubiquitous in the micro-world. Thus, "probabilistic causality" is a merely epistemic term.
Frisch, Mathias
2014-01-01
Much has been written on the role of causal notions and causal reasoning in the so-called 'special sciences' and in common sense. But does causal reasoning also play a role in physics? Mathias Frisch argues that, contrary to what influential philosophical arguments purport to show, the answer is yes. Time-asymmetric causal structures are as integral a part of the representational toolkit of physics as a theory's dynamical equations. Frisch develops his argument partly through a critique of anti-causal arguments and partly through a detailed examination of actual examples of causal notions in physics, including causal principles invoked in linear response theory and in representations of radiation phenomena. Offering a new perspective on the nature of scientific theories and causal reasoning, this book will be of interest to professional philosophers, graduate students, and anyone interested in the role of causal thinking in science.
Jones, Robert
2010-03-01
There are a wide range of views on causality. To some (e.g. Karl Popper) causality is superfluous. Bertrand Russell said "In advanced science the word cause never occurs. Causality is a relic of a bygone age." At the other extreme, Rafael Sorkin and L. Bombelli suggest that space and time do not exist but are only an approximation to a reality that is simply a discrete ordered set, a "causal set." For them causality IS reality. Others, like Judea Pearl and Nancy Cartwright, are seeking to build a complex fundamental theory of causality (Causality, Cambridge Univ. Press, 2000). Or perhaps a theory of causality is simply the theory of functions. This is more or less my take on causality.
Many-body problem in quantum mechanics and quantum statistical mechanics
International Nuclear Information System (INIS)
Lee, T.D.; Yang, C.N.
1983-01-01
This is a progress report on some work concerning the quantum mechanical calculation of the fugacity coefficients b_l (which correspond to the classical cluster integrals) of a Bose, a Fermi, and a Boltzmann gas at low temperatures. A binary collision expansion method is developed which allows for the systematic calculation of b_l as expansions in powers of a/λ, where a represents the parameters of the dimensions of length that characterize the low-energy two-body collision and λ is the thermal wavelength. To any power of (a/λ) the calculation of any specific b_l is reduced to a finite number of quadratures. The method, therefore, is the low-temperature counterpart of the high-temperature expansion of b_l.
Directory of Open Access Journals (Sweden)
Cristina Puente Águeda
2011-10-01
Causality is a fundamental notion in every field of science. Since the times of Aristotle, causal relationships have been a matter of study as a way to generate knowledge and provide explanations. In this paper I review the notion of causality through different scientific areas such as physics, biology, engineering, etc. In the scientific area, causality is usually seen as a precise relation: the same cause always provokes the same effect. But in the everyday world, the links between cause and effect are frequently imprecise or imperfect in nature. Fuzzy logic offers an adequate framework for dealing with imperfect causality, so a few notions of fuzzy causality are introduced.
Assessing students' beliefs, emotions and causal attribution ...
African Journals Online (AJOL)
Keywords: academic emotion; belief; causal attribution; statistical validation; students' conceptions of learning … (Sadi & Lee, 2015), through their effect on motivation and learning strategies … to understand why they may or may not be doing.
Statistical mechanical analysis of the linear vector channel in digital communication
International Nuclear Information System (INIS)
Takeda, Koujin; Hatabu, Atsushi; Kabashima, Yoshiyuki
2007-01-01
A statistical mechanical framework to analyze linear vector channel models in digital wireless communication is proposed for a large system. The framework is a generalization of that proposed for code-division multiple-access systems in Takeda et al (2006 Europhys. Lett. 76 1193) and enables the analysis of the system in which the elements of the channel transfer matrix are statistically correlated with each other. The significance of the proposed scheme is demonstrated by assessing the performance of an existing model of multi-input multi-output communication systems
Mascha, Edward J; Dalton, Jarrod E; Kurz, Andrea; Saager, Leif
2013-10-01
In comparative clinical studies, a common goal is to assess whether an exposure, or intervention, affects the outcome of interest. However, just as important is to understand the mechanism(s) for how the intervention affects outcome. For example, if preoperative anemia was shown to increase the risk of postoperative complications by 15%, it would be important to quantify how much of that effect was due to patients receiving intraoperative transfusions. Mediation analysis attempts to quantify how much, if any, of the effect of an intervention on outcome goes through prespecified mediator, or "mechanism" variable(s), that is, variables sitting on the causal pathway between exposure and outcome. Effects of an exposure on outcome can thus be divided into direct and indirect, or mediated, effects. Mediation is claimed when two conditions are true: the exposure affects the mediator and the mediator (adjusting for the exposure) affects the outcome. Understanding how an intervention affects outcome can validate or invalidate one's original hypothesis and also facilitate further research to modify the responsible factors, and thus improve patient outcome. We discuss the proper design and analysis of studies investigating mediation, including the importance of distinguishing mediator variables from confounding variables, the challenge of identifying potential mediators when the exposure is chronic versus acute, and the requirements for claiming mediation. Simple designs are considered, as well as those containing multiple mediators, multiple outcomes, and mixed data types. Methods are illustrated with data collected by the National Surgical Quality Improvement Project (NSQIP) and utilized in a companion paper which assessed the effects of preoperative anemic status on postoperative outcomes.
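The direct/indirect decomposition described above can be made concrete on simulated data (a hypothetical linear example with made-up coefficients, not the NSQIP data). With OLS fits of the mediator M on exposure X, and of outcome Y on X and M, the indirect effect is the product a·b, and in the linear case the total effect from regressing Y on X alone equals direct plus indirect exactly:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000
x = rng.integers(0, 2, n).astype(float)         # binary exposure
m = 0.5 * x + rng.standard_normal(n)            # mediator model: a = 0.5
y = 0.3 * x + 0.7 * m + rng.standard_normal(n)  # outcome: direct c' = 0.3, b = 0.7

def ols(design, target):
    """Least-squares coefficients of target on the design matrix."""
    beta, *_ = np.linalg.lstsq(design, target, rcond=None)
    return beta

ones = np.ones(n)
a = ols(np.column_stack([ones, x]), m)[1]        # exposure -> mediator
cp, b = ols(np.column_stack([ones, x, m]), y)[1:3]  # direct effect and mediator effect
total = ols(np.column_stack([ones, x]), y)[1]    # total effect

indirect = a * b
print(f"direct={cp:.3f} indirect={indirect:.3f} total={total:.3f}")
```

The exact identity total = direct + indirect is a property of linear OLS with nested regressions; with nonlinear models or interactions, the counterfactual definitions of direct and indirect effects are needed instead.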
Lee, Hopin; Wiggers, John; Kamper, Steven J; Williams, Amanda; O'Brien, Kate M; Hodder, Rebecca K; Wolfenden, Luke; Yoong, Sze Lin; Campbell, Elizabeth; Haskins, Robin; Robson, Emma K; McAuley, James H; Williams, Christopher M
2017-07-03
Low back pain (LBP) and knee osteoarthritis (OA) are highly prevalent and disabling conditions that cause societal and economic impact worldwide. Two randomised controlled trials (RCTs) will evaluate the effectiveness of a multicomponent lifestyle intervention for patients with LBP and knee OA who are overweight or obese. The key targets of this intervention are to improve physical activity, modify diet and correct pain beliefs. These factors may explain how a lifestyle intervention exerts its effects on key patient-relevant outcomes: pain, disability and quality of life. The aim of this protocol is to describe a planned analysis of a mechanism evaluation for a lifestyle intervention for overweight or obese patients with LBP and knee OA. Causal mediation analyses of 2 two-armed RCTs. Both trials are part of a cohort-multiple RCT, embedded in routine health service delivery. In each respective trial, 160 patients with LBP and 120 patients with knee OA waiting for orthopaedic consultation will be randomised to a lifestyle intervention, or to remain part of the original cohort. The intervention consists of education and advice about the benefits of weight loss and physical activity, and the Australian New South Wales Get Healthy Service. All outcome measures including patient characteristics, primary and alternative mediators, outcomes, and potential confounders will be measured at baseline (T0). The primary mediator, weight, will be measured at 6 months post randomisation; alternative mediators including diet, physical activity and pain beliefs will be measured at 6 weeks post randomisation. All outcomes (pain, disability and quality of life) will be measured at 6 months post randomisation. Data will be analysed using causal mediation analysis with sensitivity analyses for sequential ignorability. All mediation models were specified a priori before completing data collection and without prior knowledge about the effectiveness of the intervention. The study is
Three-Dimensional Color Code Thresholds via Statistical-Mechanical Mapping
Kubica, Aleksander; Beverland, Michael E.; Brandão, Fernando; Preskill, John; Svore, Krysta M.
2018-05-01
Three-dimensional (3D) color codes have advantages for fault-tolerant quantum computing, such as protected quantum gates with relatively low overhead and robustness against imperfect measurement of error syndromes. Here we investigate the storage threshold error rates for bit-flip and phase-flip noise in the 3D color code (3DCC) on the body-centered cubic lattice, assuming perfect syndrome measurements. In particular, by exploiting a connection between error correction and statistical mechanics, we estimate the thresholds for 1D stringlike and 2D sheetlike logical operators to be p_3DCC^(1) ≃ 1.9% and p_3DCC^(2) ≃ 27.6%. We obtain these results by using parallel tempering Monte Carlo simulations to study the disorder-temperature phase diagrams of two new 3D statistical-mechanical models: the four- and six-body random coupling Ising models.
Equilibration, thermalisation, and the emergence of statistical mechanics in closed quantum systems.
Gogolin, Christian; Eisert, Jens
2016-05-01
We review selected advances in the theoretical understanding of complex quantum many-body systems with regard to emergent notions of quantum statistical mechanics. We cover topics such as equilibration and thermalisation in pure state statistical mechanics, the eigenstate thermalisation hypothesis, the equivalence of ensembles, non-equilibration dynamics following global and local quenches as well as ramps. We also address initial state independence, absence of thermalisation, and many-body localisation. We elucidate the role played by key concepts for these phenomena, such as Lieb-Robinson bounds, entanglement growth, typicality arguments, quantum maximum entropy principles and the generalised Gibbs ensembles, and quantum (non-)integrability. We put emphasis on rigorous approaches and present the most important results in a unified language.
Lehoucq, R B; Sears, Mark P
2011-09-01
The purpose of this paper is to derive the energy and momentum conservation laws of the peridynamic nonlocal continuum theory using the principles of classical statistical mechanics. The peridynamic laws allow the consideration of discontinuous motion, or deformation, by relying on integral operators. These operators sum forces and power expenditures separated by a finite distance and so represent nonlocal interaction. The integral operators replace the differential divergence operators conventionally used, thereby obviating special treatment at points of discontinuity. The derivation presented employs a general multibody interatomic potential, avoiding the standard assumption of a pairwise decomposition. The integral operators are also expressed in terms of a stress tensor and heat flux vector under the assumption that these fields are differentiable, demonstrating that the classical continuum energy and momentum conservation laws are consequences of the more general peridynamic laws. An important conclusion is that nonlocal interaction is intrinsic to continuum conservation laws when derived using the principles of statistical mechanics.
Statistical mechanics of neocortical interactions: Path-integral evolution of short-term memory
Ingber, Lester
1994-05-01
Previous papers in this series on the statistical mechanics of neocortical interactions (SMNI) have detailed a development from the relatively microscopic scales of neurons up to the macroscopic scales as recorded by electroencephalography (EEG), requiring an intermediate mesocolumnar scale to be developed at the scale of minicolumns (≈10^2 neurons) and macrocolumns (≈10^5 neurons). Opportunity was taken to view SMNI as sets of statistical constraints, not necessarily describing specific synaptic or neuronal mechanisms, on neuronal interactions, on some aspects of short-term memory (STM), e.g., its capacity, stability, and duration. A recently developed C-language code, PATHINT, provides a non-Monte Carlo technique for calculating the dynamic evolution of arbitrary-dimension (subject to computer resources) nonlinear Lagrangians, such as derived for the two-variable SMNI problem. Here, PATHINT is used to explicitly detail the evolution of the SMNI constraints on STM.
Statistical-mechanical lattice models for protein-DNA binding in chromatin
International Nuclear Information System (INIS)
Teif, Vladimir B; Rippe, Karsten
2010-01-01
Statistical-mechanical lattice models for protein-DNA binding are well established as a method to describe complex ligand binding equilibria measured in vitro with purified DNA and protein components. Recently, a new field of applications has opened up for this approach since it has become possible to experimentally quantify genome-wide protein occupancies in relation to the DNA sequence. In particular, the organization of the eukaryotic genome by histone proteins into a nucleoprotein complex termed chromatin has been recognized as a key parameter that controls the access of transcription factors to the DNA sequence. New approaches have to be developed to derive statistical-mechanical lattice descriptions of chromatin-associated protein-DNA interactions. Here, we present the theoretical framework for lattice models of histone-DNA interactions in chromatin and investigate the (competitive) DNA binding of other chromosomal proteins and transcription factors. The results have a number of applications for quantitative models for the regulation of gene expression.
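A minimal transfer-matrix toy in the spirit of such lattice models (an illustrative sketch, not the authors' chromatin framework): a ligand covering r = 2 lattice sites with statistical weight w on a ring of N sites has partition function Z = Tr T^N, which can be checked against direct enumeration for small N:

```python
import numpy as np

def transfer_matrix(w):
    # site states: 0 = empty, 1 = first site of a bound ligand, 2 = second site
    # rows index the current site's state, columns the next site's state;
    # the weight w is paid when a new ligand starts
    return np.array([[1.0, w, 0.0],
                     [0.0, 0.0, 1.0],
                     [1.0, w, 0.0]])

def partition_function(w, N):
    """Z for a ring of N sites with dimers of statistical weight w."""
    return np.trace(np.linalg.matrix_power(transfer_matrix(w), N))

# N = 4 ring, ligand length 2: direct enumeration gives Z = 1 + 4w + 2w^2
w = 0.5
print(partition_function(w, 4))  # 1 + 4*0.5 + 2*0.25 = 3.5
```

Mean ligand number then follows as w ∂ln Z/∂w, and competitive binding of several protein species extends the matrix with additional bound states, which is the basic mechanism such lattice descriptions exploit.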
Addressing the statistical mechanics of planet orbits in the solar system
Mogavero, Federico
2017-10-01
The chaotic nature of planet dynamics in the solar system suggests the relevance of a statistical approach to planetary orbits. In such a statistical description, the time-dependent position and velocity of the planets are replaced by the probability density function (PDF) of their orbital elements. It is natural to set up this kind of approach in the framework of statistical mechanics. In the present paper, I focus on the collisionless excitation of eccentricities and inclinations via gravitational interactions in a planetary system. The future planet trajectories in the solar system constitute the prototype of this kind of dynamics. I thus address the statistical mechanics of the solar system planet orbits and try to reproduce the PDFs numerically constructed by Laskar (2008, Icarus, 196, 1). I show that the microcanonical ensemble of the Laplace-Lagrange theory accurately reproduces the statistics of the giant planet orbits. To model the inner planets I then investigate the ansatz of equiprobability in the phase space constrained by the secular integrals of motion. The eccentricity and inclination PDFs of Earth and Venus are reproduced with no free parameters. Within the limitations of a stationary model, the predictions also show a reasonable agreement with Mars PDFs and that of Mercury inclination. The eccentricity of Mercury demands in contrast a deeper analysis. I finally revisit the random walk approach of Laskar to the time dependence of the inner planet PDFs. Such a statistical theory could be combined with direct numerical simulations of planet trajectories in the context of planet formation, which is likely to be a chaotic process.
A study of outliers in statistical distributions of mechanical properties of structural steels
International Nuclear Information System (INIS)
Oefverbeck, P.; Oestberg, G.
1977-01-01
The safety against failure of pressure vessels can be assessed by statistical methods, so-called probabilistic fracture mechanics. The data base for such estimations is admittedly rather meagre, making it necessary to assume certain conventional statistical distributions. Since the failure rates arrived at are low, for nuclear vessels of the order of 10 - to 10 - per year, the extremes of the variables involved, among other things the mechanical properties of the steel used, are of particular interest. A question sometimes raised is whether outliers, or values exceeding the extremes in the assumed distributions, might occur. In order to explore this possibility, a study has been made of strength values of three qualities of structural steels, available in samples of up to about 12,000. Statistical evaluation of these samples with respect to outliers, using standard methods for this purpose, revealed the presence of such outliers in most cases, with a frequency of occurrence of, typically, a few values per thousand, estimated by the methods described. Obviously, statistical analysis alone cannot be expected to shed any light on the causes of outliers. Thus, the interpretation of these results with respect to their implications for the probabilistic estimation of the integrity of pressure vessels must await further studies of a similar nature in which the test specimens corresponding to outliers can be recovered and examined metallographically. For the moment the results should be regarded only as a factor to be considered in discussions of the safety of pressure vessels. (author)
International Nuclear Information System (INIS)
Ichinose, Shoichi
2010-01-01
A geometric approach to general quantum statistical systems (including the harmonic oscillator) is presented. It is applied to Casimir energy and the dissipative system with friction. We regard the (N+1)-dimensional Euclidean coordinate system (X^i, τ) as the quantum statistical system of N quantum (statistical) variables (X^i) and one Euclidean time variable (τ). Introducing paths (lines or hypersurfaces) in this space (X^i, τ), we adopt the path-integral method to quantize the mechanical system. This is a new view of (statistical) quantization of the mechanical system. The system Hamiltonian appears as the area. We show quantization is realized by the minimal area principle in the present geometric approach. When we take a line as the path, the path-integral expressions of the free energy are shown to be the ordinary ones (such as N harmonic oscillators) or their simple variation. When we take a hyper-surface as the path, the system Hamiltonian is given by the area of the hyper-surface which is defined as a closed-string configuration in the bulk space. In this case, the system becomes an O(N) non-linear model. We show the recently-proposed 5 dimensional Casimir energy (arXiv:0801.3064, arXiv:0812.1263) is valid. We apply this approach to the visco-elastic system, and present a new method using the path-integral for the calculation of the dissipative properties.
PREFACE: Counting Complexity: An international workshop on statistical mechanics and combinatorics
de Gier, Jan; Warnaar, Ole
2006-07-01
On 10-15 July 2005 the conference `Counting Complexity: An international workshop on statistical mechanics and combinatorics' was held on Dunk Island, Queensland, Australia in celebration of Tony Guttmann's 60th birthday. Dunk Island provided the perfect setting for engaging in almost all of Tony's life-long passions: swimming, running, food, wine and, of course, plenty of mathematics and physics. The conference was attended by many of Tony's close scientific friends from all over the world, and most talks were presented by his past and present collaborators. This volume contains the proceedings of the meeting and consists of 24 refereed research papers in the fields of statistical mechanics, condensed matter physics and combinatorics. These papers provide an excellent illustration of the breadth and scope of Tony's work. The very first contribution, written by Stu Whittington, contains an overview of the many scientific achievements of Tony over the past 40 years in mathematics and physics. The organizing committee, consisting of Richard Brak, Aleks Owczarek, Jan de Gier, Emma Lockwood, Andrew Rechnitzer and Ole Warnaar, gratefully acknowledges the Australian Mathematical Society (AustMS), the Australian Mathematical Sciences Institute (AMSI), the ARC Centre of Excellence for Mathematics and Statistics of Complex Systems (MASCOS), the ARC Complex Open Systems Research Network (COSNet), the Institute of Physics (IOP) and the Department of Mathematics and Statistics of The University of Melbourne for financial support in organizing the conference. Tony, we hope that your future years in mathematics will be numerous. Count yourself lucky! Tony Guttmann
Rehder, Bob
2017-01-01
This article assesses how people reason with categories whose features are related in causal cycles. Whereas models based on causal graphical models (CGMs) have enjoyed success modeling category-based judgments as well as a number of other cognitive phenomena, CGMs are only able to represent causal structures that are acyclic. A number of new…
The statistical mechanics of the classical two-dimensional Coulomb gas is exactly solved
International Nuclear Information System (INIS)
Samaj, L
2003-01-01
The model under consideration is a classical 2D Coulomb gas of pointlike positive and negative unit charges, interacting via a logarithmic potential. In the whole stability range of temperatures, the equilibrium statistical mechanics of this fluid model is exactly solvable via an equivalence with the integrable 2D sine-Gordon field theory. The exact solution includes the bulk thermodynamics, special cases of the surface thermodynamics and the large-distance asymptotic behaviour of the two-body correlation functions
Dimensionally regularized Tsallis' statistical mechanics and two-body Newton's gravitation
Zamora, J. D.; Rocca, M. C.; Plastino, A.; Ferri, G. L.
2018-05-01
Typical quantifiers of Tsallis' statistical mechanics, such as the partition function Z and the mean energy 〈U〉, exhibit poles. The poles appear at distinctive values of Tsallis' characteristic real parameter q, on a numerable set of rational numbers of the q-line. These poles are handled via dimensional regularization. The physical effects of these poles on the specific heats are studied here for the two-body classical gravitational potential.
Evaluation of mechanical properties of steel wire ropes by statistical methods
Directory of Open Access Journals (Sweden)
Boroška Ján
1999-12-01
Full Text Available The contribution deals with the evaluation of mechanical properties of steel wire ropes using statistical methods from the viewpoint of the quality of single wires as well as the internal construction of the wire ropes. The evaluation is based on the loading capacity calculated from the strength, number of folds and torsions. For better illustration, a box plot has been constructed.
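A box plot summarises a sample by its five-number summary; the sketch below computes one for a set of wire loading capacities. The data and units are invented for illustration only and are not taken from the paper.

```python
import statistics

def five_number_summary(values):
    """Five-number summary underlying a box plot: min, Q1, median, Q3, max."""
    q1, med, q3 = statistics.quantiles(values, n=4)  # default 'exclusive' method
    return min(values), q1, med, q3, max(values)

# Hypothetical loading capacities (kN) of single wires, for illustration only.
capacities = [4.1, 4.3, 4.2, 4.5, 4.4, 4.0, 4.6, 4.2, 4.3, 4.4]
print(five_number_summary(capacities))
```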
A reductionist perspective on quantum statistical mechanics: Coarse-graining of path integrals.
Sinitskiy, Anton V; Voth, Gregory A
2015-09-07
Computational modeling of the condensed phase based on classical statistical mechanics has been rapidly developing over the last few decades and has yielded important information on various systems containing up to millions of atoms. However, if a system of interest contains important quantum effects, well-developed classical techniques cannot be used. One way of treating finite temperature quantum systems at equilibrium has been based on Feynman's imaginary time path integral approach and the ensuing quantum-classical isomorphism. This isomorphism is exact only in the limit of infinitely many classical quasiparticles representing each physical quantum particle. In this work, we present a reductionist perspective on this problem based on the emerging methodology of coarse-graining. This perspective allows for the representations of one quantum particle with only two classical-like quasiparticles and their conjugate momenta. One of these coupled quasiparticles is the centroid particle of the quantum path integral quasiparticle distribution. Only this quasiparticle feels the potential energy function. The other quasiparticle directly provides the observable averages of quantum mechanical operators. The theory offers a simplified perspective on quantum statistical mechanics, revealing its most reductionist connection to classical statistical physics. By doing so, it can facilitate a simpler representation of certain quantum effects in complex molecular environments.
Introduction to conformal invariance in statistical mechanics and to random surface models
International Nuclear Information System (INIS)
David, F.
1995-01-01
In the first part of these lectures I give a brief and somewhat superficial introduction to the techniques of conformal invariance and to a few applications in statistical mechanics in two dimensions. My purpose is to introduce the basic ideas and some standard results for the students who are not familiar with the theory, and to introduce concepts and tools which will be useful for the other lecturers, rather than to give a complete and up to date review of the subject. In the second part I discuss several problems in the statistical mechanics of two dimensional random surfaces and membranes. As an introduction, I present some basic facts about the statistical mechanics of one-dimensional objects and polymers, which are classical examples of objects with critical properties. Then I emphasize the special role of curvature energy and of the elastic energy associated with the internal structure of membranes, and the corresponding models of random surfaces. Finally, I discuss the specific problem of self-avoiding tethered surfaces, whose critical properties are still poorly understood, and for which the applicability of some basic techniques of field theory, such as renormalization group calculations, has been understood only recently. (orig.)
A reductionist perspective on quantum statistical mechanics: Coarse-graining of path integrals
International Nuclear Information System (INIS)
Sinitskiy, Anton V.; Voth, Gregory A.
2015-01-01
Computational modeling of the condensed phase based on classical statistical mechanics has been rapidly developing over the last few decades and has yielded important information on various systems containing up to millions of atoms. However, if a system of interest contains important quantum effects, well-developed classical techniques cannot be used. One way of treating finite temperature quantum systems at equilibrium has been based on Feynman’s imaginary time path integral approach and the ensuing quantum-classical isomorphism. This isomorphism is exact only in the limit of infinitely many classical quasiparticles representing each physical quantum particle. In this work, we present a reductionist perspective on this problem based on the emerging methodology of coarse-graining. This perspective allows for the representations of one quantum particle with only two classical-like quasiparticles and their conjugate momenta. One of these coupled quasiparticles is the centroid particle of the quantum path integral quasiparticle distribution. Only this quasiparticle feels the potential energy function. The other quasiparticle directly provides the observable averages of quantum mechanical operators. The theory offers a simplified perspective on quantum statistical mechanics, revealing its most reductionist connection to classical statistical physics. By doing so, it can facilitate a simpler representation of certain quantum effects in complex molecular environments
On the geometry of the spin-statistics connection in quantum mechanics
Energy Technology Data Exchange (ETDEWEB)
Reyes, A.
2006-07-01
The Spin-Statistics theorem states that the statistics of a system of identical particles is determined by their spin: Particles of integer spin are Bosons (i.e. obey Bose-Einstein statistics), whereas particles of half-integer spin are Fermions (i.e. obey Fermi-Dirac statistics). Since the original proof by Fierz and Pauli, it has been known that the connection between Spin and Statistics follows from the general principles of relativistic Quantum Field Theory. In spite of this, there are different approaches to Spin-Statistics and it is not clear whether the theorem holds under assumptions that are different, and even less restrictive, than the usual ones (e.g. Lorentz-covariance). Additionally, in Quantum Mechanics there is a deep relation between indistinguishability and the geometry of the configuration space. This is clearly illustrated by Gibbs' paradox. Therefore, for many years efforts have been made in order to find a geometric proof of the connection between Spin and Statistics. Recently, various proposals have been put forward, in which an attempt is made to derive the Spin-Statistics connection from assumptions different from the ones used in the relativistic, quantum field theoretic proofs. Among these, there is the one due to Berry and Robbins (BR), based on the postulation of a certain single-valuedness condition, that has caused a renewed interest in the problem. In the present thesis, we consider the problem of indistinguishability in Quantum Mechanics from a geometric-algebraic point of view. An approach is developed to study configuration spaces Q having a finite fundamental group, that allows us to describe different geometric structures of Q in terms of spaces of functions on the universal cover of Q. In particular, it is shown that the space of complex continuous functions over the universal cover of Q admits a decomposition into C(Q)-submodules, labelled by the irreducible representations of the fundamental group of Q, that can be
Causal imprinting in causal structure learning.
Taylor, Eric G; Ahn, Woo-Kyoung
2012-11-01
Suppose one observes a correlation between two events, B and C, and infers that B causes C. Later one discovers that event A explains away the correlation between B and C. Normatively, one should now dismiss or weaken the belief that B causes C. Nonetheless, participants in the current study who observed a positive contingency between B and C followed by evidence that B and C were independent given A, persisted in believing that B causes C. The authors term this difficulty in revising initially learned causal structures "causal imprinting." Throughout four experiments, causal imprinting was obtained using multiple dependent measures and control conditions. A Bayesian analysis showed that causal imprinting may be normative under some conditions, but causal imprinting also occurred in the current study when it was clearly non-normative. It is suggested that causal imprinting occurs due to the influence of prior knowledge on how reasoners interpret later evidence. Consistent with this view, when participants first viewed the evidence showing that B and C are independent given A, later evidence with only B and C did not lead to the belief that B causes C. Copyright © 2012 Elsevier Inc. All rights reserved.
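The explaining-away structure underlying these experiments (B and C marginally correlated, but independent given A) can be sketched numerically from a small joint distribution. The probabilities below are invented for illustration and are not the contingencies used in the study.

```python
from itertools import product

# Hypothetical common-cause structure: A causes both B and C, so B and C
# are marginally correlated but independent given A (explaining away).
p_a = 0.5
p_b_given_a = {1: 0.9, 0: 0.1}   # P(B=1 | A=a)
p_c_given_a = {1: 0.9, 0: 0.1}   # P(C=1 | A=a)

joint = {}
for a, b, c in product((0, 1), repeat=3):
    pa = p_a if a == 1 else 1.0 - p_a
    pb = p_b_given_a[a] if b == 1 else 1.0 - p_b_given_a[a]
    pc = p_c_given_a[a] if c == 1 else 1.0 - p_c_given_a[a]
    joint[(a, b, c)] = pa * pb * pc

def prob(pred):
    """Probability of the event defined by pred(a, b, c)."""
    return sum(p for abc, p in joint.items() if pred(*abc))

# Marginally, observing B raises the probability of C...
p_c1 = prob(lambda a, b, c: c == 1)
p_c1_given_b1 = prob(lambda a, b, c: b == 1 and c == 1) / prob(lambda a, b, c: b == 1)
print(round(p_c1, 3), round(p_c1_given_b1, 3))  # prints: 0.5 0.82

# ...but given A, B carries no further information about C.
p_c1_given_a1 = prob(lambda a, b, c: a == 1 and c == 1) / prob(lambda a, b, c: a == 1)
p_c1_given_a1_b1 = (prob(lambda a, b, c: a == 1 and b == 1 and c == 1)
                    / prob(lambda a, b, c: a == 1 and b == 1))
print(round(p_c1_given_a1, 3), round(p_c1_given_a1_b1, 3))  # prints: 0.9 0.9
```

Normatively, a reasoner who learns the second pair of numbers should retract the belief that B causes C; causal imprinting is the observed failure to do so.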
Causal electromagnetic interaction equations
International Nuclear Information System (INIS)
Zinoviev, Yury M.
2011-01-01
For the electromagnetic interaction of two particles the relativistic causal quantum mechanics equations are proposed. These equations are solved for the case when the second particle moves freely. The initial wave functions are supposed to be smooth and rapidly decreasing at infinity. This condition is important for the convergence of the integrals similar to the integrals of quantum electrodynamics. We also consider the singular initial wave functions in the particular case when the second particle mass is equal to zero. The discrete energy spectrum of the first particle wave function is defined by the initial wave function of the free-moving second particle. By choosing the initial wave functions of the free-moving second particle, it is possible to obtain a practically arbitrary discrete energy spectrum.
Directory of Open Access Journals (Sweden)
Evgeni B. Starikov
2018-02-01
Full Text Available This work has shown the way to put a formal statistical-mechanical foundation under the hotly debated notion of enthalpy-entropy compensation. The possibility of writing down a universal equation of state based upon statistical mechanics is also discussed here.
International Nuclear Information System (INIS)
Gonchar, N.S.
1986-01-01
This paper presents a mathematical method developed for investigating a class of systems of infinite-dimensional integral equations which have application in statistical mechanics. Necessary and sufficient conditions are obtained for the uniqueness and bifurcation of the solution of this class of systems of equations. Problems of equilibrium statistical mechanics are considered on the basis of this method
A statistical mechanics model for free-for-all airplane passenger boarding
Energy Technology Data Exchange (ETDEWEB)
Steffen, Jason H.; /Fermilab
2008-08-01
I discuss a model for free-for-all passenger boarding which is employed by some discount air carriers. The model is based on the principles of statistical mechanics where each seat in the aircraft has an associated energy which reflects the preferences of travelers. As each passenger enters the airplane they select their seats using Boltzmann statistics, proceed to that location, load their luggage, sit down, and the partition function seen by remaining passengers is modified to reflect this fact. I discuss the various model parameters and make qualitative comparisons of this passenger boarding model with those that involve assigned seats. The model can be used to predict the probability that certain seats will be occupied at different times during the boarding process. These results might provide a useful description of this boarding method. The model is a relatively unusual application of undergraduate level physics and describes a situation familiar to many students and faculty.
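The seat-selection rule described above can be sketched in a few lines: each unoccupied seat carries an energy, an entering passenger samples a seat with Boltzmann weight, and the partition function seen by later passengers shrinks accordingly. The energy function and inverse-temperature parameter below are invented for illustration, not taken from the paper.

```python
import math
import random

def board(num_rows=10, seats_per_row=6, beta=1.0, seed=0):
    """Simulate free-for-all boarding: each entering passenger picks an
    unoccupied seat with Boltzmann probability proportional to exp(-beta*E)."""
    rng = random.Random(seed)

    def energy(row, col):
        # Hypothetical preference: front rows, and window (0, 5) or
        # aisle (2, 3) seats in a 6-across cabin, have lower energy.
        middle_penalty = 0.0 if col in (0, 2, 3, 5) else 1.0
        return row / num_rows + middle_penalty

    free = [(r, c) for r in range(num_rows) for c in range(seats_per_row)]
    order = []
    while free:
        weights = [math.exp(-beta * energy(r, c)) for r, c in free]
        total = sum(weights)          # partition function over remaining seats
        x, acc = rng.random() * total, 0.0
        for seat, w in zip(free, weights):
            acc += w
            if acc >= x:
                break
        free.remove(seat)             # occupying a seat updates the partition function
        order.append(seat)
    return order

order = board()
print(order[:3])  # first three (row, column) choices for the fixed seed
```

Repeating the simulation over many seeds would estimate the occupation probability of each seat over time, the quantity the model is used to predict.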
A statistical mechanics model for free-for-all airplane passenger boarding
Steffen, Jason H.
2008-12-01
I discuss a model for free-for-all passenger boarding which is employed by some discount air carriers. The model is based on the principles of statistical mechanics, where each seat in the aircraft has an associated energy which reflects the preferences of travelers. As each passenger enters the airplane they select their seats using Boltzmann statistics, proceed to that location, load their luggage, sit down, and the partition function seen by remaining passengers is modified to reflect this fact. I discuss the various model parameters and make qualitative comparisons of this passenger boarding model with those that involve assigned seats. The model can be used to predict the probability that certain seats will be occupied at different times during the boarding process. These results might provide a useful description of this boarding method. The model is a relatively unusual application of undergraduate level physics and describes a situation familiar to many students and faculty.
Statistical mechanics of gravitons in a box and the black hole entropy
Viaggiu, Stefano
2017-05-01
This paper is devoted to the study of the statistical mechanics of trapped gravitons obtained by 'trapping' a spherical gravitational wave in a box. As a consequence, a discrete spectrum dependent on the Legendre index ℓ, similar to that of the harmonic oscillator, is obtained and a statistical study is performed. The mean energy 〈E〉 results as a sum of two discrete Planck distributions with different dependent frequencies. As an important application, we derive the semiclassical Bekenstein-Hawking entropy formula for a static Schwarzschild black hole by only requiring that the black hole internal energy U is provided by its ADM rest energy, without invoking particular quantum gravity theories. This strongly suggests that the interior of a black hole can be composed of trapped gravitons at a thermodynamical temperature proportional, by a factor ≃ 2, to the horizon temperature T_h.
A statistical mechanics model for free-for-all airplane passenger boarding
International Nuclear Information System (INIS)
Steffen, Jason H.; Fermilab
2008-01-01
I discuss a model for free-for-all passenger boarding which is employed by some discount air carriers. The model is based on the principles of statistical mechanics where each seat in the aircraft has an associated energy which reflects the preferences of travelers. As each passenger enters the airplane they select their seats using Boltzmann statistics, proceed to that location, load their luggage, sit down, and the partition function seen by remaining passengers is modified to reflect this fact. I discuss the various model parameters and make qualitative comparisons of this passenger boarding model with those that involve assigned seats. The model can be used to predict the probability that certain seats will be occupied at different times during the boarding process. These results might provide a useful description of this boarding method. The model is a relatively unusual application of undergraduate level physics and describes a situation familiar to many students and faculty
Repair of Partly Misspecified Causal Diagrams.
Oates, Chris J; Kasza, Jessica; Simpson, Julie A; Forbes, Andrew B
2017-07-01
Errors in causal diagrams elicited from experts can lead to the omission of important confounding variables from adjustment sets and render causal inferences invalid. In this report, a novel method is presented that repairs a misspecified causal diagram through the addition of edges. These edges are determined using a data-driven approach designed to provide improved statistical efficiency relative to de novo structure learning methods. Our main assumption is that the expert is "directionally informed," meaning that "false" edges provided by the expert would not create cycles if added to the "true" causal diagram. The overall procedure is cast as a preprocessing technique that is agnostic to subsequent causal inferences. Results based on simulated data and data derived from an observational cohort illustrate the potential for data-assisted elicitation in epidemiologic applications. See video abstract at, http://links.lww.com/EDE/B208.
Directory of Open Access Journals (Sweden)
Sheng Wang
2007-10-01
Full Text Available The recent availability of low cost and miniaturized hardware has allowed wireless sensor networks (WSNs) to retrieve audio and video data in real world applications, which has fostered the development of wireless multimedia sensor networks (WMSNs). Resource constraints and challenging multimedia data volume make development of efficient algorithms to perform in-network processing of multimedia contents imperative. This paper proposes solving problems in the domain of WMSNs from the perspective of multi-agent systems. The multi-agent framework enables flexible network configuration and efficient collaborative in-network processing. The focus is placed on target classification in WMSNs where audio information is retrieved by microphones. To deal with the uncertainties related to audio information retrieval, the statistical approaches of power spectral density estimates, principal component analysis and Gaussian process classification are employed. A multi-agent negotiation mechanism is specially developed to efficiently utilize limited resources and simultaneously enhance classification accuracy and reliability. The negotiation is composed of two phases, where an auction based approach is first exploited to allocate the classification task among the agents and then individual agent decisions are combined by the committee decision mechanism. Simulation experiments with real world data are conducted and the results show that the proposed statistical approaches and negotiation mechanism not only reduce memory and computation requirements in WMSNs but also significantly enhance classification accuracy and reliability.
Oravec, Heather Ann; Daniels, Christopher C.
2014-01-01
The National Aeronautics and Space Administration has been developing a novel docking system to meet the requirements of future exploration missions to low-Earth orbit and beyond. A dynamic gas pressure seal is located at the main interface between the active and passive mating components of the new docking system. This seal is designed to operate in the harsh space environment, but is also to perform within strict loading requirements while maintaining an acceptable level of leak rate. In this study, a candidate silicone elastomer seal was designed, and multiple subscale test articles were manufactured for evaluation purposes. The force required to fully compress each test article at room temperature was quantified and found to be below the maximum allowable load for the docking system. However, a significant amount of scatter was observed in the test results. Due to the stochastic nature of the mechanical performance of this candidate docking seal, a statistical process control technique was implemented to isolate unusual compression behavior from typical mechanical performance. The results of this statistical analysis indicated a lack of process control, suggesting a variation in the manufacturing phase of the process. Further investigation revealed that changes in the manufacturing molding process had occurred which may have influenced the mechanical performance of the seal. This knowledge improves the chance of this and future space seals to satisfy or exceed design specifications.
Wang, Xue; Bi, Dao-wei; Ding, Liang; Wang, Sheng
2007-01-01
The recent availability of low cost and miniaturized hardware has allowed wireless sensor networks (WSNs) to retrieve audio and video data in real world applications, which has fostered the development of wireless multimedia sensor networks (WMSNs). Resource constraints and challenging multimedia data volume make development of efficient algorithms to perform in-network processing of multimedia contents imperative. This paper proposes solving problems in the domain of WMSNs from the perspective of multi-agent systems. The multi-agent framework enables flexible network configuration and efficient collaborative in-network processing. The focus is placed on target classification in WMSNs where audio information is retrieved by microphones. To deal with the uncertainties related to audio information retrieval, the statistical approaches of power spectral density estimates, principal component analysis and Gaussian process classification are employed. A multi-agent negotiation mechanism is specially developed to efficiently utilize limited resources and simultaneously enhance classification accuracy and reliability. The negotiation is composed of two phases, where an auction based approach is first exploited to allocate the classification task among the agents and then individual agent decisions are combined by the committee decision mechanism. Simulation experiments with real world data are conducted and the results show that the proposed statistical approaches and negotiation mechanism not only reduce memory and computation requirements in WMSNs but also significantly enhance classification accuracy and reliability. PMID:28903223
Monteiro, Mayra; Oliveira, Victor; Santos, Francisco; Barros Neto, Eduardo; Silva, Karyn; Silva, Rayane; Henrique, João; Chibério, Abimaelle
2017-08-01
In order to obtain cassava starch films with improved mechanical properties relative to the synthetic polymers used in packaging production, a complete 2³ factorial design was carried out to investigate which factors significantly influence the tensile strength of the biofilm. The factors investigated were the cassava starch, glycerol and modified clay contents. Modified bentonite clay was used as a filling material of the biofilm. Glycerol was the plasticizer used to thermoplasticize the cassava starch. The factorial analysis suggested a regression model capable of predicting the optimal mechanical property of the cassava starch film from the maximization of the tensile strength. The reliability of the regression model was tested by the correlation established with the experimental data through the following statistical analysis: the Pareto chart. The modified clay was the factor of greatest statistical significance on the observed response variable, being the factor that contributed most to the improvement of the mechanical property of the starch film. The factorial experiments showed that the interaction of glycerol with both modified clay and cassava starch was significant for the reduction of biofilm ductility. Modified clay and cassava starch contributed to the maximization of biofilm ductility, while glycerol contributed to its minimization.
Repeated causal decision making.
Hagmayer, York; Meder, Björn
2013-01-01
Many of our decisions refer to actions that have a causal impact on the external environment. Such actions may not only allow for the mere learning of expected values or utilities but also for acquiring knowledge about the causal structure of our world. We used a repeated decision-making paradigm to examine what kind of knowledge people acquire in such situations and how they use their knowledge to adapt to changes in the decision context. Our studies show that decision makers' behavior is strongly contingent on their causal beliefs and that people exploit their causal knowledge to assess the consequences of changes in the decision problem. A high consistency between hypotheses about causal structure, causally expected values, and actual choices was observed. The experiments show that (a) existing causal hypotheses guide the interpretation of decision feedback, (b) consequences of decisions are used to revise existing causal beliefs, and (c) decision makers use the experienced feedback to induce a causal model of the choice situation even when they have no initial causal hypotheses, which (d) enables them to adapt their choices to changes of the decision problem. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
Ingber, Lester
1991-09-01
A series of papers has developed a statistical mechanics of neocortical interactions (SMNI), deriving aggregate behavior of experimentally observed columns of neurons from statistical electrical-chemical properties of synaptic interactions. While not useful to yield insights at the single-neuron level, SMNI has demonstrated its capability in describing large-scale properties of short-term memory and electroencephalographic (EEG) systematics. The necessity of including nonlinear and stochastic structures in this development has been stressed. In this paper, a more stringent test is placed on SMNI: The algebraic and numerical algorithms previously developed in this and similar systems are brought to bear to fit large sets of EEG and evoked-potential data being collected to investigate genetic predispositions to alcoholism and to extract brain ``signatures'' of short-term memory. Using the numerical algorithm of very fast simulated reannealing, it is demonstrated that SMNI can indeed fit these data within experimentally observed ranges of its underlying neuronal-synaptic parameters, and the quantitative modeling results are used to examine physical neocortical mechanisms to discriminate high-risk and low-risk populations genetically predisposed to alcoholism. Since this study is a control to span relatively long time epochs, similar to earlier attempts to establish such correlations, this discrimination is inconclusive because of other neuronal activity which can mask such effects. However, the SMNI model is shown to be consistent with EEG data during selective attention tasks and with neocortical mechanisms describing short-term memory previously published using this approach. This paper explicitly identifies similar nonlinear stochastic mechanisms of interaction at the microscopic-neuronal, mesoscopic-columnar, and macroscopic-regional scales of neocortical interactions. These results give strong quantitative support for an accurate intuitive picture, portraying
Exploring the Connection Between Sampling Problems in Bayesian Inference and Statistical Mechanics
Pohorille, Andrew
2006-01-01
The Bayesian and statistical mechanical communities often share the same objective in their work - estimating and integrating probability distribution functions (pdfs) describing stochastic systems, models or processes. Frequently, these pdfs are complex functions of random variables exhibiting multiple, well separated local minima. Conventional strategies for sampling such pdfs are inefficient, sometimes leading to an apparent non-ergodic behavior. Several recently developed techniques for handling this problem have been successfully applied in statistical mechanics. In the multicanonical and Wang-Landau Monte Carlo (MC) methods, the correct pdfs are recovered from uniform sampling of the parameter space by iteratively establishing proper weighting factors connecting these distributions. Trivial generalizations allow for sampling from any chosen pdf. The closely related transition matrix method relies on estimating transition probabilities between different states. All these methods proved to generate estimates of pdfs with high statistical accuracy. In another MC technique, parallel tempering, several random walks, each corresponding to a different value of a parameter (e.g. "temperature"), are generated and occasionally exchanged using the Metropolis criterion. This method can be considered as a statistically correct version of simulated annealing. An alternative approach is to represent the set of independent variables as a Hamiltonian system. Considerable progress has been made in understanding how to ensure that the system obeys the equipartition theorem or, equivalently, that coupling between the variables is correctly described. Then a host of techniques developed for dynamical systems can be used. Among them, probably the most powerful is the Adaptive Biasing Force method, in which thermodynamic integration and biased sampling are combined to yield very efficient estimates of pdfs. The third class of methods deals with transitions between states described
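The parallel-tempering scheme described above (one Metropolis walker per temperature, with occasional replica-exchange swaps between neighbouring temperatures) can be sketched on a toy bimodal pdf. The step size, temperature ladder and target below are illustrative assumptions, not a production sampler.

```python
import math
import random

def parallel_tempering(logp, temps, steps=2000, step_size=1.0, seed=0):
    """Toy parallel tempering: one Metropolis walker per temperature,
    with occasional replica-exchange swaps between neighbouring chains."""
    rng = random.Random(seed)
    x = [0.0] * len(temps)          # one walker per temperature
    samples = []
    for t in range(steps):
        # Within-chain Metropolis updates on the tempered targets exp(logp/T).
        for i, T in enumerate(temps):
            prop = x[i] + rng.gauss(0.0, step_size)
            if math.log(rng.random() + 1e-300) < (logp(prop) - logp(x[i])) / T:
                x[i] = prop
        # Occasionally attempt a swap between adjacent temperatures.
        if t % 10 == 0:
            i = rng.randrange(len(temps) - 1)
            delta = (logp(x[i + 1]) - logp(x[i])) * (1.0 / temps[i] - 1.0 / temps[i + 1])
            if math.log(rng.random() + 1e-300) < delta:
                x[i], x[i + 1] = x[i + 1], x[i]
        samples.append(x[0])        # keep only the T = 1 chain
    return samples

# Bimodal target with two well-separated Gaussian modes at +/-4.
logp = lambda v: math.log(math.exp(-0.5 * (v - 4.0) ** 2)
                          + math.exp(-0.5 * (v + 4.0) ** 2) + 1e-300)
draws = parallel_tempering(logp, temps=[1.0, 2.0, 4.0, 8.0])
print(sum(1 for v in draws if v > 0) / len(draws))  # fraction of draws in the positive mode
```

The hot chains cross the barrier between the modes easily, and the swaps carry those crossings down to the T = 1 chain, which a single Metropolis walker would traverse only rarely.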
[Causal analysis approaches in epidemiology].
Dumas, O; Siroux, V; Le Moual, N; Varraso, R
2014-02-01
formulation of causal hypotheses, which will be a basis for all methodological choices. Beyond this step, recently developed statistical analysis tools offer new possibilities to delineate complex relationships, in particular in life course epidemiology. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
Portfolio selection problem with liquidity constraints under non-extensive statistical mechanics
International Nuclear Information System (INIS)
Zhao, Pan; Xiao, Qingxian
2016-01-01
In this study, we consider the optimal portfolio selection problem with liquidity limits. A portfolio selection model is proposed in which the risky asset price is driven by a process based on non-extensive statistical mechanics instead of the classic Wiener process. Using dynamic programming and Lagrange multiplier methods, we obtain the optimal policy and value function. Moreover, the numerical results indicate that this model differs considerably from the model based on the classic Wiener process: the optimal strategy is affected by the non-extensive parameter q, the investment in the risky asset grows faster for larger q, and the growth of wealth is similar.
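The non-extensive (Tsallis) driving process differs from the classic Wiener process chiefly through its heavy-tailed q-Gaussian increments. A q-Gaussian with 1 < q < 3 coincides with a rescaled Student-t with nu = (3 - q)/(q - 1) degrees of freedom, a standard identity used here as the basis of the sketch; the parameters (q = 1.5, the tail threshold) are illustrative, not the paper's.

```python
import math, random

def student_t_sample(nu, rng):
    """Draw from Student-t via normal / sqrt(chi-square / nu)."""
    z = rng.gauss(0.0, 1.0)
    chi2 = rng.gammavariate(nu / 2.0, 2.0)   # chi-square with nu d.o.f.
    return z / math.sqrt(chi2 / nu)

rng = random.Random(7)
n = 200_000
# q-Gaussian increments with q = 1.5 correspond to a rescaled Student-t
# with nu = (3 - q)/(q - 1) = 3 degrees of freedom
tail_q = sum(abs(student_t_sample(3.0, rng)) > 4.0 for _ in range(n)) / n
tail_gauss = sum(abs(rng.gauss(0.0, 1.0)) > 4.0 for _ in range(n)) / n
```

The heavy tail of the q-Gaussian driver (a few percent of increments beyond four standard units, versus essentially none for Gaussian increments) is what makes the resulting price dynamics, and hence the optimal policy, q-dependent.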
Shinzato, Takashi
2016-12-01
The portfolio optimization problem in which the variances of the return rates of assets are not identical is analyzed in this paper using the methodology of statistical mechanical informatics, specifically, replica analysis. We defined two characteristic quantities of an optimal portfolio, namely, minimal investment risk and investment concentration, in order to solve the portfolio optimization problem and analytically determined their asymptotic behavior using replica analysis. Numerical experiments were also performed, and a comparison between the results of our simulation and those obtained via replica analysis validated our proposed method.
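For the special case of a diagonal return covariance, the minimal investment risk that replica analysis characterizes can be checked directly: minimizing (1/2N) sum_i v_i w_i^2 under the budget constraint sum_i w_i = N gives a risk per asset equal to half the harmonic mean of the variances. The sketch below verifies this with a uniform variance distribution chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500
v = rng.uniform(0.5, 2.0, size=N)        # heterogeneous asset variances

# budget constraint sum(w) = N; Lagrange stationarity gives v_i * w_i = const
lam = N / np.sum(1.0 / v)
w = lam / v                               # optimal portfolio weights
risk_per_asset = 0.5 * np.sum(v * w**2) / N
concentration = np.sum(w**2) / N          # investment concentration

# closed form: minimal risk per asset = half the harmonic mean of v
eps_theory = 0.5 / np.mean(1.0 / v)
```

The concentration sum(w^2)/N is at least 1 (Cauchy-Schwarz), with equality only for equal weights, so any heterogeneity in v forces a concentrated portfolio.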
PREFACE: International Workshop on Statistical-Mechanical Informatics 2008 (IW-SMI 2008)
Hayashi, Masahito; Inoue, Jun-ichi; Kabashima, Yoshiyuki; Tanaka, Kazuyuki
2009-01-01
Statistical mechanical informatics (SMI) is an approach that applies physics to information science, in which many-body problems in information processing are tackled using statistical mechanics methods. In the last decade, the use of SMI has resulted in great advances in research into classical information processing, in particular, theories of information and communications, probabilistic inference and combinatorial optimization problems. It is expected that the success of SMI can be extended to quantum systems. The importance of many-body problems is also being recognized in quantum information theory (QIT), for which quantification of entanglement of bipartite systems has recently been almost completely established after considerable effort. SMI and QIT are sufficiently well developed that it is now appropriate to consider applying SMI to quantum systems and developing many-body theory in QIT. This combination of SMI and QIT is highly likely to contribute significantly to the development of both research fields. The International Workshop on Statistical-Mechanical Informatics has been organized in response to this situation. This workshop, held at Sendai International Conference Center, Sendai, Japan, 14-17 September 2008, and sponsored by the Grant-in-Aid for Scientific Research on Priority Areas `Deepening and Expansion of Statistical Mechanical Informatics (DEX-SMI)' (Head investigator: Yoshiyuki Kabashima, Tokyo Institute of Technology) (Project http://dex-smi.sp.dis.titech.ac.jp/DEX-SMI), was intended to provide leading researchers with strong interdisciplinary interests in QIT and SMI with the opportunity to engage in intensive discussions. The aim of the workshop was to expand SMI to quantum systems and QIT research on quantum (entangled) many-body systems, to discuss possible future directions, and to offer researchers the opportunity to exchange ideas that may lead to joint research initiatives. We would like to thank the contributors of the workshop
Statistical mechanics of Fermi-Pasta-Ulam chains with the canonical ensemble
Demirel, Melik C.; Sayar, Mehmet; Atılgan, Ali R.
1997-03-01
Low-energy vibrations of a Fermi-Pasta-Ulam-β (FPU-β) chain with 16 repeat units are analyzed with the aid of numerical experiments and the statistical mechanics equations of the canonical ensemble. Constant temperature numerical integrations are performed by employing the cubic coupling scheme of Kusnezov et al. [Ann. Phys. 204, 155 (1990)]. Very good agreement is obtained between numerical results and theoretical predictions for the probability distributions of the generalized coordinates and momenta both of the chain and of the thermal bath. It is also shown that the average energy of the chain scales linearly with the bath temperature.
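A minimal constant-temperature experiment in the spirit of this abstract: integrate an FPU-β chain of 16 units with a Langevin (BAOAB) thermostat, used here in place of the cubic-coupling scheme of Kusnezov et al., and check equipartition, i.e. that the mean kinetic energy per site approaches T/2 (k_B = 1, unit masses). All parameter values are illustrative assumptions.

```python
import math, random

def fpu_beta_langevin(n=16, beta=1.0, T=0.5, dt=0.01, steps=100_000,
                      gamma=0.5, seed=2):
    """Langevin (BAOAB) integration of a periodic FPU-beta chain.

    Potential: V = sum_i [ (x_{i+1}-x_i)^2 / 2 + beta*(x_{i+1}-x_i)^4 / 4 ].
    Returns the time-averaged kinetic energy per site.
    """
    rng = random.Random(seed)
    x, p = [0.0]*n, [0.0]*n

    def forces():
        f = [0.0]*n
        for i in range(n):
            d = x[(i+1) % n] - x[i]
            g = d + beta*d**3                 # tension of bond (i, i+1)
            f[i] += g
            f[(i+1) % n] -= g
        return f

    c1 = math.exp(-gamma*dt)
    c2 = math.sqrt((1.0 - c1*c1)*T)
    f = forces()
    acc, cnt = 0.0, 0
    for step in range(steps):
        p = [pi + 0.5*dt*fi for pi, fi in zip(p, f)]   # B: half kick
        x = [xi + 0.5*dt*pi for xi, pi in zip(x, p)]   # A: half drift
        p = [c1*pi + c2*rng.gauss(0, 1) for pi in p]   # O: thermostat
        x = [xi + 0.5*dt*pi for xi, pi in zip(x, p)]   # A: half drift
        f = forces()
        p = [pi + 0.5*dt*fi for pi, fi in zip(p, f)]   # B: half kick
        if step >= steps // 2:                         # discard transient
            acc += sum(pi*pi for pi in p) / (2*n)
            cnt += 1
    return acc / cnt

kinetic_per_site = fpu_beta_langevin()
```

For T = 0.5 the time-averaged kinetic energy per site should settle near T/2 = 0.25, the canonical-ensemble prediction the paper's distributions are tested against.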
Statistical mechanics of a one-component fluid of charged hard rods in 1D
International Nuclear Information System (INIS)
Vericat, F.; Blum, L.
1986-09-01
The statistical mechanics of a classical one-component system of charged hard rods in a neutralizing background is investigated in 1D, with emphasis on the effects of the hard-core interactions on the thermodynamics and the structure of the system. The crystalline character of the system at all temperatures and densities, and the absence of phase transitions, are shown by extending previous results of Baxter and Kunz on the one-component plasma of point particles. Explicit expressions for the thermodynamic functions and the one-particle correlation function are given in the limits of small and strong couplings. (author)
Justification of the density functional method in classical and quantum statistical mechanics
International Nuclear Information System (INIS)
Dinariev, O.Yu.
2000-01-01
The connection between the phenomenological description of a multi-component mixture based on an entropy functional with terms quadratic in the component-density and temperature gradients, on the one hand, and the description in the framework of classical and quantum statistical mechanics, on the other hand, was investigated. Explicit expressions for the entropy functional in the classical and quantum theory were derived. A quadratic approximation for the case of small disturbances of the uniform state was then calculated, in which the terms quadratic in the gradients were singled out. This permits calculation of the relevant phenomenological coefficients from first principles. (in Russian)
Statistical mechanics of neocortical interactions. Derivation of short-term-memory capacity
Ingber, Lester
1984-06-01
A theory developed by the author to describe macroscopic neocortical interactions demonstrates that empirical values of chemical and electrical parameters of synaptic interactions establish several minima of the path-integral Lagrangian as a function of excitatory and inhibitory columnar firings. The number of possible minima, their time scales of hysteresis and probable reverberations, and their nearest-neighbor columnar interactions are all consistent with well-established empirical rules of human short-term memory. Thus, aspects of conscious experience are derived from neuronal firing patterns, using modern methods of nonlinear nonequilibrium statistical mechanics to develop realistic explicit synaptic interactions.
Existence and uniqueness of Gibbs states for a statistical mechanical polyacetylene model
International Nuclear Information System (INIS)
Park, Y.M.
1987-01-01
One-dimensional polyacetylene is studied as a model of statistical mechanics. In a semiclassical approximation the system is equivalent to a quantum XY model interacting with unbounded classical spins in one-dimensional lattice space Z. By establishing uniform estimates, an infinite-volume-limit Hilbert space, a strongly continuous time evolution group of unitary operators, and an invariant vector are constructed. Moreover, it is proven that any infinite-limit state satisfies Gibbs conditions. Finally, a modification of Araki's relative entropy method is used to establish the uniqueness of Gibbs states.
Statistical mechanics of directed models of polymers in the square lattice
International Nuclear Information System (INIS)
Rensburg, E J Janse van
2003-01-01
Directed square lattice models of polymers and vesicles have received considerable attention in the recent mathematical and physical sciences literature. These are idealized geometric directed lattice models introduced to study phase behaviour in polymers, and include Dyck paths, partially directed paths, directed trees and directed vesicle models. Directed models are closely related to models studied in the combinatorics literature (and are often exactly solvable). They are also simplified versions of a number of statistical mechanics models, including the self-avoiding walk, lattice animals and lattice vesicles. The exchange of approaches and ideas between statistical mechanics and combinatorics has considerably advanced the description and understanding of directed lattice models, and this will be explored in this review. The combinatorial nature of directed lattice path models makes a study using generating function approaches most natural. In contrast, the statistical mechanics approach would introduce partition functions and free energies, and then investigate these using the general framework of critical phenomena. Generating function and statistical mechanics approaches are closely related. For example, questions regarding the limiting free energy may be approached by considering the radius of convergence of a generating function, and the scaling properties of thermodynamic quantities are related to the asymptotic properties of the generating function. In this review the methods for obtaining generating functions and determining free energies in directed lattice path models of linear polymers are presented. These methods include decomposition methods leading to functional recursions, as well as the Temperley method (that is implemented by creating a combinatorial object, one slice at a time). A constant term formulation of the generating function will also be reviewed. The thermodynamic features and critical behaviour in models of directed paths may be
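The slice-by-slice (Temperley-style) construction mentioned in this review can be illustrated on the simplest directed model, Dyck paths: build the path one step at a time, keeping the current height as the state, and compare the counts with the Catalan numbers C_n = C(2n, n)/(n+1).

```python
from math import comb

def dyck_count(n):
    """Count Dyck paths of length 2n by slice-by-slice transfer:
    the state is the current height above the baseline."""
    ways = [1] + [0]*n              # ways[h]: partial paths ending at height h
    for _ in range(2*n):
        new = [0]*(n + 1)
        for h, w in enumerate(ways):
            if w:
                if h + 1 <= n:
                    new[h + 1] += w   # up step
                if h - 1 >= 0:
                    new[h - 1] += w   # down step (never below the baseline)
        ways = new
    return ways[0]                  # paths returning to height 0

catalan = lambda n: comb(2*n, n) // (n + 1)
counts = [dyck_count(n) for n in range(1, 10)]
```

The same transfer recursion, decorated with fugacities for contacts or area, is how generating functions for the richer directed models of the review are typically built up.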
Non-extensive statistical mechanics and black hole entropy from quantum geometry
Directory of Open Access Journals (Sweden)
Abhishek Majhi
2017-12-01
Full Text Available Using non-extensive statistical mechanics, the Bekenstein-Hawking area law is obtained from microstates of black holes in loop quantum gravity, for arbitrary real positive values of the Barbero-Immirzi parameter (γ). The arbitrariness of γ is encoded in the strength of the 'bias' created in the horizon microstates through the coupling with the quantum geometric fields exterior to the horizon. An experimental determination of γ will fix this coupling, leaving the macroscopic area of the black hole as the only free quantity of the theory.
International Nuclear Information System (INIS)
Sorelli, Luca; Constantinides, Georgios; Ulm, Franz-Josef; Toutlemonde, Francois
2008-01-01
Advances in engineering the microstructure of cementitious composites have led to the development of fiber reinforced Ultra High Performance Concretes (UHPC). The scope of this paper is twofold: first, to characterize the nano-mechanical properties of the phases governing the UHPC microstructure by means of a novel statistical nanoindentation technique; then, to upscale those nanoscale properties, by means of continuum micromechanics, to the macroscopic scale of engineering applications. In particular, a combined investigation of nanoindentation, scanning electron microscope (SEM) and X-ray Diffraction (XRD) indicates that the fiber-matrix transition zone is relatively defect free. On this basis, a four-level multiscale model with defect-free interfaces makes it possible to determine the composite stiffness accurately from the measured nano-mechanical properties. Besides evidencing the dominant role of high density calcium silicate hydrates and the stiffening effect of residual clinker, the suggested model may become a useful tool for further optimizing cement-based engineered composites
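Statistical nanoindentation infers phase properties from a large grid of indents by deconvoluting the histogram of measured moduli into mechanically distinct phases. As a stand-in for the full mixture-model deconvolution, the sketch below separates a synthetic bimodal modulus data set with 1D two-cluster k-means; the modulus values (20 and 30 GPa) and spreads are hypothetical, not the paper's data.

```python
import random

def two_phase_kmeans(data, iters=50):
    """1D two-cluster k-means: a minimal stand-in for the mixture-model
    deconvolution used in statistical nanoindentation."""
    c = [min(data), max(data)]                # initial phase centroids
    for _ in range(iters):
        groups = ([], [])
        for m in data:
            # True (index 1) when m is closer to the upper centroid
            groups[abs(m - c[1]) < abs(m - c[0])].append(m)
        c = [sum(g)/len(g) if g else ci for g, ci in zip(groups, c)]
    return sorted(c)

# synthetic indentation moduli (GPa) from two hypothetical phases
rng = random.Random(5)
moduli = ([rng.gauss(20.0, 2.0) for _ in range(400)]
          + [rng.gauss(30.0, 2.0) for _ in range(400)])
centers = two_phase_kmeans(moduli)
```

With well-separated phases the recovered centroids land close to the true phase moduli; real deconvolution additionally fits volume fractions and per-phase variances.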
International Nuclear Information System (INIS)
Ng, Felix S.L.
2016-01-01
We develop a statistical-mechanical model of one-dimensional normal grain growth that does not require any drift-velocity parameterization for grain size, such as used in the continuity equation of traditional mean-field theories. The model tracks the population by considering grain sizes in neighbour pairs; the probability of a pair having neighbours of certain sizes is determined by the size-frequency distribution of all pairs. Accordingly, the evolution obeys a partial integro-differential equation (PIDE) over ‘grain size versus neighbour grain size’ space, so that the grain-size distribution is a projection of the PIDE's solution. This model, which is applicable before as well as after statistically self-similar grain growth has been reached, shows that the traditional continuity equation is invalid outside this state. During statistically self-similar growth, the PIDE correctly predicts the coarsening rate, invariant grain-size distribution and spatial grain size correlations observed in direct simulations. The PIDE is then reducible to the standard continuity equation, and we derive an explicit expression for the drift velocity. It should be possible to formulate similar parameterization-free models of normal grain growth in two and three dimensions.
International Nuclear Information System (INIS)
Daunys, Mykolas; Sniuolis, Raimondas
2006-01-01
About 300 welded-joint materials used in nuclear power engineering were tested under monotonic tension and low-cycle loading at Kaunas University of Technology, together with the St. Petersburg Central Research Institute of Structural Materials, in 1970-2000. The main mechanical, low-cycle loading and fracture characteristics of base metals, weld metals and some heat-affected zones of welded joints were determined during these experiments. Analytical dependences of low-cycle fatigue parameters on the mechanical characteristics of structural materials were proposed on the basis of this large body of experimental data, obtained with the same methods and testing equipment. When these dependences are used, expensive low-cycle fatigue tests may be omitted, and the low-cycle loading curve parameters and lifetime of structural materials can be computed from the main mechanical characteristics given in technical manuals. Dependences of the low-cycle loading curve parameters on the mechanical characteristics for several groups of structural materials used in Russian nuclear power engineering are obtained by statistical methods and proposed in this paper
Dunne, Lawrence J.; Manos, George
2018-03-01
Although crucial for designing separation processes, little is known experimentally about multi-component adsorption isotherms in comparison with those of pure single components. Very few binary mixture adsorption isotherms are to be found in the literature, and information about isotherms over a wide range of gas-phase compositions, mechanical pressures and temperatures is lacking. Here, we present a quasi-one-dimensional statistical mechanical model of binary mixture adsorption in metal-organic frameworks (MOFs), treated exactly by a transfer matrix method in the osmotic ensemble. The experimental parameter space may be very complex, and investigations into multi-component mixture adsorption may be guided by theoretical insights. The approach successfully models breathing structural transitions induced by adsorption, giving a good account of the shape of the adsorption isotherms of CO2 and CH4 in MIL-53(Al). Binary mixture isotherms and co-adsorption phase diagrams are also calculated and found to give a good description of the experimental trends in these properties; the wide range of model parameters that reproduces this behaviour suggests that it is generic to MOFs. Finally, a study is made of the influence of mechanical pressure on the shape of CO2 and CH4 adsorption isotherms in MIL-53(Al). Quite modest mechanical pressures can induce significant changes to isotherm shapes in MOFs, with implications for binary mixture separation processes. This article is part of the theme issue `Modern theoretical chemistry'.
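The transfer-matrix machinery used in the paper can be shown on the simplest quasi-one-dimensional case: a lattice gas on a ring with nearest-neighbour attraction eps and chemical potential mu, whose 2x2 transfer matrix yields the coverage from the dominant eigenvector. This toy Hamiltonian is an assumption for illustration, not the paper's MOF model in the osmotic ensemble.

```python
import numpy as np

def coverage(mu, eps, beta=1.0):
    """Mean site occupancy of a 1D lattice gas on a ring (thermodynamic
    limit) from the dominant transfer-matrix eigenvector.

    Energy: -eps * sum_i s_i s_{i+1} - mu * sum_i s_i,  s_i in {0, 1}.
    """
    T = np.array([[np.exp(beta*(eps*s*t + 0.5*mu*(s + t)))
                   for t in (0, 1)] for s in (0, 1)])
    w, v = np.linalg.eigh(T)              # T is symmetric
    psi = v[:, np.argmax(w)]              # Perron eigenvector
    return psi[1]**2 / (psi**2).sum()     # <s> = psi^T diag(0,1) psi

# non-interacting check: eps = 0 reduces to the Langmuir isotherm z/(1+z)
rho_half = coverage(0.0, 0.0)             # fugacity z = 1  ->  1/2
rho_langmuir = coverage(np.log(3.0), 0.0) # z = 3  ->  3/4
rho_att = coverage(-1.0, 1.0)             # attraction raises coverage...
rho_no = coverage(-1.0, 0.0)              # ...relative to the ideal case
```

Replacing the 2x2 occupancy basis with states that also carry the framework's narrow/large pore label, and weighting by the osmotic ensemble, gives models of the MIL-53 breathing type.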
A statistical model of uplink inter-cell interference with slow and fast power control mechanisms
Tabassum, Hina; Yilmaz, Ferkan; Dawy, Zaher; Alouini, Mohamed-Slim
2013-09-01
Uplink power control is in essence an interference mitigation technique that aims at minimizing the inter-cell interference (ICI) in cellular networks by reducing the transmit power levels of the mobile users while maintaining their target received signal quality levels at base stations. Power control mechanisms directly impact the interference dynamics and, thus, affect the overall achievable capacity and consumed power in cellular networks. Due to the stochastic nature of wireless channels and mobile users' locations, it is important to derive theoretical models for ICI that can capture the impact of design alternatives related to power control mechanisms. To this end, we derive and verify a novel statistical model for uplink ICI in Generalized-K composite fading environments as a function of various slow and fast power control mechanisms. The derived expressions are then utilized to quantify numerically key network performance metrics that include average resource fairness, average reduction in power consumption, and ergodic capacity. The accuracy of the derived expressions is validated via Monte-Carlo simulations. Results are generated for multiple network scenarios, and insights are extracted to assess various power control mechanisms as a function of system parameters. © 1972-2012 IEEE.
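The effect of fractional (slow) power control on inter-cell interference can be probed with a toy Monte-Carlo in the spirit of the validation described above. The geometry (one interfering user in a unit-radius neighbouring cell), the Gamma-Gamma composite fading and all parameter values are illustrative assumptions; the paper derives closed-form statistics against which such simulations are checked.

```python
import math, random

def mean_ici(s, n_trials=100_000, alpha=3.0, seed=3):
    """Monte-Carlo estimate of mean uplink inter-cell interference under
    fractional power control with compensation factor s in [0, 1]."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        # user dropped uniformly in a unit-radius cell centred at (2, 0);
        # its own base station is the cell centre, the victim BS is at (0, 0)
        r = math.sqrt(rng.random())
        th = rng.uniform(0.0, 2.0*math.pi)
        d_own = max(r, 0.05)                    # clamp to avoid r = 0
        d_victim = math.hypot(2.0 + r*math.cos(th), r*math.sin(th))
        # composite (Gamma-Gamma, Generalized-K-type) fading with unit mean
        fade = rng.gammavariate(2.0, 0.5) * rng.gammavariate(3.0, 1.0/3.0)
        p_tx = d_own**(alpha*s)                 # partial channel inversion
        total += p_tx * fade * d_victim**(-alpha)
    return total / n_trials

ici_no_pc = mean_ici(0.0)     # constant (maximum) transmit power
ici_full_pc = mean_ici(1.0)   # full channel inversion
```

In this normalization, cell-interior users back off their power under channel inversion, so the average ICI (and the consumed power) drops relative to constant maximum-power transmission, one of the trade-offs the paper quantifies analytically.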
Ingber, Lester; Nunez, Paul L
2011-02-01
The dynamic behavior of scalp potentials (EEG) is apparently due to some combination of global and local processes with important top-down and bottom-up interactions across spatial scales. In treating global mechanisms, we stress the importance of myelinated axon propagation delays and periodic boundary conditions in the cortical-white matter system, which is topologically close to a spherical shell. By contrast, the proposed local mechanisms are multiscale interactions between cortical columns via short-ranged non-myelinated fibers. A mechanical model consisting of a stretched string with attached nonlinear springs demonstrates the general idea. The string produces standing waves analogous to large-scale coherent EEG observed in some brain states. The attached springs are analogous to the smaller (mesoscopic) scale columnar dynamics. Generally, we expect string displacement and EEG at all scales to result from both global and local phenomena. A statistical mechanics of neocortical interactions (SMNI) calculates oscillatory behavior consistent with typical EEG, within columns, between neighboring columns via short-ranged non-myelinated fibers, across cortical regions via myelinated fibers, and also derives a string equation consistent with the global EEG model. Copyright © 2010 Elsevier Inc. All rights reserved.
A unified treatment of dynamics and scattering in classical and quantum statistical mechanics
International Nuclear Information System (INIS)
Prugovecki, E.
1978-01-01
The common formal features of classical and quantum statistical mechanics are investigated at three separate levels: at the level of L2 spaces of wave-packets on Γ-space, of Liouville spaces B2 consisting of density operators constructed from such wave-packets, and of phase-space representation spaces P of Γ-distribution functions. It is shown that at the last level the formal similarities become so outstanding that all key quantities in P-space, such as Liouville operators, Hamiltonian functions, position and momentum observables, etc., are represented by expressions which to the zeroth order in (h/2π) coincide in the classical and quantum case, and in some instances coincide completely. Scattering theory on the B2 Liouville spaces takes on the same formal appearance for classical and quantum statistical mechanics, and to the zeroth order in (h/2π) it coincides in both cases. This makes possible the formulation of a classical approximation to quantum scattering, and of a computational scheme for determining ρ(out) from ρ(in) to successive orders in (h/2π). (Auth.)
Gautestad, Arild O
2012-09-07
Animals moving under the influence of spatio-temporal scaling and long-term memory generate a kind of space-use pattern that has proved difficult to model within a coherent theoretical framework. An extended kind of statistical mechanics is needed, accounting for both the effects of spatial memory and scale-free space use, and put into a context of ecological conditions. Simulations illustrating the distinction between scale-specific and scale-free locomotion are presented. The results show how observational scale (time lag between relocations of an individual) may critically influence the interpretation of the underlying process. In this respect, a novel protocol is proposed as a method to distinguish between some main movement classes. For example, the 'power law in disguise' paradox, arising from a composite Brownian motion consisting of a superposition of independent movement processes at different scales, may be resolved by shifting the focus from pattern analysis at one particular temporal resolution towards a more process-oriented approach involving several scales of observation. A more explicit consideration of system complexity within a statistical mechanical framework, supplementing the more traditional mechanistic modelling approach, is advocated.
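The claim that observational scale can change the apparent movement process is easy to demonstrate: for a composite Brownian walk (a mixture of frequent short intra-patch steps and rare long inter-patch moves, with made-up parameters), the displacement distribution is strongly heavy-tailed at a short relocation lag but approaches a Gaussian at a long lag.

```python
import random

rng = random.Random(11)
# composite Brownian walk: short steps (sigma = 1) mixed with rare long
# inter-patch moves (sigma = 10); mixture weights are illustrative
steps = [rng.gauss(0.0, 10.0 if rng.random() < 0.1 else 1.0)
         for _ in range(100_000)]
pos = [0.0]
for s in steps:
    pos.append(pos[-1] + s)

def excess_kurtosis(xs):
    """Excess kurtosis: 0 for a Gaussian, > 0 for heavy tails."""
    n = len(xs)
    m = sum(xs) / n
    v = sum((x - m)**2 for x in xs) / n
    return sum((x - m)**4 for x in xs) / (n * v * v) - 3.0

# non-overlapping displacements observed at two relocation time lags
disp = lambda lag: [pos[i+lag] - pos[i] for i in range(0, len(pos)-lag, lag)]
kurt_lag1 = excess_kurtosis(disp(1))    # heavy-tailed at fine resolution
kurt_lag50 = excess_kurtosis(disp(50))  # near-Gaussian by aggregation
```

Sampling the same track at several lags and watching the tail statistic collapse toward the Gaussian value is exactly the kind of multi-scale protocol the abstract advocates over single-resolution pattern fitting.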
A Note on Burg’s Modified Entropy in Statistical Mechanics
Directory of Open Access Journals (Sweden)
Amritansu Ray
2016-02-01
Full Text Available Burg’s entropy plays an important role in this age of information euphoria, particularly in understanding the emergent behavior of complex systems such as those of statistical mechanics. For discrete or continuous variables, maximization of Burg’s entropy subject to its natural and mean constraints always provides a positive density function, even though the entropy itself is always negative. Burg’s modified entropy, on the other hand, is a better measure than the standard Burg entropy, since it is always positive and poses no computational problem for small probability values. Moreover, the maximum value of Burg’s modified entropy increases with the number of possible outcomes. In this paper, a premium is put on the fact that if Burg’s modified entropy is used instead of the conventional Burg entropy in a maximum entropy probability density (MEPD) function, the result is a better approximation of the probability distribution. An important lemma in basic algebra and a suitable example with tables and graphs in statistical mechanics are given to illustrate the whole idea appropriately.
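Two of the abstract's claims can be verified numerically on the uniform distribution, which maximizes both measures when only the normalization constraint is imposed: Burg's entropy sum_i ln p_i is always negative, while the modified entropy sum_i ln(1 + p_i) is positive and grows with the number of possible outcomes.

```python
import math

burg = lambda p: sum(math.log(x) for x in p)            # sum ln p_i
burg_mod = lambda p: sum(math.log(1.0 + x) for x in p)  # sum ln(1 + p_i)

# uniform distributions p_i = 1/n over n outcomes, for increasing n
entropies = {n: (burg([1.0/n]*n), burg_mod([1.0/n]*n)) for n in (2, 6, 100)}
```

At the uniform maximizer the Burg entropy equals -n ln n (increasingly negative), whereas the modified entropy equals n ln(1 + 1/n), which rises monotonically toward 1 as n grows.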
Statistical-Mechanical Analysis of Pre-training and Fine Tuning in Deep Learning
Ohzeki, Masayuki
2015-03-01
In this paper, we present a statistical-mechanical analysis of deep learning. We elucidate some of the essential components of deep learning — pre-training by unsupervised learning and fine tuning by supervised learning. We formulate the extraction of features from the training data as a margin criterion in a high-dimensional feature-vector space. The self-organized classifier is then supplied with small amounts of labelled data, as in deep learning. Although we employ a simple single-layer perceptron model, rather than directly analyzing a multi-layer neural network, we find a nontrivial phase transition that is dependent on the number of unlabelled data in the generalization error of the resultant classifier. In this sense, we evaluate the efficacy of the unsupervised learning component of deep learning. The analysis is performed by the replica method, which is a sophisticated tool in statistical mechanics. We validate our result in the manner of deep learning, using a simple iterative algorithm to learn the weight vector on the basis of belief propagation.
Takabe, Satoshi; Hukushima, Koji
2016-05-01
Typical behavior of the linear programming (LP) problem is studied as a relaxation of the minimum vertex cover (min-VC), a type of integer programming (IP) problem. A lattice-gas model on the Erdös-Rényi random graphs of α-uniform hyperedges is proposed to express both the LP and IP problems of the min-VC in the common statistical mechanical model with a one-parameter family. Statistical mechanical analyses reveal for α=2 that the LP optimal solution is typically equal to that given by the IP below the critical average degree c=e in the thermodynamic limit. The critical threshold for good accuracy of the relaxation extends the mathematical result c=1 and coincides with the replica symmetry-breaking threshold of the IP. The LP relaxation for the minimum hitting sets with α≥3, minimum vertex covers on α-uniform random graphs, is also studied. Analytic and numerical results strongly suggest that the LP relaxation fails to estimate optimal values above the critical average degree c=e/(α-1) where the replica symmetry is broken.
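The LP relaxation of minimum vertex cover can be explored exactly on small graphs by exploiting half-integrality: some optimal LP solution has every x_i in {0, 1/2, 1}, so brute force over that grid finds the true LP optimum. The sketch below contrasts an odd cycle, the classic instance where the relaxation is loose, with a path, where it is tight; graph choices are illustrative, not the random ensembles of the paper.

```python
from itertools import product

def vc_optima(n, edges):
    """Minimum vertex cover on n vertices: exact IP optimum and exact
    LP-relaxation optimum (via half-integrality of the VC polytope)."""
    def best(vals):
        opt = float('inf')
        for x in product(vals, repeat=n):
            # cover constraint: every edge has total weight at least 1
            if all(x[u] + x[v] >= 1 for u, v in edges):
                opt = min(opt, sum(x))
        return opt
    return best((0, 1)), best((0, 0.5, 1))

ip_tri, lp_tri = vc_optima(3, [(0, 1), (1, 2), (0, 2)])  # triangle: loose
ip_path, lp_path = vc_optima(3, [(0, 1), (1, 2)])        # path: tight
```

On the triangle the LP settles for x_i = 1/2 everywhere (value 3/2) while any integral cover needs two vertices; on the path both problems agree, the regime the paper identifies below the critical average degree.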
Statistical mechanics of relativistic spin-1 bosons in a magnetic field
International Nuclear Information System (INIS)
Daicic, J.; Frankel, N.E.
1993-01-01
This paper investigates the statistical mechanics of a gas of spin-1 particles with pair creation in a homogeneous magnetic field. It is shown that expansions for the thermodynamic potential and magnetization in fields below the mass scale of the constituent particles are well behaved. However, when the field is at or above the mass scale, an intrinsic pathology of the single-particle energy spectrum manifests itself in the statistical mechanics of the system. Whilst for the spin-0 and spin-1/2 analogues of this system there seemed to be no barrier ab initio to the field strength, the nature of the vacuum and the role of interactions were always borne in mind as matters to be considered in a higher-order treatment, particularly when the field was at or above the mass scale. In the spin-1 case, the pathology in the single-particle energy spectrum heralds this from the beginning, and seems to be a warning that a single-particle non-interacting picture of physics at high energies needs some reconsideration. 10 refs
The non-equilibrium statistical mechanics of a simple geophysical fluid dynamics model
Verkley, Wim; Severijns, Camiel
2014-05-01
Lorenz [1] has devised a dynamical system that has proved to be very useful as a benchmark system in geophysical fluid dynamics. The system in its simplest form consists of a periodic array of variables that can be associated with an atmospheric field on a latitude circle. The system is driven by a constant forcing, is damped by linear friction and has a simple advection term that causes the model to behave chaotically if the forcing is large enough. Our aim is to predict the statistics of Lorenz' model on the basis of a given average value of its total energy - obtained from a numerical integration - and the assumption of statistical stationarity. Our method is the principle of maximum entropy [2] which in this case reads: the information entropy of the system's probability density function shall be maximal under the constraints of normalization, a given value of the average total energy and statistical stationarity. Statistical stationarity is incorporated approximately by using `stationarity constraints', i.e., by requiring that the average first and possibly higher-order time-derivatives of the energy are zero in the maximization of entropy. The analysis [3] reveals that, if the first stationarity constraint is used, the resulting probability density function rather accurately reproduces the statistics of the individual variables. If the second stationarity constraint is used as well, the correlations between the variables are also reproduced quite adequately. The method can be generalized straightforwardly and holds the promise of a viable non-equilibrium statistical mechanics of the forced-dissipative systems of geophysical fluid dynamics. [1] E.N. Lorenz, 1996: Predictability - A problem partly solved, in Proc. Seminar on Predictability (ECMWF, Reading, Berkshire, UK), Vol. 1, pp. 1-18. [2] E.T. Jaynes, 2003: Probability Theory - The Logic of Science (Cambridge University Press, Cambridge). [3] W.T.M. Verkley and C.A. Severijns, 2014: The maximum entropy
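A sketch of the Lorenz '96 benchmark used above, with two checks relevant to the energy-based maximum-entropy argument: the quadratic advection term alone conserves the total energy E = sum_i x_i^2 / 2, and the forced-dissipative model settles onto a bounded attractor whose mean energy can be estimated from a numerical integration. The parameter choices (F = 8, 40 variables) are the conventional ones for this benchmark, not values taken from the abstract.

```python
import math

def l96_rhs(x, forcing=8.0, damping=1.0):
    """Lorenz '96 tendency: quadratic advection, linear damping, forcing."""
    n = len(x)
    return [(x[(i+1) % n] - x[(i-2) % n]) * x[(i-1) % n]
            - damping*x[i] + forcing for i in range(n)]

def rk4_step(x, dt, rhs):
    k1 = rhs(x)
    k2 = rhs([a + 0.5*dt*b for a, b in zip(x, k1)])
    k3 = rhs([a + 0.5*dt*b for a, b in zip(x, k2)])
    k4 = rhs([a + dt*b for a, b in zip(x, k3)])
    return [a + dt*(p + 2*q + 2*r + s)/6.0
            for a, p, q, r, s in zip(x, k1, k2, k3, k4)]

energy = lambda x: 0.5*sum(v*v for v in x)

# 1. the advection term alone conserves E (damping and forcing off)
x = [math.sin(i) for i in range(8)]
e0 = energy(x)
for _ in range(1000):
    x = rk4_step(x, 0.005, lambda y: l96_rhs(y, forcing=0.0, damping=0.0))
energy_drift = abs(energy(x) - e0) / e0

# 2. the forced-dissipative model stays on a bounded chaotic attractor
x = [8.0]*40
x[0] += 0.01                   # perturb the unstable fixed point x_i = F
acc = 0.0
for t in range(5000):
    x = rk4_step(x, 0.01, l96_rhs)
    if t >= 3000:              # average energy after the transient
        acc += energy(x)/40.0
mean_energy_per_site = acc/2000.0
```

The energy-conserving advection plus linear damping and constant forcing is exactly the structure that makes an average-energy constraint, supplemented by the stationarity constraints described above, a natural basis for a maximum-entropy prediction.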
THE CAUSAL ANALYSIS / DIAGNOSIS DECISION ...
CADDIS is an on-line decision support system that helps investigators in the regions, states and tribes find, access, organize, use and share information to produce causal evaluations in aquatic systems. It is based on the US EPA's Stressor Identification process which is a formal method for identifying causes of impairments in aquatic systems. CADDIS 2007 increases access to relevant information useful for causal analysis and provides methods and tools that practitioners can use to analyze their own data. The new Candidate Cause section provides overviews of commonly encountered causes of impairments to aquatic systems: metals, sediments, nutrients, flow alteration, temperature, ionic strength, and low dissolved oxygen. CADDIS includes new Conceptual Models that illustrate the relationships from sources to stressors to biological effects. An Interactive Conceptual Model for phosphorus links the diagram with supporting literature citations. The new Analyzing Data section helps practitioners analyze their data sets and interpret and use those results as evidence within the USEPA causal assessment process. Downloadable tools include a graphical user interface statistical package (CADStat), and programs for use with the freeware R statistical package, and a Microsoft Excel template. These tools can be used to quantify associations between causes and biological impairments using innovative methods such as species-sensitivity distributions, biological inference
International Nuclear Information System (INIS)
Rebhan, E.
2005-01-01
This second volume treats quantum mechanics, relativistic quantum mechanics, the foundations of quantum field theory and elementary particle theory, as well as thermodynamics and statistics. Together, the two volumes cover all fields usually offered in a course on theoretical physics. In every field treated, a very careful introduction to the basic natural laws forms the starting point, with a thorough analysis of which of them rest on empirical evidence, which are logically deducible, and what role basic definitions play. Going beyond the usual scope of such courses, an introduction to elementary particles is developed starting from relativistic quantum theory. All problems are studied so thoroughly and extensively that each step can be reproduced on its own. Much care is devoted to motivation and comprehensibility: the mixing of mathematical difficulties with problems of a physical nature, which often obstructs learning, is avoided by presenting the important mathematical methods in chapters of their own (for instance Hilbert spaces and Lie groups). The material is deepened and practised by means of many examples and problems, a large part of them with solutions. Developments that are important but dispensable on a first approach are pursued in excursuses. The book grew out of courses the author gave at Heinrich Heine University Düsseldorf and was adapted to the needs of the students over many iterations. It is designed so that, after one's studies, it also serves as a reference work or for refreshing one's knowledge.
Directory of Open Access Journals (Sweden)
A. Jackson Stenner
2013-08-01
Rasch's unidimensional models for measurement show how to connect object measures (e.g., reader abilities), measurement mechanisms (e.g., machine-generated cloze reading items), and observational outcomes (e.g., counts correct on reading instruments). Substantive theory shows what interventions or manipulations to the measurement mechanism can be traded off against a change to the object measure to hold the observed outcome constant. A Rasch model integrated with a substantive theory dictates the form and substance of permissible interventions. Rasch analysis, absent construct theory and an associated specification equation, is a black box in which understanding may be more illusory than not. Finally, the quantitative hypothesis can be tested by comparing theory-based trade-off relations with observed trade-off relations. Only quantitative variables (as measured) support such trade-offs. Note that to test the quantitative hypothesis requires more than manipulation of the algebraic equivalencies in the Rasch model or descriptively fitting data to the model. A causal Rasch model involves experimental intervention/manipulation on either reader ability or text complexity or a conjoint intervention on both simultaneously to yield a successful prediction of the resultant observed outcome (count correct). We conjecture that when this type of manipulation is introduced for individual reader text encounters and model predictions are consistent with observations, the quantitative hypothesis is sustained.
Statistical evaluation of the mechanical properties of high-volume class F fly ash concretes
Yoon, Seyoon
2014-03-01
High-Volume Fly Ash (HVFA) concretes are seen by many as a feasible solution for sustainable, low embodied carbon construction. At the moment, fly ash is classified as a waste by-product, primarily of thermal power stations. In this paper the authors experimentally and statistically investigated the effects of mix-design factors on the mechanical properties of high-volume class F fly ash concretes. A total of 240 and 32 samples were produced and tested in the laboratory to measure compressive strength and Young's modulus respectively. Applicability of the CEB-FIP (Comité Euro-international du Béton - Fédération Internationale de la Précontrainte) and ACI (American Concrete Institute) Building Model Code (Thomas, 2010; ACI Committee 209, 1982) [1,2] to the experimentally-derived mechanical property data for HVFA concretes was established. Furthermore, using multiple linear regression analysis, Mean Squared Residuals (MSRs) were obtained to determine whether a weight- or volume-based mix proportion is better for predicting the mechanical properties of HVFA concrete. The significance levels of the design factors, which indicate how significantly the factors affect the HVFA concrete's mechanical properties, were determined using analysis of variance (ANOVA) tests. The results show that a weight-based mix proportion is a slightly better predictor of mechanical properties than a volume-based one. The significance level of the fly ash substitution rate was higher than that of the w/b ratio initially but reduced over time. © 2014 Elsevier Ltd. All rights reserved.
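The MSR comparison described above amounts to fitting an ordinary-least-squares model for each candidate mix-proportion basis and comparing mean squared residuals. A minimal sketch on synthetic stand-in data (the factor names, ranges and coefficients below are hypothetical, not the paper's measurements):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 240
# Hypothetical mix-design factors (illustrative stand-ins only):
fly_ash = rng.uniform(0.4, 0.7, n)     # fly ash substitution rate
wb = rng.uniform(0.3, 0.5, n)          # water/binder ratio
# Synthetic "compressive strength" with noise of variance 4:
strength = 60.0 - 30.0 * fly_ash - 40.0 * wb + rng.normal(0.0, 2.0, n)

def mean_squared_residual(X, y):
    """OLS fit with intercept; return the mean squared residual (MSR)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    r = y - X1 @ beta
    return float((r ** 2).mean())

msr = mean_squared_residual(np.column_stack([fly_ash, wb]), strength)
print(round(msr, 2))   # close to the noise variance when the linear model is adequate
```

In the paper's setting, one would compute this MSR once with weight-based factor columns and once with volume-based ones; the smaller MSR identifies the better predictive basis.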
Repeated Causal Decision Making
Hagmayer, York; Meder, Bjorn
2013-01-01
Many of our decisions refer to actions that have a causal impact on the external environment. Such actions may not only allow for the mere learning of expected values or utilities but also for acquiring knowledge about the causal structure of our world. We used a repeated decision-making paradigm to examine what kind of knowledge people acquire in…
Causal Analysis After Haavelmo
Heckman, James; Pinto, Rodrigo
2014-01-01
Haavelmo's seminal 1943 and 1944 papers are the first rigorous treatment of causality. In them, he distinguished the definition of causal parameters from their identification. He showed that causal parameters are defined using hypothetical models that assign variation to some of the inputs determining outcomes while holding all other inputs fixed. He thus formalized and made operational Marshall's (1890) ceteris paribus analysis. We embed Haavelmo's framework into the recursive framework of Directed Acyclic Graphs (DAGs) used in one influential recent approach to causality (Pearl, 2000) and in the related literature on Bayesian nets (Lauritzen, 1996). We compare the simplicity of an analysis of causality based on Haavelmo's methodology with the complex and nonintuitive approach used in the causal literature of DAGs—the “do-calculus” of Pearl (2009). We discuss the severe limitations of DAGs and in particular of the do-calculus of Pearl in securing identification of economic models. We extend our framework to consider models for simultaneous causality, a central contribution of Haavelmo. In general cases, DAGs cannot be used to analyze models for simultaneous causality, but Haavelmo's approach naturally generalizes to cover them. PMID:25729123
Causality in Classical Electrodynamics
Savage, Craig
2012-01-01
Causality in electrodynamics is a subject of some confusion, especially regarding the application of Faraday's law and the Ampere-Maxwell law. This has led to the suggestion that we should not teach students that electric and magnetic fields can cause each other, but rather focus on charges and currents as the causal agents. In this paper I argue…
Causality in Europeanization Research
DEFF Research Database (Denmark)
Lynggaard, Kennet
2012-01-01
Discourse analysis as a methodology is perhaps not readily associated with substantive causality claims. At the same time, the study of discourses is very much the study of conceptions of causal relations among a set, or sets, of agents. Within Europeanization research we have seen endeavours to develop discursive institutional analytical frameworks and something that comes close to the formulation of hypotheses on the effects of European Union (EU) policies and institutions on domestic change. Even if these efforts so far do not necessarily amount to substantive theories or claims of causality, it suggests that discourse analysis and the study of causality are by no means opposites. The study of Europeanization discourses may even be seen as an essential step in the move towards claims of causality in Europeanization research. This chapter deals with the question of how we may move from the study...
Directory of Open Access Journals (Sweden)
Thomas eWidlok
2014-11-01
Cognitive scientists interested in causal cognition increasingly search for evidence from non-WEIRD people but find only very few cross-cultural studies that specifically target causal cognition. This article suggests how information about causality can be retrieved from ethnographic monographs, specifically from ethnographies that discuss agency and concepts of time. Many apparent cultural differences with regard to causal cognition dissolve when cultural extensions of agency and personhood to non-humans are taken into account. At the same time, considerable variability remains when we include notions of time, linearity and sequence. The article focuses on ethnographic case studies from Africa but provides a more general perspective on the role of ethnography in research on the diversity and universality of causal cognition.
Directory of Open Access Journals (Sweden)
Ämin Baumeler
2017-07-01
Computation models such as circuits describe sequences of computation steps that are carried out one after the other. In other words, algorithm design is traditionally subject to the restriction imposed by a fixed causal order. We address a novel computing paradigm beyond quantum computing, replacing this assumption by mere logical consistency: We study non-causal circuits, where a fixed time structure within a gate is locally assumed whilst the global causal structure between the gates is dropped. We present examples of logically consistent non-causal circuits outperforming all causal ones; they imply that suppressing loops entirely is more restrictive than just avoiding the contradictions they can give rise to. That fact is already known for correlations as well as for communication, and we here extend it to computation.
Inferential, non-parametric statistics to assess the quality of probabilistic forecast systems
Maia, A.H.N.; Meinke, H.B.; Lennox, S.; Stone, R.C.
2007-01-01
Many statistical forecast systems are available to interested users. To be useful for decision making, these systems must be based on evidence of underlying mechanisms. Once causal connections between the mechanism and its statistical manifestation have been firmly established, the forecasts must
Physics colloquium: Single-electron counting in quantum metrology and in statistical mechanics
Geneva University
2011-01-01
GENEVA UNIVERSITY, Ecole de physique, Département de physique nucléaire et corpusculaire, 24, quai Ernest-Ansermet, 1211 Genève 4. Tel.: (022) 379 62 73, Fax: (022) 379 69 92. Monday 17 October 2011, 17:00, Ecole de Physique, Auditoire Stueckelberg. PHYSICS COLLOQUIUM: "Single-electron counting in quantum metrology and in statistical mechanics", Prof. Jukka Pekola, Low Temperature Laboratory, Aalto University, Helsinki, Finland. First I discuss the basics of single-electron tunneling and its potential applications in metrology. My main focus is on developing an accurate source of single-electron current for the realization of the unit ampere. I discuss the principle and the present status of the so-called single-electron turnstile. Investigation of errors in transporting electrons one by one has revealed a wealth of observations on fundamental phenomena in mesoscopic superconductivity, including individual Andreev...
Statistical mechanics of competitive resource allocation using agent-based models
Chakraborti, Anirban; Challet, Damien; Chatterjee, Arnab; Marsili, Matteo; Zhang, Yi-Cheng; Chakrabarti, Bikas K.
2015-01-01
Demand outstrips available resources in most situations, which gives rise to competition, interaction and learning. In this article, we review a broad spectrum of multi-agent models of competition (El Farol Bar problem, Minority Game, Kolkata Paise Restaurant problem, Stable marriage problem, Parking space problem and others) and the methods used to understand them analytically. We emphasize the power of concepts and tools from statistical mechanics to understand and explain fully collective phenomena such as phase transitions and long memory, and the mapping between agent heterogeneity and physical disorder. As these methods can be applied to any large-scale model of competitive resource allocation made up of heterogeneous adaptive agents with non-linear interactions, they provide a prospective unifying paradigm for many scientific disciplines.
International Nuclear Information System (INIS)
Gao, Li-Na; Liu, Fu-Hu; Lacey, Roy A.
2016-01-01
Experimental results of the transverse-momentum distributions of φ mesons and Ω hyperons produced in gold-gold (Au-Au) collisions with different centrality intervals, measured by the STAR Collaboration at different energies (7.7, 11.5, 19.6, 27, and 39 GeV) in the beam energy scan (BES) program at the relativistic heavy-ion collider (RHIC), are approximately described by the single Erlang distribution and the two-component Schwinger mechanism. Moreover, the STAR experimental transverse-momentum distributions of negatively charged particles, produced in Au-Au collisions at RHIC BES energies, are approximately described by the two-component Erlang distribution and the single Tsallis statistics. The excitation functions of free parameters are obtained from the fit to the experimental data. A weak softest point in the string tension in Ω hyperon spectra is observed at 7.7 GeV. (orig.)
Statistical mechanical model of gas adsorption in porous crystals with dynamic moieties.
Simon, Cory M; Braun, Efrem; Carraro, Carlo; Smit, Berend
2017-01-17
Some nanoporous, crystalline materials possess dynamic constituents, for example, rotatable moieties. These moieties can undergo a conformation change in response to the adsorption of guest molecules, which qualitatively impacts adsorption behavior. We pose and solve a statistical mechanical model of gas adsorption in a porous crystal whose cages share a common ligand that can adopt two distinct rotational conformations. Guest molecules incentivize the ligands to adopt a different rotational configuration than maintained in the empty host. Our model captures inflections, steps, and hysteresis that can arise in the adsorption isotherm as a signature of the rotating ligands. The insights disclosed by our simple model contribute a more intimate understanding of the response and consequence of rotating ligands integrated into porous materials to harness them for gas storage and separations, chemical sensing, drug delivery, catalysis, and nanoscale devices. Particularly, our model reveals design strategies to exploit these moving constituents and engineer improved adsorbents with intrinsic thermal management for pressure-swing adsorption processes.
Statistical-mechanics analysis of Gaussian labeled-unlabeled classification problems
International Nuclear Information System (INIS)
Tanaka, Toshiyuki
2013-01-01
The labeled-unlabeled classification problem in semi-supervised learning is studied via a statistical-mechanics approach. We analytically investigate the performance of a learner with an equal-weight mixture of two symmetrically-located Gaussians, performing posterior mean estimation of the parameter vector on the basis of a dataset consisting of labeled and unlabeled data generated from the same probability model as that assumed by the learner. Under the assumption of replica symmetry, we have analytically obtained a set of saddle-point equations, which allows us to numerically evaluate the performance of the learner. On the basis of the analytical result we have observed interesting phenomena, in particular the coexistence of good and bad solutions, which may happen when the number of unlabeled data is relatively large compared with that of labeled data.
Granular statistical mechanics - Building on the legacy of Sir Sam Edwards
Blumenfeld, Raphael
When Sir Sam Edwards laid down the foundations for the statistical mechanics of jammed granular materials he opened a new field in soft condensed matter and many followed. In this presentation we review briefly the Edwards formalism and some of its less discussed consequences. We point out that the formalism is useful for other classes of systems - cellular and porous materials. A certain shortcoming of the original formalism is then discussed and a modification to overcome it is proposed. Finally, a derivation of an equation of state with the new formalism is presented; the equation of state is analogous to the PVT relation for thermal gases, relating the volume, the boundary stress and measures of the structural and stress fluctuations. (NUDT, Changsha, China; Imperial College London, UK; Cambridge University, UK.)
Noise and the statistical mechanics of distributed transport in a colony of interacting agents
Katifori, Eleni; Graewer, Johannes; Ronellenfitsch, Henrik; Mazza, Marco G.
Inspired by the process of liquid food distribution between individuals in an ant colony, in this work we consider the statistical mechanics of resource dissemination between interacting agents with finite carrying capacity. The agents move inside a confined space (nest), pick up the food at the entrance of the nest and share it with other agents that they encounter. We calculate analytically and via a series of simulations the global food intake rate for the whole colony as well as observables describing how uniformly the food is distributed within the nest. Our model and predictions provide a useful benchmark to assess which strategies can lead to efficient food distribution within the nest and also to what level the observed food uptake rates and efficiency in food distribution are due to stochastic fluctuations or specific food exchange strategies by an actual ant colony.
Multiscale Monte Carlo algorithms in statistical mechanics and quantum field theory
Energy Technology Data Exchange (ETDEWEB)
Lauwers, P G
1990-12-01
Conventional Monte Carlo simulation algorithms for models in statistical mechanics and quantum field theory are afflicted by problems caused by their locality. They become highly inefficient if investigations of critical or nearly-critical systems, i.e., systems with important large-scale phenomena, are undertaken. We present two types of multiscale approaches that alleviate problems of this kind: stochastic cluster algorithms and multigrid Monte Carlo simulation algorithms. Another formidable computational problem in simulations of phenomenologically relevant field theories with fermions is the need for frequently inverting the Dirac operator. This inversion can be accelerated considerably by means of deterministic multigrid methods, very similar to the ones used for the numerical solution of differential equations. (orig.)
Conceptual developments of non-equilibrium statistical mechanics in the early days of Japan
Ichiyanagi, Masakazu
1995-11-01
This paper reviews the research in nonequilibrium statistical mechanics made in Japan in the period between 1930 and 1960. Nearly thirty years have passed since the discovery of the exact formula for the electrical conductivity. With the rise of the linear response theory, the methods and results of which are quickly grasped by anyone, its rationale was pushed aside and even at the stage where the formulation was still incomplete some authors hurried to make physical applications. Such an attitude robbed it of most of its interest for the average physicist, who would approach an understanding of some basic concept, not through abstract and logical analysis but by simply increasing his technical experiences with the concept. The purpose of this review is to rescue the linear response theory from being labeled a mathematical tool and to show that it has considerable physical content. Many key papers, originally written in Japanese, are reproduced.
Moral foundations in an interacting neural networks society: A statistical mechanics analysis
Vicente, R.; Susemihl, A.; Jericó, J. P.; Caticha, N.
2014-04-01
The moral foundations theory supports that people, across cultures, tend to consider a small number of dimensions when classifying issues on a moral basis. The data also show that the statistics of weights attributed to each moral dimension is related to self-declared political affiliation, which in turn has been connected to cognitive learning styles by the recent literature in neuroscience and psychology. Inspired by these data, we propose a simple statistical mechanics model with interacting neural networks classifying vectors and learning from members of their social neighbourhood about their average opinion on a large set of issues. The purpose of learning is to reduce dissension among agents when disagreeing. We consider a family of learning algorithms parametrized by δ, that represents the importance given to corroborating (same sign) opinions. We define an order parameter that quantifies the diversity of opinions in a group with homogeneous learning style. Using Monte Carlo simulations and a mean field approximation we find the relation between the order parameter and the learning parameter δ at a temperature we associate with the importance of social influence in a given group. In concordance with data, groups that rely more strongly on corroborating evidence sustain less opinion diversity. We discuss predictions of the model and propose possible experimental tests.
Ants in a labyrinth: a statistical mechanics approach to the division of labour.
Directory of Open Access Journals (Sweden)
Thomas Owen Richardson
2011-04-01
Division of labour (DoL) is a fundamental organisational principle in human societies, within virtual and robotic swarms and at all levels of biological organisation. DoL reaches a pinnacle in the insect societies, where the most widely used model is based on variation in response thresholds among individuals, and the assumption that individuals and stimuli are well-mixed. Here, we present a spatially explicit model of DoL. Our model is inspired by Pierre de Gennes' 'Ant in a Labyrinth', which laid the foundations of an entire new field in statistical mechanics. We demonstrate the emergence, even in a simplified one-dimensional model, of a spatial patterning of individuals and a right-skewed activity distribution, both of which are characteristics of division of labour in animal societies. We then show, using a two-dimensional model, that the work done by an individual within an activity bout is a sigmoidal function of its response threshold. Furthermore, there is an inverse relationship between the overall stimulus level and the skewness of the activity distribution. Therefore, the difference in the amount of work done by two individuals with different thresholds increases as the overall stimulus level decreases. Indeed, spatial fluctuations of task stimuli are minimised at these low stimulus levels. Hence, the more unequally labour is divided amongst individuals, the greater the ability of the colony to maintain homeostasis. Finally, we show that the non-random spatial distribution of individuals within biological and social systems could be caused by indirect (stigmergic) interactions, rather than direct agent-to-agent interactions. Our model links the principle of DoL with principles of statistical mechanics and provides testable hypotheses for future experiments.
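The response-threshold mechanism this abstract builds on can be sketched in a few lines. The following is a minimal well-mixed (non-spatial) fixed-threshold model in the style of the classical formulation, not the authors' spatially explicit model; all parameter values are illustrative. Each agent i responds to a shared stimulus s with probability s²/(s² + θ_i²), so low-threshold agents end up doing disproportionately more work:

```python
import numpy as np

rng = np.random.default_rng(42)
n_agents, steps = 50, 5000
thresholds = rng.uniform(0.5, 2.0, n_agents)  # heterogeneous response thresholds
stimulus, delta, work = 1.0, 0.6, 0.02        # demand growth and work per act
activity = np.zeros(n_agents)                 # total acts per agent

for _ in range(steps):
    stimulus += delta                                    # task demand accumulates
    p = stimulus**2 / (stimulus**2 + thresholds**2)      # threshold response function
    acting = rng.random(n_agents) < p                    # who works this step
    activity += acting
    stimulus = max(stimulus - work * acting.sum(), 0.0)  # work reduces the stimulus

# Lower-threshold individuals should do more of the work (right-skewed division).
order = np.argsort(thresholds)
print(activity[order[:10]].mean() > activity[order[-10:]].mean())  # → True
```

The stimulus self-regulates around the level where total work balances demand, and the unequal activity split across thresholds is the well-mixed analogue of the skewed activity distribution the paper reports.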
International Nuclear Information System (INIS)
Kushnirenko, A.N.
1989-01-01
An attempt was made to substantiate statistical physics from the viewpoint of many-body quantum mechanics in the occupation-number representation. This approach made it possible to develop a variational method for the solution of stationary and nonstationary nonequilibrium problems.
DEFF Research Database (Denmark)
Pomogaev, Vladimir; Pomogaeva, Anna; Avramov, Pavel
2011-01-01
Three polycyclic organic molecules in various solvents were theoretically investigated, with a focus on thermodynamical aspects, using the recently developed statistical quantum-mechanical/classical molecular dynamics method for simulating electronic-vibrational spectra. The absorption bands of estradiol...
Klungsøyr, Ole; Antonsen, Bjørnar; Wilberg, Theresa
2017-06-05
Patients with personality disorders commonly exhibit impairment in psychosocial function that persists over time even with diagnostic remission. Further causal knowledge may help to identify and assess factors with a potential to alleviate this impairment. Psychosocial function is associated with personality functioning, which describes personality disorder severity in DSM-5 (section III) and which can reportedly be improved by therapy. The reciprocal association between personality functioning and psychosocial function was assessed, in 113 patients with different personality disorders, in a secondary longitudinal analysis of data from a randomized clinical trial, over six years. Personality functioning was represented by three domains of the Severity Indices of Personality Problems: Relational Capacity, Identity Integration, and Self-control. Psychosocial function was measured by Global Assessment of Functioning. The marginal structural model was used for estimation of causal effects of the three personality functioning domains on psychosocial function, and vice versa. The attractiveness of this model lies in its ability to assess an effect of a time-varying exposure on an outcome, while adjusting for time-varying confounding. Strong causal effects were found. A hypothetical intervention to increase Relational Capacity by one standard deviation, both at one and two time-points prior to assessment of psychosocial function, would increase psychosocial function by 3.5 standard deviations (95% CI: 2.0, 4.96). Significant effects of Identity Integration and Self-control on psychosocial function, and from psychosocial function on all three domains of personality functioning, although weaker, were also found. This study indicates that persistent impairment in psychosocial function can be addressed through a causal pathway of personality functioning, with interventions of at least 18 months duration.
Czech Academy of Sciences Publication Activity Database
Jackson, G.; Nezbeda, Ivo
2011-01-01
Roč. 190, 1 Sp.I:Sl (2011), s. 1-2 ISSN 0026-8976. [Liblice Conference on the Statistical Mechanics of Liquids /8./. Brno, 13.06.2010-18.06.2010] Institutional research plan: CEZ:AV0Z40720504 Keywords : editorial material * theories of liquids * statistical mechanics Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 1.819, year: 2011
Statistical mechanics and dynamics of solvable models with long-range interactions
International Nuclear Information System (INIS)
Campa, Alessandro; Dauxois, Thierry; Ruffo, Stefano
2009-01-01
For systems with long-range interactions, the two-body potential decays at large distances as V(r) ∼ 1/r^α, with α ≤ d, where d is the space dimension. Examples are: gravitational systems, two-dimensional hydrodynamics, two-dimensional elasticity, charged and dipolar systems. Although such systems can be made extensive, they are intrinsically non-additive: the sum of the energies of macroscopic subsystems is not equal to the energy of the whole system. Moreover, the space of accessible macroscopic thermodynamic parameters might be non-convex. The violation of these two basic properties of the thermodynamics of short-range systems is at the origin of ensemble inequivalence. In turn, this inequivalence implies that specific heat can be negative in the microcanonical ensemble, and temperature jumps can appear at microcanonical first-order phase transitions. The lack of convexity allows us to easily spot regions of parameter space where ergodicity may be broken. Historically, negative specific heat had been found for gravitational systems and was thought to be a specific property of a system for which the existence of standard equilibrium statistical mechanics itself was doubted. Realizing that such properties may be present for a wider class of systems has renewed the interest in long-range interactions. Here, we present a comprehensive review of the recent advances on the statistical mechanics and out-of-equilibrium dynamics of solvable systems with long-range interactions. The core of the review consists in the detailed presentation of the concept of ensemble inequivalence, as exemplified by the exact solution, in the microcanonical and canonical ensembles, of mean-field type models. Remarkably, the entropy of all these models can be obtained using the method of large deviations. Long-range interacting systems display an extremely slow relaxation towards thermodynamic equilibrium and, what is more striking, the convergence towards quasi-stationary states. ...
Granger Causality Testing with Intensive Longitudinal Data.
Molenaar, Peter C M
2018-06-01
The availability of intensive longitudinal data obtained by means of ambulatory assessment opens up new prospects for prevention research in that it allows the derivation of subject-specific dynamic networks of interacting variables by means of vector autoregressive (VAR) modeling. The dynamic networks thus obtained can be subjected to Granger causality testing in order to identify causal relations among the observed time-dependent variables. VARs have two equivalent representations: standard and structural. Results obtained with Granger causality testing depend upon which representation is chosen, yet no criteria exist on which this important choice can be based. A new equivalent representation, called the hybrid VAR, is introduced with which the best representation can be chosen in a data-driven way. Partial directed coherence, a frequency-domain statistic for Granger causality testing, is shown to perform optimally when based on hybrid VARs. An application to real data is provided.
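A bivariate lag-1 Granger causality test of the kind described above can be sketched as follows (the synthetic data and coefficients are illustrative assumptions; in practice one would use a statistics package rather than hand-rolled least squares). The statistic is the log ratio of residual variances of the restricted model, which predicts x from its own past only, and the full model, which adds the past of y:

```python
import math
import random

def fit_rss(y, X):
    """Least-squares residual sum of squares for one or two zero-mean
    predictors, solved directly from the normal equations."""
    if len(X) == 1:
        u, = X
        a = sum(ui * yi for ui, yi in zip(u, y)) / sum(ui * ui for ui in u)
        return sum((yi - a * ui) ** 2 for ui, yi in zip(u, y))
    u, v = X
    suu = sum(ui * ui for ui in u); svv = sum(vi * vi for vi in v)
    suv = sum(ui * vi for ui, vi in zip(u, v))
    suy = sum(ui * yi for ui, yi in zip(u, y))
    svy = sum(vi * yi for vi, yi in zip(v, y))
    det = suu * svv - suv * suv
    a = (svv * suy - suv * svy) / det
    b = (suu * svy - suv * suy) / det
    return sum((yi - a * ui - b * vi) ** 2 for ui, vi, yi in zip(u, v, y))

def granger_f(x, y):
    """Log-ratio Granger statistic for 'y Granger-causes x' at lag 1."""
    tgt, xl, yl = x[1:], x[:-1], y[:-1]
    return math.log(fit_rss(tgt, [xl]) / fit_rss(tgt, [xl, yl]))

# Synthetic VAR(1): y drives x, but not vice versa (coefficients assumed).
random.seed(1)
T, x, y = 5000, [0.0], [0.0]
for _ in range(T):
    y.append(0.8 * y[-1] + random.gauss(0, 1))
    x.append(0.6 * x[-1] + 0.5 * y[-2] + random.gauss(0, 1))

print(granger_f(x, y))  # clearly positive: the past of y helps predict x
print(granger_f(y, x))  # near zero: the past of x does not help predict y
```

The statistic is always non-negative because the full model nests the restricted one; under the null of no causality it shrinks toward zero as the sample grows.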
Causality Between Urban Concentration and Environmental Quality
Directory of Open Access Journals (Sweden)
Amin Pujiati
2015-08-01
Full Text Available A population concentrated in urban areas can create external diseconomies for the environment if it exceeds the carrying capacity of urban space and the urban economy. Conversely, the better the quality of the environment, the higher the concentration of population in urban areas becomes. This study aims to analyze the causal relationship between urban concentration and environmental quality in an urban agglomeration area. The study uses secondary data obtained from the Central Bureau of Statistics and the city government for 2000 to 2013. The analytical methods used are Granger causality and descriptive analysis. The Granger causality results show no reciprocal causality between urban concentration and environmental quality, but a unidirectional relationship running from urban concentration to environmental quality: increasing urban concentration leads to decreased environmental quality.
Efficient nonparametric estimation of causal mediation effects
Chan, K. C. G.; Imai, K.; Yam, S. C. P.; Zhang, Z.
2016-01-01
An essential goal of program evaluation and scientific research is the investigation of causal mechanisms. Over the past several decades, causal mediation analysis has been used in medical and social sciences to decompose the treatment effect into the natural direct and indirect effects. However, all of the existing mediation analysis methods rely on parametric modeling assumptions in one way or another, typically requiring researchers to specify multiple regression models involving the treat...
The Continuum Limit of Causal Fermion Systems
Finster, Felix
2016-01-01
This monograph introduces the basic concepts of the theory of causal fermion systems, a recent approach to the description of fundamental physics. The theory yields quantum mechanics, general relativity and quantum field theory as limiting cases and is therefore a candidate for a unified physical theory. From the mathematical perspective, causal fermion systems provide a general framework for describing and analyzing non-smooth geometries and "quantum geometries." The dynamics is described by...
International Nuclear Information System (INIS)
Maund, J.B.
1979-01-01
Although the existence of tachyons is not ruled out by special relativity, it appears that causal paradoxes will arise if there are tachyons. The usual solutions to these paradoxes employ some form of the reinterpretation principle. In this paper it is argued, first, that the principle is incoherent; second, that even if it is not, some causal paradoxes remain; and third, that the most plausible ''solution'', which appeals to boundary conditions of the universe, conflicts with special relativity.
Dynamics and causality constraints
International Nuclear Information System (INIS)
Sousa, Manoelito M. de
2001-04-01
The physical meaning and the geometrical interpretation of causality implementation in classical field theories are discussed. Causality constraints in field theory are kinematical constraints dynamically implemented via solutions of the field equation, but in the limit of zero distance from the field sources part of these constraints carries a dynamical content that explains away old problems of classical electrodynamics, with deep implications for the nature of physical interactions. (author)
der, R.
1987-01-01
The various approaches to nonequilibrium statistical mechanics may be subdivided into convolution and convolutionless (time-local) ones. While the former, put forward by Zwanzig, Mori, and others, are used most commonly, the latter are less well developed, but have proven very useful in recent applications. The aim of the present series of papers is to develop the time-local picture (TLP) of nonequilibrium statistical mechanics on a new footing and to consider its physical implications for topics such as the formulation of irreversible thermodynamics. The most natural approach to TLP is seen to derive from the Fourier-Laplace transform C̃(z) of pertinent time correlation functions, which on the physical sheet typically displays an essential singularity at z=∞ and a number of macroscopic and microscopic poles in the lower half-plane corresponding to long- and short-lived modes, respectively, the former giving rise to the autonomous macrodynamics, whereas the latter are interpreted as doorway modes mediating the transfer of information from relevant to irrelevant channels. Possible implications of this doorway-mode concept for so-called extended irreversible thermodynamics are briefly discussed. The pole structure is used for deriving new kinds of generalized Green-Kubo relations expressing macroscopic quantities, transport coefficients, e.g., by contour integrals over current-current correlation functions obeying Hamiltonian dynamics, the contour integration replacing projection. The conventional Green-Kubo relations, valid for conserved quantities only, are rederived for illustration. Moreover, C̃(z) may be expressed by a Laurent series expansion in positive and negative powers of z, from which a rigorous, general, and straightforward method is developed for extracting all macroscopic quantities from so-called secularly divergent expansions of C̃(z) as obtained from the application of conventional many-body techniques to the calculation
Protein logic: a statistical mechanical study of signal integration at the single-molecule level.
de Ronde, Wiet; Rein ten Wolde, Pieter; Mugler, Andrew
2012-09-05
Information processing and decision-making is based upon logic operations, which in cellular networks have been well characterized at the level of transcription. In recent years, however, both experimentalists and theorists have begun to appreciate that cellular decision-making can also be performed at the level of a single protein, giving rise to the notion of protein logic. Here we systematically explore protein logic using a well-known statistical mechanical model. As an example system, we focus on receptors that bind either one or two ligands, and their associated dimers. Notably, we find that a single heterodimer can realize any of the 16 possible logic gates, including the XOR gate, by variation of biochemical parameters. We then introduce what to our knowledge is a novel idea: that a set of receptors with fixed parameters can encode functionally unique logic gates simply by forming different dimeric combinations. An exhaustive search reveals that the simplest set of receptors (two single-ligand receptors and one double-ligand receptor) can realize several different groups of three unique gates, a result for which the parametric analysis of single receptors and dimers provides a clear interpretation. Both results underscore the surprising functional freedom readily available to cells at the single-protein level. Copyright © 2012 Biophysical Society. Published by Elsevier Inc. All rights reserved.
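The kind of statistical mechanical reasoning described above can be sketched with a minimal equilibrium-binding model (the parameter values are hypothetical, not taken from the paper): the receptor's mean activity is a Boltzmann-weighted average over its four binding states, and making the doubly bound state inactive turns the receptor into an XOR gate.

```python
def activity(x, y, w_xy=10.0, act=(0.0, 1.0, 1.0, 0.0)):
    """Mean activity of a two-ligand receptor, averaged over its four
    binding states with equilibrium (Boltzmann) weights.
    x, y : dimensionless ligand concentrations
    w_xy : cooperativity of simultaneous binding (hypothetical value)
    act  : activity of the states (empty, x-bound, y-bound, doubly bound)
    """
    weights = (1.0, x, y, w_xy * x * y)
    return sum(a * w for a, w in zip(act, weights)) / sum(weights)

LO, HI = 0.01, 100.0   # 'absent' and 'saturating' ligand levels (assumed)

def gate(x_high, y_high, threshold=0.5):
    """Binary logic output: is the mean activity above threshold?"""
    x = HI if x_high else LO
    y = HI if y_high else LO
    return int(activity(x, y) > threshold)

# With an inactive doubly bound state this single receptor computes XOR:
truth_table = [(a, b, gate(a, b)) for a in (0, 1) for b in (0, 1)]
print(truth_table)  # [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
```

Changing the `act` tuple or the cooperativity `w_xy` moves the same receptor to other gates, which is the "variation of biochemical parameters" the abstract refers to.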
A Framework for Structural Systems Based on the Principles of Statistical Mechanics
Directory of Open Access Journals (Sweden)
Rabindranath Andujar
2014-11-01
Full Text Available A framework is proposed in which certain well-known concepts of statistical mechanics and thermodynamics can be used and applied to characterize structural systems of interconnected Timoshenko beam elements. We first make the assimilation to a network of nodes linked by potential energy functions that are derived from the stiffness properties of the beams. Then we define a series of thermodynamic quantities inherent to a given structure (i.e., internal energy, heat, pressure, temperature, entropy, and kinetic energy). With the exception of entropy, all of them have the dimensions of energy. In order to test this new framework, a series of experiments was performed on four structural specimens within the elastic regime. Their configurations were taken from the seismic regulations known as Eurocode 8 in order to have a well-founded reference for our comparisons. The results are then explained within this new framework. Very interesting correlations have been found between the parameters given in the code and our concepts.
Statistical mechanics of few-particle systems: exact results for two useful models
Miranda, Enrique N.
2017-11-01
The statistical mechanics of small clusters (n ≈ 10-50 elements) of harmonic oscillators and two-level systems is studied exactly, following the microcanonical, canonical and grand canonical formalisms. For clusters with several hundred particles, the results from the three formalisms coincide with those found in the thermodynamic limit. However, for clusters formed by a few tens of elements, the three ensembles yield different results. For a cluster with a few tens of harmonic oscillators, when the heat capacity per oscillator is evaluated within the canonical formalism, it reaches a limit value equal to k_B, as in the thermodynamic case, while within the microcanonical formalism the limit value is k_B(1-1/n). This difference could be measured experimentally. For a cluster with a few tens of two-level systems, the heat capacity evaluated within the canonical and microcanonical ensembles also presents differences that could be detected experimentally. Both the microcanonical and grand canonical formalisms show that the entropy is non-additive for systems this small, while the canonical ensemble reaches the opposite conclusion. These results suggest that the microcanonical ensemble is the most appropriate for dealing with systems with tens of particles.
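The quoted microcanonical limit k_B(1-1/n) can be checked numerically with a short sketch (units ħω = k_B = 1; the finite-difference scheme and the particular n and q are illustrative choices). The microcanonical entropy of n oscillators sharing q quanta is S = ln Ω with Ω = C(q+n-1, q), and the heat capacity follows from finite differences of S(E):

```python
from math import exp, lgamma

def entropy(q, n):
    """Microcanonical entropy S = ln Omega of n harmonic oscillators
    sharing q energy quanta: Omega = C(q + n - 1, q).  (k_B = 1)"""
    return lgamma(q + n) - lgamma(q + 1) - lgamma(n)

def micro_heat_capacity(q, n, dq=1):
    """Per-oscillator heat capacity from finite differences of S(E), E = q."""
    T = lambda qq: 1.0 / (entropy(qq + 1, n) - entropy(qq, n))
    return (2 * dq) / (T(q + dq) - T(q - dq)) / n

n, q = 30, 20000            # small cluster, high-energy (classical) regime
c_micro = micro_heat_capacity(q, n)
print(c_micro)              # close to 1 - 1/n ~ 0.967, not the canonical value 1

# Canonical per-oscillator heat capacity at high temperature tends to 1 (= k_B):
x = 0.01                    # x = hbar*omega / (k_B T), deep in the high-T limit
c_canon = x * x * exp(x) / (exp(x) - 1.0) ** 2
print(c_canon)              # close to 1.0
```

The 1/n deficit comes from the microcanonical relation E ≈ (n-1)k_B T at high energy, so dE/dT per oscillator is (n-1)/n rather than 1.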
Nonequilibrium statistical mechanics in the general theory of relativity. I. A general formalism
International Nuclear Information System (INIS)
Israel, W.; Kandrup, H.E.
1984-01-01
This is the first in a series of papers, the overall objective of which is the formulation of a new covariant approach to nonequilibrium statistical mechanics in classical general relativity. The object here is the development of a tractable theory for self-gravitating systems. It is argued that the ''state'' of an N-particle system may be characterized by an N-particle distribution function, defined in an 8N-dimensional phase space, which satisfies a collection of N conservation equations. By mapping the true physics onto a fictitious ''background'' spacetime, which may be chosen to satisfy some ''average'' field equations, one then obtains a useful covariant notion of ''evolution'' in response to a fluctuating ''gravitational force.'' For many cases of practical interest, one may suppose (i) that these fluctuating forces satisfy linear field equations and (ii) that they may be modeled by a direct interaction. In this case, one can use a relativistic projection operator formalism to derive exact closed equations for the evolution of such objects as an appropriately defined reduced one-particle distribution function. By capturing, in a natural way, the notion of a dilute gas, or impulse, approximation, one is then led to a comparatively simple equation for the one-particle distribution. If, furthermore, one treats the effects of the fluctuating forces as ''localized'' in space and time, one obtains a tractable kinetic equation which reduces, in the Newtonian limit, to the standard Landau equation
Causal structure of analogue spacetimes
International Nuclear Information System (INIS)
Barcelo, Carlos; Liberati, Stefano; Sonego, Sebastiano; Visser, Matt
2004-01-01
The so-called 'analogue models of general relativity' provide a number of specific physical systems, well outside the traditional realm of general relativity, that nevertheless are well-described by the differential geometry of curved spacetime. Specifically, the propagation of perturbations in these condensed matter systems is described by 'effective metrics' that carry with them notions of 'causal structure' as determined by an exchange of quasi-particles. These quasi-particle-induced causal structures serve as specific examples of what can be done in the presence of a Lorentzian metric without having recourse to the Einstein equations of general relativity. (After all, the underlying analogue model is governed by its own specific physics, not necessarily by the Einstein equations.) In this paper we take a careful look at what can be said about the causal structure of analogue spacetimes, focusing on those containing quasi-particle horizons, both with a view to seeing what is different from standard general relativity, and what the similarities might be. For definiteness, and because the physics is particularly simple to understand, we will phrase much of the discussion in terms of acoustic disturbances in moving fluids, where the underlying physics is ordinary fluid mechanics, governed by the equations of traditional hydrodynamics, and the relevant quasi-particles are the phonons. It must however be emphasized that this choice of example is only for the sake of pedagogical simplicity and that our considerations apply generically to wide classes of analogue spacetimes
Causality discovery technology
Chen, M.; Ertl, T.; Jirotka, M.; Trefethen, A.; Schmidt, A.; Coecke, B.; Bañares-Alcántara, R.
2012-11-01
Causality is the fabric of our dynamic world. We all make frequent attempts to reason about the causal relationships of everyday events (e.g., what was the cause of my headache, or what has upset Alice?). We attempt to manage causality all the time through planning and scheduling. The greatest scientific discoveries are usually about causality (e.g., Newton found the cause for an apple to fall, and Darwin discovered natural selection). Meanwhile, we continue to seek a comprehensive understanding of the causes of numerous complex phenomena, such as social divisions, economic crises, global warming, home-grown terrorism, etc. Humans analyse and reason about causality based on observation, experimentation and acquired a priori knowledge. Today's technologies enable us to make observations and carry out experiments on an unprecedented scale, which has created data mountains everywhere. Whereas there are exciting opportunities to discover new causation relationships, there are also unparalleled challenges in benefiting from such data mountains. In this article, we present a case for developing a new piece of ICT, called Causality Discovery Technology. We reason about the necessity, feasibility and potential impact of such a technology.
Causal Meta-Analysis : Methodology and Applications
Bax, L.J.
2009-01-01
Meta-analysis is a statistical method to summarize research data from multiple studies in a quantitative manner. This dissertation addresses a number of methodological topics in causal meta-analysis and reports the development and validation of meta-analysis software. In the first (methodological)
Causal Measurement Models: Can Criticism Stimulate Clarification?
Markus, Keith A.
2016-01-01
In their 2016 work, Aguirre-Urreta et al. provided a contribution to the literature on causal measurement models that enhances clarity and stimulates further thinking. Aguirre-Urreta et al. presented a form of statistical identity involving mapping onto the portion of the parameter space involving the nomological net, relationships between the…
Causality relationship between energy demand and economic ...
African Journals Online (AJOL)
This paper attempts to examine the causal relationship between electricity demand and economic growth in Nigeria using data for 1970 – 2003. The study uses the Johansen cointegration VAR approach. The ADF and Phillips – Perron test statistics were used to test for stationarity of the data. It was found that the data were ...
Natural time analysis and Tsallis non-additive entropy statistical mechanics.
Sarlis, N. V.; Skordas, E. S.; Varotsos, P.
2016-12-01
Upon analyzing the seismic data in natural time and employing a sliding natural time window comprising a number of events that would occur in a few months, it has recently been uncovered [1] that a precursory Seismic Electric Signals activity [2] initiates almost simultaneously with the appearance of a minimum in the fluctuations of the order parameter of seismicity [3]. Such minima have been ascertained [4] during periods when the magnitude time series exhibits long-range correlations [5], a few months before all earthquakes of magnitude 7.6 or larger that occurred in the entire Japanese area from 1 January 1984 to 11 March 2011 (the day of the M9 Tohoku-Oki earthquake). Before and after these minima, characteristic changes of the temporal correlations between earthquake magnitudes are observed which cannot be captured by Tsallis non-additive entropy statistical mechanics, in the frame of which it has been suggested that kappa distributions arise [6]. Here, we extend the study of the existence of such minima to a large area that includes the Aegean Sea and its surroundings, which exhibits seismo-tectonics [7] generally different from those of the entire Japanese area. References: P. A. Varotsos et al., Tectonophysics 589 (2013) 116. P. Varotsos and M. Lazaridou, Tectonophysics 188 (1991) 321. P. A. Varotsos et al., Phys Rev E 72 (2005) 041103. N. V. Sarlis et al., Proc Natl Acad Sci USA 110 (2013) 13734. P. A. Varotsos, N. V. Sarlis, and E. S. Skordas, J Geophys Res Space Physics 119 (2014) 9192, doi: 10.1002/2014JA0205800. G. Livadiotis and D. J. McComas, J Geophys Res 114 (2009) A11105, doi:10.1029/2009JA014352. S. Uyeda et al., Tectonophysics 304 (1999) 41.
Statistical Comparison of the Baseline Mechanical Properties of NBG-18 and PCEA Graphite
Energy Technology Data Exchange (ETDEWEB)
Mark C. Carroll; David T. Rohrbaugh
2013-08-01
High-purity graphite is the core structural material of choice in the Very High Temperature Reactor (VHTR), a graphite-moderated, helium-cooled design that is capable of producing process heat for power generation and for industrial processes that require temperatures higher than the outlet temperatures of present nuclear reactors. The Baseline Graphite Characterization Program is endeavoring to minimize the conservative estimates of as-manufactured mechanical and physical properties by providing comprehensive data that capture the level of variation in measured values. In addition to providing a comprehensive comparison between these values in different nuclear grades, the program is also carefully tracking individual specimen source, position, and orientation information in order to provide comparisons and variations between different lots, different billets, and different positions from within a single billet. This report is a preliminary comparison between the two grades of graphite that were initially favored in the two main VHTR designs. NBG-18, a medium-grain pitch coke graphite from SGL formed via vibration molding, was the favored structural material in the pebble-bed configuration, while PCEA, a smaller-grain, petroleum coke, extruded graphite from GrafTech was favored for the prismatic configuration. An analysis of the comparison between these two grades will include not only the differences in fundamental and statistically significant individual strength levels, but also the differences in variability in properties within each of the grades that will ultimately provide the basis for the prediction of in-service performance. The comparative performance of the different types of nuclear-grade graphite will continue to evolve as thousands more specimens are fully characterized from the numerous grades of graphite being evaluated.
Statistical mechanics of a time-homogeneous system of money and antimoney
Schmitt, Matthias; Schacker, Andreas; Braun, Dieter
2014-03-01
Financial crises appear throughout human history. While there are many schools of thought on what the actual causes of such crises are, it has been suggested that the creation of credit money might be a source of financial instability. We discuss how the credit mechanism in a system of fractional reserve banking leads to non-local transfers of purchasing power that also affect non-involved agents. To overcome this issue, we impose the local symmetry of time homogeneity on the monetary system. A bi-currency system of non-bank assets (money) and bank assets (antimoney) is considered. A payment is either made by passing on money or by receiving antimoney. As a result, a free floating exchange rate between non-bank assets and bank assets is established. Credit creation is replaced by the simultaneous transfer of money and antimoney at a negotiated exchange rate. This is in contrast to traditional discussions of full reserve banking, which stalls creditary lending. With money and antimoney, the problem of credit crunches is mitigated while a full time symmetry of the monetary system is maintained. As a test environment for such a monetary system, we discuss an economy of random transfers. Random transfers are a strong criterion to probe the stability of monetary systems. The analysis using statistical physics provides analytical solutions and confirms that a money-antimoney system could be functional. Equally important to the probing of the stability of such a monetary system is the question of how to implement the credit default dynamics. This issue remains open.
Statistical mechanics of a time-homogeneous system of money and antimoney
International Nuclear Information System (INIS)
Schmitt, Matthias; Schacker, Andreas; Braun, Dieter
2014-01-01
Financial crises appear throughout human history. While there are many schools of thought on what the actual causes of such crises are, it has been suggested that the creation of credit money might be a source of financial instability. We discuss how the credit mechanism in a system of fractional reserve banking leads to non-local transfers of purchasing power that also affect non-involved agents. To overcome this issue, we impose the local symmetry of time homogeneity on the monetary system. A bi-currency system of non-bank assets (money) and bank assets (antimoney) is considered. A payment is either made by passing on money or by receiving antimoney. As a result, a free floating exchange rate between non-bank assets and bank assets is established. Credit creation is replaced by the simultaneous transfer of money and antimoney at a negotiated exchange rate. This is in contrast to traditional discussions of full reserve banking, which stalls creditary lending. With money and antimoney, the problem of credit crunches is mitigated while a full time symmetry of the monetary system is maintained. As a test environment for such a monetary system, we discuss an economy of random transfers. Random transfers are a strong criterion to probe the stability of monetary systems. The analysis using statistical physics provides analytical solutions and confirms that a money–antimoney system could be functional. Equally important to the probing of the stability of such a monetary system is the question of how to implement the credit default dynamics. This issue remains open
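The "economy of random transfers" used above as a test environment can be sketched with a standard single-currency random-transfer toy model (a deliberate simplification; the paper's bi-currency money-antimoney system and exchange-rate negotiation are not modeled here). Repeated random repartition between pairs conserves the total exactly, and the wealth distribution relaxes to a Boltzmann-like exponential law:

```python
import random

random.seed(42)
N, m0, steps = 2000, 100.0, 200_000
wealth = [m0] * N                      # every agent starts with the same amount

for _ in range(steps):
    i, j = random.randrange(N), random.randrange(N)
    if i == j:
        continue
    pot = wealth[i] + wealth[j]        # random repartition of the pair's money;
    share = random.random() * pot      # the total is conserved at every step
    wealth[i], wealth[j] = share, pot - share

total = sum(wealth)
below_mean = sum(1 for w in wealth if w < m0) / N
print(total)        # N * m0: conservation of money
print(below_mean)   # near 1 - 1/e ~ 0.63, as for an exponential (Boltzmann) law
```

The exponential stationary distribution is the statistical physics analogue of thermal equilibrium for such conserved random exchanges, which is why random transfers are a demanding stability probe for a monetary system.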
Kawano, T.; Tatsuta, K.; Hobara, Y.
2015-12-01
Continuous monitoring of signal amplitudes of worldwide VLF transmitters is a powerful tool to study the lower ionospheric condition. Although lower ionospheric perturbations prior to some major earthquakes have been reported for years, their occurrence and the coupling mechanism between the ground and the overlying ionosphere prior to earthquakes are not yet clear. In this paper, we carried out a statistical analysis based on the nighttime averaged signal amplitude data from the UEC's VLF/LF transmitter observation network. Two hundred and forty-three earthquakes occurred within the 5th Fresnel zone of transmitter-receiver paths around Japan during the time period 2007 to 2012. These earthquakes were characterized into three different groups based on the Centroid-Moment-Tensor (CMT) solution: reverse fault type, normal fault type and stress slip type. An ionospheric anomaly was identified by a large change in the VLF/LF amplitude during nighttime. As a result, we found ionospheric perturbations associated with both ground and sea earthquakes. Remarkably, the reverse fault type earthquakes have the highest occurrence rate of ionospheric perturbation among the three types, both for sea (41%) and ground events (61%). The occurrence rates for the normal fault type are 35% and 56% for sea and ground earthquakes respectively, and those for the stress slip type are 39% and 20% for sea and ground earthquakes respectively; in both cases the occurrence rates are smaller than for the reverse fault type. The clear difference in occurrence rate of the ionospheric perturbations may indicate that the coupling efficiency of seismic activity into the overlying ionosphere is controlled by the pressure in the earth's crust. This gives us further physical insight into Lithosphere-Atmosphere-Ionosphere (LAI) coupling processes.
Olafsson, Gestur; Helgason, Sigurdur
1996-01-01
This book is intended to introduce researchers and graduate students to the concepts of causal symmetric spaces. To date, results of recent studies considered standard by specialists have not been widely published. This book seeks to bring this information to students and researchers in geometry and analysis on causal symmetric spaces. It includes the newest results in harmonic analysis, including spherical functions on ordered symmetric spaces and the holomorphic discrete series and Hardy spaces on compactly causal symmetric spaces; deals with the infinitesimal situation, coverings of symmetric spaces, classification of causal symmetric pairs and invariant cone fields; presents basic geometric properties of semi-simple symmetric spaces; and includes appendices on Lie algebras and Lie groups, bounded symmetric domains (Cayley transforms), antiholomorphic involutions on bounded domains and para-Hermitian symmetric spaces.
Causal inference in econometrics
Kreinovich, Vladik; Sriboonchitta, Songsak
2016-01-01
This book is devoted to the analysis of causal inference which is one of the most difficult tasks in data analysis: when two phenomena are observed to be related, it is often difficult to decide whether one of them causally influences the other one, or whether these two phenomena have a common cause. This analysis is the main focus of this volume. To get a good understanding of the causal inference, it is important to have models of economic phenomena which are as accurate as possible. Because of this need, this volume also contains papers that use non-traditional economic models, such as fuzzy models and models obtained by using neural networks and data mining techniques. It also contains papers that apply different econometric models to analyze real-life economic dependencies.
Non-Bayesian Inference: Causal Structure Trumps Correlation
Bes, Benedicte; Sloman, Steven; Lucas, Christopher G.; Raufaste, Eric
2012-01-01
The study tests the hypothesis that conditional probability judgments can be influenced by causal links between the target event and the evidence even when the statistical relations among variables are held constant. Three experiments varied the causal structure relating three variables and found that (a) the target event was perceived as more…
Cause and Event: Supporting Causal Claims through Logistic Models
O'Connell, Ann A.; Gray, DeLeon L.
2011-01-01
Efforts to identify and support credible causal claims have received intense interest in the research community, particularly over the past few decades. In this paper, we focus on the use of statistical procedures designed to support causal claims for a treatment or intervention when the response variable of interest is dichotomous. We identify…
Granger Causality and Transfer Entropy Are Equivalent for Gaussian Variables
Barnett, Lionel; Barrett, Adam B.; Seth, Anil K.
2009-12-01
Granger causality is a statistical notion of causal influence based on prediction via vector autoregression. Developed originally in the field of econometrics, it has since found application in a broader arena, particularly in neuroscience. More recently transfer entropy, an information-theoretic measure of time-directed information transfer between jointly dependent processes, has gained traction in a similarly wide field. While it has been recognized that the two concepts must be related, the exact relationship has until now not been formally described. Here we show that for Gaussian variables, Granger causality and transfer entropy are entirely equivalent, thus bridging autoregressive and information-theoretic approaches to data-driven causal inference.
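The equivalence can be stated compactly for jointly Gaussian processes. Writing Σ(X_t | X⁻) for the residual variance of the linear regression of X_t on its own past, and Σ(X_t | X⁻, Y⁻) for the regression on both pasts (notation of this sketch, not necessarily that of the paper):

```latex
\mathcal{F}_{Y \to X} \;=\; \ln \frac{\Sigma(X_t \mid X^-)}{\Sigma(X_t \mid X^-,\, Y^-)},
\qquad
\mathcal{T}_{Y \to X} \;=\; \tfrac{1}{2}\, \mathcal{F}_{Y \to X}.
```

This follows because the entropy of a Gaussian with variance Σ is ½ ln(2πe Σ), so the difference of conditional entropies defining the transfer entropy reduces to half the log-variance ratio defining the Granger statistic.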
Directory of Open Access Journals (Sweden)
Higuera, I.
2012-06-01
Full Text Available The study and development of alternative binders that are more eco-efficient than Portland cement are attracting a good deal of scientific and technological interest. Binders obtained from the chemical interaction between calcium silico-aluminous materials and highly alkaline solutions are one of several types of such possible cements. The present paper discusses the mechanical behaviour and mineralogical composition of blended pastes made from NaOH-activated vitreous blast furnace slag and metakaolin. The aim of the study was to determine how parameters such as the slag/metakaolin ratio, activating solution concentration and curing temperature affect strength development in these binders. A statistical study was conducted to establish the impact of each variable and model strength behaviour in these alkaline cements. The conclusion drawn is that activator concentration and the slag/metakaolin ratio are both determinant parameters.
DEFF Research Database (Denmark)
Nielsen, Max; Jensen, Frank; Setälä, Jari
2011-01-01
to fish demand. On the German market for farmed trout and substitutes, it is found that supply sources, i.e. aquaculture and fishery, are not the only determinant of causality. Storing, tightness of management and aggregation level of integrated markets might also be important. The methodological...
Czech Academy of Sciences Publication Activity Database
Hvorecký, Juraj
2012-01-01
Roč. 19, Supp.2 (2012), s. 64-69 ISSN 1335-0668 R&D Projects: GA ČR(CZ) GAP401/12/0833 Institutional support: RVO:67985955 Keywords: consciousness * free will * determinism * causality Subject RIV: AA - Philosophy ; Religion
Energy Technology Data Exchange (ETDEWEB)
Llave, R. de la; Haro, A.
2000-07-01
Statistical mechanics requires a language that unifies probabilistic and deterministic description of physical systems. We describe briefly some of the mathematical ideas needed for this unification. These ideas have also proved important in the study of chaotic systems. (Author) 17 refs.
International Nuclear Information System (INIS)
Yeh, L.
1992-01-01
The phase-space-picture approach to quantum non-equilibrium statistical mechanics via the characteristic function of infinite-mode squeezed coherent states is introduced. We use quantum Brownian motion as an example to show how this approach provides an interesting geometrical interpretation of quantum non-equilibrium phenomena.
Czech Academy of Sciences Publication Activity Database
Andrey, Ladislav; Erzan, R.
2002-01-01
Roč. 52, č. 12 (2002), s. 1349-1356 ISSN 0011-4626 R&D Projects: GA ČR GA305/02/1487 Institutional research plan: AV0Z1030915 Keywords : nonlinear gain curve * gain-threshold dependence * non-monotone transfer function * statistical mechanics Subject RIV: BA - General Mathematics Impact factor: 0.311, year: 2002